risingthumb.xyz Bringing the pain to an already pained web

Hero/Villain Dynamics in the 21st Century

On the 14th of September I observed a data leak of the Epik domain registrar. What makes this leak interesting to me is the reason given. First, though, I will talk about the security side...

[Image: OperationEpikFail]

HOW DO YOU FUCK UP SECURITY SO BADLY? Storing passwords in plaintext is not acceptable practice. It is absolutely terrible OPSEC. A few of the passwords were hashed with MD5, which is outdated, and lots of rainbow tables are available for reversing it. Additionally, the entire /home/ and /root/ directories of one of their core servers are available, along with a lot of sensitive information. Fortunately credit card details aren't included among those details, but since their OPSEC was so poor, I would not be too surprised if the hacktivists responsible leak those credit card details later on.
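To illustrate why unsalted MD5 is so weak: hashing is deterministic, so the same password always yields the same digest, and a precomputed rainbow table maps common digests straight back to passwords. The password here is purely illustrative.

```shell
# Unsalted MD5 of a common password: identical input, identical digest,
# so a lookup table reverses it instantly.
printf '%s' 'password' | md5sum
# → 5f4dcc3b5aa765d61d8327deb882cf99  -
```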

So onto the reasoning given. The CEO of the company, Rob Monster, a Christian Libertarian, is likened to a Nazi in the text file included with the leak. A Christian Libertarian is about as far from a Nazi as you can get, aside from their relatively traditionalist religious beliefs. Additionally, the leak seems to imply that those who use Epik as a registrar are associated with the alt-right or conspiracy theorists. I disagree. Epik has simply stood their ground in refusing to remove domains that do not violate their terms of service at the whims of a mob. As such, they hold a lot of appeal for those who dislike the tyranny of the masses.

On the topic of standing their ground, Epik refused to take down 8chan until a company called Voxility threatened to withdraw technical support from Epik over their stance. As a result, Epik was forced to take 8chan down; however, the lengths to which they stood for the essential freedoms of democracy, namely freedom of expression, speech and belief, make them a very nice registrar to be on. Unfortunately, their poor security puts me on the lookout for another politically neutral domain registrar to use.

So why do I bring up hero and villain dynamics in the title? Because I think this is a case of those dynamics at play. By lumping all users of Epik into the villain group of the alt-right, the hacktivists have effectively removed all substance. Due to the past nature of hacktivists and Anonymous, they take on the mantle of "hero", as they were certainly looked up to by impressionable people. Yet they leaked the data on the strength of political arguments that don't hold up to scrutiny. If anything, they are villains in their own right for inciting the doxxing of people, which is a criminal offence in some nations (HK, China) and is plainly bad, as the next logical step is violence and abuse of this personal information.

I question if this is actually "Anonymous", as they have been dead for a decade, ever since the smartphone became mainstream. It is also because this action plays exactly towards giving more control and information to political vigilantes and zealots, who frequently justify the removal of freedoms, as observed after many terrorist actions. It could fairly be argued that this is a CIA or FBI operation to push towards increasingly polarised individuals, under threat of censorship by technocrats... though why a group funded by the tax dollars of a democracy would push for more technocratic control is weird. Perhaps a work-around: they can't be authoritarian, but they can be the authority over the technocrats. This is all theorising of course; I don't see any evidence of any form other than the circumstantial claim of "Anonymous", which is often only invoked when politically convenient these days.

It's also odd, because this strikes a parallel with what I have been seeing more and more in Western politics: this "us vs. them" attitude. We are the heroes of our view of the world; they are the villains of it. It is an incredibly naive outlook, as different systems operate better for different people. This can be plainly seen in the Philippines, where the West cherry-picks Duterte's speeches to demonise him, while Filipinos continue to support him regardless of the Western outlook because he tackles issues present in SEA, especially given the rising tensions between China and other South-East Asian countries. Americans find that democracy works best for them, and a number of socialist countries (which do not label themselves as such, to avoid association with Communists) find that their systems work best for them.

This outlook is problematic and dangerous, and too many people believe that a differing opinion renders you a villain in absolute opposition. This is the opposite of "tolerance". It makes me truly furious that such a naive outlook on the world is acted upon in activism, to the complete neglect of neutral third parties, to the point that those third parties are incorrectly associated with beliefs they do not stand for and are wronged over and over.

What's worse is that there are a myriad of reasons I could at least understand. Financial incentive? Understood. Despising the CEO or the company? Understandable, if justified further. Despising liberty, freedom and the values of democracy? Understood. Truly wanting the neutral third parties to suffer? Understandable, if odd, and it pushes those third parties to fight you.

Anyway, if people reading this are aware of any politically neutral and secure domain registrars, please do let me know. I would like to improve my operational security. For now I remain on Epik. The consequence of possibly being doxxed is acceptable, if a little disturbing, for the gain of my essential freedoms being upheld.

=> Link to post, and comments here

Immutable data being used for mass murder at scale

If we've been looking at the pattern of data misuse, this shouldn't really come as a surprise, as dark and sad as it is. The worst aspect is that this regards immutable biometric data. What has happened is that amidst the US retreat from Afghanistan, a lot of biometric data[1] has been left in the hands of the Taliban, so they can easily identify, target and kill or torture any people who were allied with the Americans[2]. It is a plain betrayal by the Americans of their allies. It's also a plain example of bad info-sec.

I mentioned before that it's a pattern. In Nazi Germany, census information was used for rounding up Jews. The difference there is that census data is not particularly immutable, nor perfect targeting information. It is akin to using crime statistics across races and locations to position and prepare police in various areas of a city for most effectiveness (pragmatic, even if not uniform or particularly fair). The issue expands when one sees that corporations sell data for a profit, and this data is used for murder[3]. This issue includes anonymised data, as seen rather recently when Christian gayhunters hunting gay Christians[4] in their midst found data that points to a bishop being gay (circumstantially of course; I didn't look further to see if they had direct evidence to substantiate their claims).

In the case of the Afghans, this immutable data was likely given for their visa program (which the Americans clearly didn't follow through on), though this is just conjecture as I don't know the details. What's more pressing is when this will happen to commercially obtained biometric data. I expect the next logical step in data abuses to come from commercially sold biometric data, or data "legally" obtained by intelligence agencies in their own countries. Commercially sold? Yes. People go out of their way to willingly provide identifying data simply to know their ancestry... except it's not really their ancestry. It's just a pie chart of percentages showing how much race mixing has happened (in a quite literal sense: it gives a percentage of how much you originate from one country and how much from another).

It's ultimately rather demoralising to see another in a series of data abuses, especially as a lot of it is avoidable.

=> [1] Taliban likely have biometric data
=> [2] Betrayal by US forces to their afghan allies.
=> [3] The corporate surveillance machine is killing people
=> [4] Bishop caught for being gay via anonymised data
=> Link to post, and comments here

Academic Science and its problems

Let us begin by refreshing what the scientific method is. One makes an observation or asks a question, follows this with research into the topic, then a hypothesis, and then an experiment to test the hypothesis. Data is then analyzed and the conclusions are reported. The value of the conclusions rests on whether the data, and the conclusions drawn from it, can be replicated.

The first issue here is replicability. Somewhere in the region of ~70%[1] of all studies are simply not replicated. This 70% covers both attempted replications that don't find the same result and studies where replication is simply never attempted. This alone is pretty damning. Let us ask then: what is needed for a result to be deemed significant? Statistically, the result must be very unlikely to occur if there were no real effect. This probability is the P value, and by convention p < 0.05 counts as significant. However, P values can be hacked by a method called "P-hacking"[2]. A number of methods exist; the most common is measuring many dependent variables with a small sample size (this makes it likely that one of the values produces a "significant" result that is down to coincidence).
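To put a number on that: at p < 0.05, the chance of at least one coincidental "significant" result grows quickly with the number of independent variables tested. A quick shell check of the family-wise error rate:

```shell
# Probability of at least one false positive when running n independent
# tests at significance level 0.05: 1 - 0.95^n
for n in 1 5 10 20; do
    awk -v n="$n" 'BEGIN { printf "n=%2d  P(>=1 false positive)=%.2f\n", n, 1 - 0.95^n }'
done
# → n= 1  P(>=1 false positive)=0.05
#   n= 5  P(>=1 false positive)=0.23
#   n=10  P(>=1 false positive)=0.40
#   n=20  P(>=1 false positive)=0.64
```

So with twenty measured variables, "significance" by coincidence alone is more likely than not.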

The reason why P-hacking is done is pretty obvious. Scientific journals want to publish significant results, not replication results or insignificant results. They also want to publish novel research. As a result, there is little funding for replication (cherry-picked replication, as was done by cigarette companies, is one avenue for funding), and also a lot of journals simply do not accept replication papers.

This first problem is called the reproducibility problem. It's such a big problem that over the last decade scientists have been trying to tackle it. Reproducing results matters because P-hacking can be done without malicious intent (increasing dependent variables is rarely malicious; similarly, a small sample size is often the pragmatic problem of not having the means for a larger one). If you cannot reproduce the results, the P-hacked results aren't in fact significant, as by Occam's razor they are much more likely coincidental.

The second issue that's arising is a case of paper quality. The paper on tortured phrases is an example[3]. This is mainly a problem of padding papers and being relatively deceptive, and it is indicative of a poor peer review process at scientific journals. The solution is self-evident: improve the peer review process, but with such a quantity of papers that's not easy. This problem will only get worse as AI (the tortured phrase for AI is "Counterfeit Consciousness" :^)) advances in creating sensible-looking papers, without question, given that GPT-3, produced by OpenAI (deceptively named, as it's not open at all), can already produce coherent fictions. Another issue presented is papers simply citing non-existent papers. I only see a solution tied to some technology like Google Scholar being able to tackle this, but even that would be flawed, due to the rate of link rot being dangerously fast for the 21st century (to the point that the internet could be called a demented brain of knowledge).

Presented are two major issues in academic science today. A lot of papers aren't replicated, and a lot are trash. Those which aren't replicated are dubious until replicated (and even replications can have issues). The presence of external factors muddies the waters. People want to get good degrees, acquire more research funding and do decent science... but all this comes at the cost of the integrity of the scientific world.

As a side note, I should mention the scientific cult: people who regard an unreplicated study as fact set in stone, and who will defer to such studies for their more extreme arguments. This affects even scientific citation, as papers less likely to be true are cited more[4]. In fact, people with an overbearing dependency upon unreplicated scientific studies often have an agenda.

As a result of this, the papers I typically find more trustworthy are those by engineering companies, as quite often they are written with the intent of setting forward a new technology and convincing people it should be used more widely. An example of this would be the Valve paper on signed distance functions[5]. The claims are much more likely to be refuted and challenged by other engineering companies, and by people involved in the scientific world, as these are typically well-regarded and well-known companies who can make their research well known (the darker side, naturally, is when they use this to push an agenda, as cigarette companies have done).

=> [1] Nature article making a study with a sample size of 1,500 Scientists.
=> [2] Why most published research findings are false
=> [3] Tortured phrases: A dubious writing style emerging in science. Evidence of critical issues affecting established journals
=> [4] Nonreplicable publications are cited more than replicable ones
=> [5] Improved Alpha-Tested Magnification for Vector Textures and Special Effects
=> Link to post, and comments here

Printing with netcat

I recently discovered thanks to this blog[1] that you can print with netcat. This was impressive to me, but even still, it doesn't completely solve the problem. Instead it moves it from handling the data for printing on the client's side to handling it on the printer's side.

These issues arise when you try to print a PDF (PDF is a particularly complex file format; more explanation and discussion can be found here[2]). Because PostScript has long-standing printer support, it's best to convert a PDF to PostScript first (otherwise the printer will spend a long time processing the file, or just fail, because as I said, PDF is a complex format).
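For the conversion itself, a minimal sketch using pdf2ps, which ships with Ghostscript (filenames illustrative):

```shell
# Convert the PDF to PostScript on your machine, so the printer
# doesn't have to interpret the full PDF format itself.
pdf2ps document.pdf document.ps
```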

The command for printing if people are interested is `nc printerIP 9100 < document.ps`. Suffice to say, this was a very interesting read. Thanks Retrohacker!

P.S. In the example provided, where pdf is the proposed alternative, I disagree. I think Gemini is the current viable alternative. I don't think it's in a position where network effects multiply, so the only adopters are technology enthusiasts(and even then it's relatively close in structure to HTML in the mid 90s).

=> [1] Retrohacker, bye cups, printing with netcat
=> [2] Lab6.com, pdf as an alternative to the HTML standard
=> [2] Gemini mirror of above.
=> Link to post, and comments here

License Laundering

Embrace, Extend, Extinguish.

This is a quote oft used in reference to Microsoft. Yes, that Microsoft: the producers of such bad software as Windows 11, Windows 10, Windows 8, Windows 7, Microsoft Teams and Skype, owner of Mojang, developer of Minecraft, and owner of GitHub.

For some time now, Microsoft has been suspiciously buddy-buddy with Open Source Software, to the point it seemed acceptable, but as usual you cannot trust a corporation with anything you value. Please note, I refer to Open Source Software, which is typically exploited by corporations and typically uses weak licenses with no protections, such as the MIT and BSD licenses, which simply require the license text be reproduced verbatim. Free Software on the other hand places value in copyleft ideals: turning copyright on its head and demanding, if the GPL is used, that derived works be released under the same terms.

So recently, GitHub announced a project they had been working on: GitHub CoPilot. Initially I paid it little attention. "Huh, it's autocomplete for code? That's neat, and will render a lot of the easily mass-produced work of web developers redundant." Dig a little deeper, and you'll see this is a big turning point for copyleft.

Why is it a big turning point? The data that GitHub used is all[1] public repositories, ignoring licenses. Some licenses are easier to satisfy, like the MIT or BSD licenses, but others by their nature are very hard if not impossible for a corporation to satisfy. Such licenses include the GPLv3, GPLv2, AGPL and Apache licenses, as they all have conditions on use. A study was produced on this by GitHub (which naturally introduces a possibility of bias)[2]. What strikes me as interesting is this quote taken from the results.

"For most of GitHub Copilot's suggestions, our automatic filter didn’t find any significant overlap with the code used for training. But it did bring 473 cases to our attention. Removing the first bucket (cases that look very similar to other cases) left me with 185 suggestions. Of these, 144 got sorted out in buckets 2 - 4. This left 41 cases in the last bucket, the “recitations”, in the meaning of the term I have in mind."
"That corresponds to 1 recitation event every 10 user weeks (95% confidence interval: 7 - 13 weeks, using a Poisson test)."

This presents two things. Firstly, it recites the training data frequently enough that the legal question must be raised. Secondly, the recitations span all the various licenses, even those whose conditions such a tool cannot satisfy.

An additional note, not mentioned in GitHub's study, is the possibility of both misattribution and incorrect licensing[3]. This means that copyright infringement will occur, and even when CoPilot produces a license that can be satisfied, you have no guarantee that it *actually* is satisfied. The weight of copyright infringement lies with the developer too: this is a tool, and it is up to the developer to properly license their work.

As such, the legal problem I will lay out is this. The GPL must be applied if a GPL-licensed work is modified. GitHub CoPilot produces work with GPL-licensed work in its data set. The first argument, then, is that any data set with GPL-licensed work in it is both using and modifying that work. The second argument is the question: if GitHub CoPilot produces a work that is a verbatim copy of another licensed work, is it subject to that licensed work's conditions?

The latter point is complicated by the presence of "generic solutions", which typically occur within algorithm design, where a solution is generic and well-known, for example Dijkstra's algorithm. Even a verbatim copy of such code is very unlikely to hold up in a copyright claim, due to how frequently the problem crops up and the number of optimised solutions and implementations for it.

The reason this is a particularly noteworthy event is that there is no prior legal precedent, so the result of this will be the precedent for future similar cases. If the licenses are not upheld, we get "license laundering": GPL-licensed work could be laundered, through a tool like GitHub CoPilot, into something license-compatible and usable in industry. This defeats the virality of GPLv3 and AGPL, makes a new license necessary to handle this new corner case, and renders all prior code under GPLv3 and AGPL launderable and usable by big corporations. If it holds that GitHub is in breach of copyright on many licenses, then there is nothing to worry about.

As side points, I feel I should mention that GPT-3, the AI produced by the not-open OpenAI, is effectively purchased and owned by Microsoft[4]. This adds some credibility to the idea that this follows the pattern. The embrace step was to purchase GitHub, and with it acquire all the publicly available and licensable code. They then extend by purchasing an exclusive license to GPT-3 from the for-profit and non-open company OpenAI[5]. The extinguish step is to extinguish restrictive licenses and come into possession of a great quantity of code that is as free as public domain. In effect, getting a huge amount of free code. It also extinguishes bad programmers, but to me that is no loss at all.

Embrace... Extend... Extinguish. As one could probably expect, I am personally against this form of license laundering, as it would have disastrous effects on Free Software. But let's not kid ourselves: the problem of machine learning algorithms overfitting to their training data was eventually going to crop up sometime... I suppose now.

=> [1] An email between an unknown person and GitHub Support backs the claim all public repositories are used regardless of license.
=> [2] Research Recitation on Github's Copilot.
=> [3] Github copilot reproduces verbatim the fast inverse square root and incorrectly licenses it.
=> [4] Microsoft team up with OpenAI to exclusively license GPT-3 Language Model
=> [5] OpenAI is a for-profit company, that used to be a non-profit.
=> Link to post, and comments here

Who games on Linux? #4 Sonic Robo Blast 2 Kart

Game Info

- Name: Sonic Robo Blast 2 Kart (abbreviated SRB2Kart)

- Genre: Multiplayer Racing

- Demo: N/A. Free game

- Linux: Exists, but support will vary

Thoughts

It's a lot of fun. The technical shortcomings of the DOOM engine are evident in extreme circumstances (like when the camera tilts up or down too much); other than that, the only other technical shortcoming is on some maps like Sub-Zero Peak Zone, where going off the path into deep snow renders the kart improperly.

As a multiplayer game it performs very well. As a modded game it performs very well, though it could benefit from a more integrated modding scene, i.e. either an integrated or a simple modding system. As it stands, it's just downloading a .pk3 file, placing it in the addons folder and playing the game. On startup, you need to enable all the relevant addons, and there should be an easier way to enable the addons you use on startup (I suspect there is, and I'm just unaware of it, as some default .pk3 files are used in the base game but considered "addons"). For multiplayer it's click and play: the client downloads all the relevant .pk3 files from the host. This does suffer one issue: a server with many addons means clients spend a lot of time downloading them. Beyond that there's no issue.
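On enabling addons at startup: I have not verified this for SRB2Kart specifically, but Doom-engine ports generally accept a -file parameter, so a sketch like the following may work (addon paths are illustrative):

```shell
# Untested assumption: -file is inherited from the Doom engine and
# loads the listed addons at launch instead of enabling them in-game.
./lsdl2srb2kart -file addons/mymap.pk3 addons/mykart.pk3
```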

For Arch users, I had a few issues getting it to run. The publicly maintained AUR package "srb2kart" fails to compile because the included header

#include "SDL_mixer.h"

is not found. To fix this, you will have to compile it from source. I'll list my steps below.

git clone https://github.com/STJr/Kart-Public.git   # Get the source code and dependencies as normal
yay -S srb2kart-data
pacman -Syu sdl2 sdl2_mixer
vim Kart-Public/src/sdl/mixer_sound.c               # We need to change an include header

in mixer_sound.c change

#include "SDL_mixer.h"

on line 31 to

#include <SDL2/SDL_mixer.h>

Build the game as normal

cd ../../
make -C src/ LINUX64=1                              # If using 32 bit linux the command should be LINUX=1
./bin/Linux64/Release/lsdl2srb2kart                 # If using 32 bit linux, use the Linux dir

After these build instructions have been followed you can have fun. These build instructions are what I needed under Arch Linux, so your mileage may vary.

=> Repository to git clone

It also seems to have dedicated servers. I might host one :^)

=> The game can be found for free here. Instructions for other Linux distributions are included.
=> Link to post, and comments here

pass

The only password manager that matters

What is pass? It's just a password manager. It works by storing each password as a gpg-encrypted file. This means they are encrypted and safe. It's also an offline password manager, so you have no hosting concerns. If you would like, you can host your gpg-encrypted passwords in a git repository, in the cloud, or on an online machine, and fetch them using rsync or some other file transfer tool.
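If you do go the git route, pass has built-in git support, so every change to the store becomes a commit. A sketch, with a purely illustrative remote URL:

```shell
# Turn the password store into a git repository; subsequent pass
# commands (insert, generate, rm...) auto-commit their changes.
pass git init
pass git remote add origin git@example.com:me/password-store.git
pass git push -u origin master
```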

There's a number of commands I will share here.

pass init gpg-id

This initialises a password store, which is just a place for your passwords to be stored, hence the name. This is where gpg encryption comes in: all the passwords are encrypted to the GPG key you name here, and decrypting them requires that key's passphrase. There is technically more involved that you can find out using `man pass`, as pass is really just a very nice and convenient frontend for gpg.

pass insert passwordName

This lets you add a password under that password name; you'll be prompted for the password. Unless I am adding a password that I already know, I usually do not use this, as the next command is far more useful.

pass generate passwordName 20

This generates a cryptographically strong password, 20 characters long, stored under passwordName. It's fairly obvious why this is good: it means you don't reuse the same password across online services, nor have to remember them.

pass ls

This gives you a list of all password names in the password store.

Now you know all the password names (not their values) in your password store. But how do you acquire a password? Simple.

pass passwordName

This will prompt you for your gpg passphrase and, if decryption succeeds, print the password. There is more you can do beyond the scope of this article, such as multiple password stores, and using more than one gpg key for your password encryption. There's even OTP (One Time Passwords, the timed 6-character codes) that you can use with it.
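For OTP specifically, the commands come from the pass-otp extension (an assumption worth noting: it is packaged separately from pass itself, and the entry name here is illustrative):

```shell
# Store the otpauth:// URI for a service, then generate codes from it.
pass otp insert website/2fa    # paste the otpauth:// URI when prompted
pass otp website/2fa           # prints the current 6-digit code
```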

A demonstration use is listed below.

#!/usr/bin/env bash

shopt -s nullglob globstar

typeit=0
if [[ $1 == "--type" ]]; then
	typeit=1
	shift
fi

prefix=${PASSWORD_STORE_DIR-~/.password-store}
password_files=( "$prefix"/**/*.gpg )
password_files=( "${password_files[@]#"$prefix"/}" )
password_files=( "${password_files[@]%.gpg}" )

password=$(printf '%s\n' "${password_files[@]}" | dmenu "$@")

[[ -n $password ]] || exit

if [[ $typeit -eq 0 ]]; then
	pass show -c "$password" 2>/dev/null
else
	pass show "$password" | { IFS= read -r pass; printf %s "$pass"; } |
		xdotool type --clearmodifiers --file -
fi

For reference, dmenu is a suckless utility. This script is not my own, but is zx2c4's script I am using as an example.

=> Passmenu source code.

This script takes the list of passwords, lets you select one in the interface created by dmenu, and when you select one, copies it to the clipboard so you can paste it in the respective password box for any account you need to log into. I also have this mapped to a keybind in DWM, "Mod+Shift+P", so any time I need a password, I can get it("Mod+P" is used for pausing music which I do more frequently than I get passwords).

As you can see, it follows the UNIX philosophy pretty well, making it very useful as a password manager. It dodges the issues of contemporary online password managers, which are centralised and very much freedom-restricting, leveraging your passwords to make money from you. The fact that it's gpg-encrypted means it's also good to use with cloud storage solutions. Overall, it's a tool oft overlooked by people.

=> Link to post, and comments here

Pman : Manual pages, formatting, roff and PDFs

Pman?

Not to be confused with pacman. It's a short script I wrote in order to view man pages in my PDF reader. Wait, what's a man page, I hear you ask?

man ls

This command, "man", will read a roff file and format it appropriately so you can read it. A roff file is just a plaintext file formatted in a nice way. How nice? Every change of formatting in your text goes on its own line, but this isn't an article about roff. The thing to know about roff is that it was used in the old UNIX days, and that man pages are written in it.

The script "pman" in question is this:

#!/bin/sh
man -Tpdf "$@" | zathura - ;

The flag "-T" tells troff what to convert the troff document into; "pdf" says to convert it to a PDF (I presume by converting to PS, then to PDF, but I may be incorrect). We then pipe it into a PDF reader, in this case zathura, as that's my preferred one.

Other options can be used for -T, including html, ps and dvi (not that you'd want to use the latter two). The HTML and PDF options are useful to me, as they mean man pages can be printed or uploaded to the internet as a reference. In fact, I wouldn't be surprised if this is how most manual pages on the web are made!
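For instance, a couple of hedged one-liners (whether the html device works depends on your groff build having grohtml available):

```shell
# Render the ls manual to other formats via groff output devices.
man -Thtml ls > ls.html    # an HTML version, ready to upload
man -Tps ls > ls.ps        # PostScript, ready to send to a printer
```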

Some points to clear up before I move on, and some alternative tools to mention. Firstly: groff? troff? roff? Roff is the original, followed by troff, followed by groff (GNU roff). The differences between them are minor, since they are just continuations and developments upon what was laid down previously. In fact, it's arguable that roff laid the foundation for programs such as Donald Knuth's TeX, which is used widely in academia.

The alternative tool is "info", another GNU program. I am not a fan of it, as it's effectively a full-on TUI for reading and searching manuals, as opposed to the simpler CLI that "man" is. Within Emacs, I am aware of a package known as "WoMan" ("WithOut Man"), which reads manual pages without the man program.

Do you want to learn more about the "man" command in zathura? Run this:

man -Tpdf man | zathura -;
=> Link to post, and comments here

Online Communities

I have decided to write this because I feel that a major flaw in a lot of social media is the problem of communities. A community is quite simply a group of people sharing attitudes and interests in common. Sure, something utterly meaningless can be this common interest: memes, an e-celebrity. However, these communities often seem exploited to some end.

Memes can be forced, and E-Celebrities only want a community for a paycheck. A recent example is the backlash to this.

=> SNL Gen Z hospital.

In some cases, communities have such huge, sweeping, fundamental differences that it's hard to really call them a community. One such example is the Linux community. Effectively the only common element among them is the use of the Linux kernel. Init systems, package managers, desktop environments, window managers, text editors, utilities like ls and dir: all these elements differ. As a result of sharing only one common interest, this community is massively fragmented.

This fragmentation is in many ways good as it's invoking the very freedom of expression so critical to these freedom-promoting projects.

In situations with little fragmentation, you see an issue in this regard. The anime community is one such example: a great majority of anime is so uniformly bad that there's not really much to fragment on, which naturally means anime as a whole tends towards exaggeration rather than reality. Movies are another example. You especially see this in individuals and brands elevated and deified by the masses, such as Elon Musk, and plenty of e-celebrities.

Then again, this fragmentation or lack thereof is also partly platform-driven. Within chatrooms, the only method of disapproval is a text message, and the only "algorithm" is the most recent message. Within forums, disapproval is expressed with points. The same is done on Reddit and YouTube, where bump-based, recency-based ordering is done away with in favour of complex, mysterious systems.

As a result, it's pretty well known that these platforms thrive on a lack of fragmentation. A lack of fragmentation is a status quo. A status quo is predictable. If the actions line up with the predictions, the predictions can be monetised. If they can be monetised, they will eventually be monetised. As a result, the data is sold as a means of targeting and predicting.

This fragmentation and lack of fragmentation is seen in culture too. The fragmented groups are usually a response with a cultural basis. GNU is a cultural response to the proprietary UNIX systems. Punks and Anarchists are a cultural response to the two-party political status quo. The groups with a lack of fragmentation are usually the status quo groups. Windows is one example. Two-party systems are another(though that's a matter of game theory).

This post is to point out that communities have a context and can be fragmented; that fragmented communities invoke freedoms more, and as a result of demanding and establishing their rights and freedoms, are in actuality more free. It's also to point out that the lack of fragmentation is driven by the context of the platform, and that this lack of fragmentation is frequently exploited to make a quick buck off predictable echochambers.

In some respects, I feel sad about it. Regardless, the benefits of a *good* community shouldn't be understated. You get new ideas, new information and material and emotional benefits as a result.

As a final addendum to this, some people define communities to be people who live near each other, or who have met before. This is a poor definition in my opinion, but there is a discernible difference in value between online friends and friends you have met in person, and a lot of it comes down to game theory and the social behaviour of Humans. With online friends, a prisoner's dilemma plays out, where a betrayal is often massively rewarding as there are no further interactions. With friends met in person, the Human desire and interest in socialising applies. In my opinion it is a massive disservice to the full abilities of a Human if they are reduced down to plain text, but that disservice brings its own context, and it often rears its ugly head especially on platforms like Twitter, whose small character counts result in curt, dismissive responses that appeal to an international audience. I may write a bit later about the difference in value between a friend that exists only online and a friend you have known in reality.

=> Link to post, and comments here

YANChan.xyz

What is it?

YANChan.xyz is an imageboard for discussion of a bunch of topics. For now, topics include art, videos, games, technology and random.

It is available on the large web at YANChan.xyz

=> https://yanchan.xyz

Why is it a thing?

Facebook, Twitter and most forums are like-based. This makes content circulation algorithm-driven and often reinforces uniform agreement and echochambers.

Some forums and imageboards are bump-based. This means threads that see a bunch of discussion are seen more, so visibility doesn't require agreement. In addition, imageboards are anonymous by default with opt-in verifiable identities. This promotes discussion of touchy topics, and makes criticism easier to dish out without any significant fear of censorship other than by the platform owner (myself).
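
For the curious, bump ordering is trivial to express. A minimal sketch in Python (the thread data here is made up for illustration):

```python
# Hypothetical thread data -- only the last-post timestamp matters for ordering.
threads = [
    {"title": "art thread", "last_post": 1_700_000_000},
    {"title": "tech thread", "last_post": 1_700_000_500},
]

def bump_order(threads):
    """Newest activity first: any reply 'bumps' a thread to the top.
    No likes, no points, no recommendation algorithm."""
    return sorted(threads, key=lambda t: t["last_post"], reverse=True)
```

That's the entire "algorithm": a sort key. Compare that to the opaque ranking systems of like-based platforms.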

The issue imageboards suffer from is that at scale they become shocking and provocative, as that is what gets discussion. At a small scale, for a small community, these issues aren't present.

The other issue forums face comes down to a lack of anonymity. In convincing a person of an argument, ethos, pathos and logos are to be considered: logos being the logical reasoning, pathos the emotional string-tugging, and ethos the credibility and reputation of a person. Forums are commonly filled with pathos- and ethos-based arguments, as arguments are easily attributed to a person, so emotions are at risk, and credibility and reputation are also at risk.

As such, I deem the best social media solution is a bump-based system at a small scale.

Go hang out and make some posts if you want. :^)

=> Link to post, and comments here

The problem with Bitcoin

Introduction: What is a Bitcoin anyway?

Bitcoin has been skyrocketing in price recently, so I thought I'd write a little about its problems. First, an introduction to what Bitcoin is. Bitcoin is a blockchain that records transactions in a trustless manner. As a result, anyone can download the entire blockchain and view it and all transactions that have occurred between all wallets. How is the blockchain constructed? In Bitcoin's case, it uses a proof of work system. Proof of work is the idea that adding a block to the chain requires finding a hash with specific qualities (in Bitcoin's case, a number of leading zeroes in the hash value). Finding some value to add to the block that produces such a hash is computationally expensive, and to this day there exists no better method than going through possible inputs and searching for a valid result. Once a valid result is found, the miner who mined that block gets a reward.
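
A minimal sketch of the idea in Python (simplified on purpose: real Bitcoin double-SHA-256es an 80-byte header against a compact target, not a count of leading hex zeroes):

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Brute-force a nonce until the hash starts with `difficulty` zero hex digits.
    There is no shortcut: we just try inputs one by one."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra required zero multiplies the expected search time by 16.
nonce = mine(b"example block", 4)
```

Verifying a result is one hash; finding it took thousands. That asymmetry is the whole trick.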

The purpose of mining is to put work into a blockchain and its blocks. The blockchain with the most work put in is the accepted blockchain, which is where the theoretical 51% attack comes from: if a person owns more than half of the computational power for mining blocks, they effectively set the accepted blockchain. Blocks are fixed-size and store a list of transactions to add to the blockchain. Additionally, there is an incentive for miners to prioritise certain transactions based on the transaction fees people pay to miners. I have linked a video for further understanding if my explanation fails to satisfy.

=> A video by 3Blue1Brown going through it in detail from first principles

Now that we have established how Bitcoin works, let us walk through the issues it has.

Issue 1: Fixed Block Sizes

This issue is simple. Bitcoin's blocks have fixed sizes. There have been proposals to increase this fixed size, as well as to have it increase dynamically over time. Why is this an issue? It effectively bottlenecks transactions. Only 1MB worth of transactions can be transacted in each block on the network. This massively restricts how many transactions can be made compared to alternatives like paper cash, gold or PayPal. It's quite a simple problem to state, but the solution isn't clear.
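
To put numbers on the bottleneck (the average transaction size is an assumption; real sizes vary):

```python
BLOCK_SIZE = 1_000_000      # bytes: the 1MB limit
AVG_TX_SIZE = 250           # bytes: a rough assumed average transaction
BLOCK_INTERVAL = 600        # seconds: one block roughly every 10 minutes

tx_per_block = BLOCK_SIZE // AVG_TX_SIZE       # 4000 transactions per block
tx_per_second = tx_per_block / BLOCK_INTERVAL  # under 7 transactions per second
```

A handful of transactions per second, globally, is the whole pipe. That's the bottleneck in one division.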

=> The Block Size Limit is discussed in detail on the wiki.

Fixing this for Bitcoin would be difficult in my opinion. The fix as I see it is a more dynamic solution that changes according to transaction demand. Even with this solution in place, it only guards against surges in transaction demand; it doesn't fix the core issue. That issue being the block size transaction limit itself, as block sizes would still need to be clamped to reasonable ranges to allow people to keep running their own node for the network.

Issue 2: Proof of Work

Proof of work is a solution, but it has a few issues. The first is that the electricity put forward isn't of much use to society; it's only useful to the set of people who have put work into that network. I will say this is a moralfag opinion, but considering the nature of current worldwide electricity generation and global warming, I think it must be stated.

There are other issues too. The problem chosen by Bitcoin for its proof of work system is hashing, which can be handled effectively by GPUs and ASIC machines, but this poses a supply problem for legitimate consumers needing that hardware. It also makes the network less egalitarian, as through no fault of their own, the poor can't establish their own stake in the network very well (effectively filtering out those who can't afford good GPUs or ASICs, which cost thousands of dollars).

The final issue with proof of work exists only in theory. I have mentioned it earlier: a 51% attack could theoretically occur, when one party owns the majority of the computational power of the network. This is a problem because they can then run the hashing algorithms faster than everyone else combined, validating correct blocks and even validating incorrect ones. As the accepted blockchain is the chain with the most work put into it, this lets them displace the honest blockchain.
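
The "most work wins" rule can be sketched like this (modelling each block's work as 2**difficulty is my simplification; Bitcoin derives work from the compact target encoded in the header):

```python
def chain_work(chain):
    """Total work in a chain; each block is modelled as 2**difficulty hash attempts."""
    return sum(2 ** block["difficulty"] for block in chain)

def accepted_chain(chains):
    """Nodes accept the chain with the most cumulative work -- not the longest chain."""
    return max(chains, key=chain_work)

honest = [{"difficulty": 20}, {"difficulty": 20}, {"difficulty": 20}]
attacker = [{"difficulty": 21}, {"difficulty": 21}]  # fewer blocks, but more total work
```

With majority hashpower an attacker can keep their chain's cumulative work ahead indefinitely, which is exactly why the attack works.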

Issue 3: Privacy

This comes in two flavours. Personal and Systemic privacy. I will discuss the personal case first. Almost all places to purchase cryptocurrency with money require ID. This means that the addresses of Bitcoin wallets and who owns them can be known, and the transactions made can be monitored. The amount of money they have in Bitcoin can also be known. This presents a bunch of issues, firstly that Governments can have a greater deal of information to monitor and regulate the flow of money in the network(far more than they can with cash or with a credit card). In reality, Bitcoin is pretty much the wet dream for any authoritarian Government, as the flow of cash and transactions can be traced, and the owners of that cash can be traced.

This is bad because people know how much you have in Bitcoin. This is also bad because your transaction history is complete and public, so anyone can find out that you purchased contraband like weed or other such things. It also makes it much easier to tax when everything is so transparent. In this respect, Bitcoin's transparency comes at the cost of privacy.
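
Tracing is genuinely this easy once addresses are tied to identities. A toy sketch over a made-up public ledger:

```python
# Made-up public ledger entries: (sender, receiver, amount in BTC).
ledger = [
    ("exchange", "alice", 1.0),   # KYC'd purchase: "alice" is now a known identity
    ("alice", "dealer", 0.2),
    ("dealer", "mixer", 0.2),
]

def downstream(address, ledger):
    """Every address that money leaving `address` eventually reaches.
    On a transparent blockchain, anyone can run this."""
    reached, frontier = set(), {address}
    while frontier:
        frontier = {r for s, r, _ in ledger if s in frontier} - reached
        reached |= frontier
    return reached
```

One ID check at the exchange, and every later hop of those coins is attributable, forever.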

In the personal case, you're affected because this information is transparent and easily monitored by a central authority (your government). In the systemic case, you're affected because it enables regulation and relates your address to your identity.

This entire problem still isn't solved, but it is much better addressed by cryptocurrencies such as Monero and other "privacycoins".

Conclusion

A lot of people invest in Bitcoin simply because others invested in it. Bitcoin has technical issues that are far from solved, and that other cryptocurrencies handle and solve better. I will list the coins I think are worth watching: Ethereum, as it generalises the blockchain to contracts; Monero, as it provides much better privacy; Cardano, as it avoids the issues of proof of work by using proof of stake. There are others with higher transaction volume, like Litecoin.

Ultimately, I expect Bitcoin to be dumped eventually in favour of technically superior cryptocurrencies that have more merit and qualities.

=> Link to post, and comments here

Tech Portfolio

Hey guys, I set up a tech portfolio to show off my projects. It's on the HTTPS web.

=> Here's the link!

:^)

=> Link to post, and comments here

The modern web is bloated

This will be a short blog post, explaining in detail why I think the modern web is bloated.

The problems

Website sizes. Alright, settle down zoomers, with your gigabig chungus internet speeds. A lot of people still have very slow internet connections that cannot handle over a megabyte of network activity. I can sympathise with the webdev that has to add ads or trackers on account of their employer's demands, however I will go on record saying that these 2 things are exactly where web speeds throttle. You have lots of useless Javascript getting loaded, and uncompressed ads and pictures that aren't even properly resized, among other issues. It's a MESS, and it's precisely why I use Brave Browser (it filters most of this shit, but not all).

=> Luke Smith demonstrating the average size of Chicken Parmesan recipes. Each chicken Parmesan can store DOOM at least 3 times if not more!

Processor speeds. You know how you use useless Javascript to add ads that contribute nothing to your website? Yeah? Well it has to do its processing somewhere, so it does it on your CPU. Each browser handles this slightly differently, so performance differs between them (as an example, Vivaldi and Qutebrowser are slower than Brave in this respect). The problem here is that if an i486 can run DOOM, it should be able to run your website, and for the VAST MAJORITY of websites this is almost definitely not the case. Remember, a lot of people also don't have the processor needed to handle these websites. Raspberry Pis and low-resource machines (which are getting more and more common due to the Internet of Things) get utterly destroyed by the processor requirements.

Anti-features. The original point of HTTP is to be a Hyper Text Transfer Protocol, and the original point of HTML is to be a Hyper Text Markup Language. People think that, on these 2 points, it's fine to manipulate the document all over the place. As an example of anti-features: when I search something on Google, click the first link, and find it's trash, I back out to the list of links and click the 2nd link, but OH NO WAIT, anti-features, engage! It props up a little window under that link with "Also searched" and a bunch of shit I didn't want to click. On this point, it's also very hard to copy and paste content properly. Quite simply, people implement dynamic shit, and I theorise it's front-end idiots who think the only way to justify their job is to keep making things flashier and more dynamic. NO, that's how you piss me off. Another point on this is websites that override the natural scrolling system into some vomit-inducing, motion-sick mess.

Accessibility. This one's simple. Firstly, FAR TOO MANY WEBSITES ARE SCREEN-READER UNFRIENDLY. Holy fuck! How did we even get to the point that presenting fucking text is too much for a screen reader? Quite simply: because people use HTML (structure and content) as a way to force style. You should only do the minimal structuring required for the sake of style. Another point on accessibility is that far too many people think low-contrast gray text on a light gray background looks good, when it's hard to read.
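
The low-contrast complaint is even quantifiable: WCAG defines a contrast ratio from relative luminance. A sketch (the example colours are my own):

```python
def luminance(rgb):
    """Relative luminance of an sRGB colour, per the WCAG definition."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(fg, bg):
    """WCAG contrast ratio: from 1:1 (identical) up to 21:1 (black on white)."""
    hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

# Gray text (#999999) on a light gray background (#eeeeee):
# well below the 4.5:1 minimum WCAG AA asks of normal body text.
ratio = contrast((153, 153, 153), (238, 238, 238))
```

If your colour pair scores under 4.5, you're making your readers squint for no reason.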

Implementability. This one's simple: the standard for HTML is now so insanely complex that only 3 viable web engines exist for rendering HTML pages (okay, yes, there are more out there, but they are unusable for any site more complex than an imageboard or a normal forum). I'll list them now: WebKit, Gecko, and Blink (the Chromium engine). First off, just fucking cut out WebKit, it's lagged behind quite a bit. Gecko and Blink are the only 2 engines worth using on the modern internet, and that's why almost all web browsers use them. (As an aside, I said there are more engines; Lynx and w3m both use their own, but they only display text in a terminal, and usually it's utterly broken for most sites on the same basis as screen readers being broken. It's a pretty good way to tell if a site is broken, though.)

The solutions

Stop writing shitty websites that pull a thousand JS dependencies through a convoluted CDN to deliver anti-features to people and consume their bandwidth, mobile data and processor time (those last 3 are also responsible for major mobile phone battery drain). Write your HTML, write your CSS, and write your PHP or CGI. This is how it was meant to be done, and this is the way that will stand the test of time, rather than outdated, laggy, glitchy Javashit. Of course, most people won't adopt this method, as most people use shit like Medium or Wordpress or, God forbid, some other trash like it, to make websites and articles.

I'll also point out other protocol alternatives. Neither is viable commercially, but for hobbyists they may be of interest. Gopher is pretty much plaintext, with restrictions of 80-column width, no word wrapping, and a directory link structure. This is pretty outdated, and its flaws are addressed by Gemini, which handles word wrapping and uses a markdown-like text format. The protocol (read PROTOCOL, not the text format) is close to being frozen if it's not already, and that's actually a good thing. It means that people will be able to make clients and servers for Gemini for years to come in a short time span, and it means the protocol is relatively inextensible, so people won't be able to extend it with useless shite (as has happened with HTTP). Also look at IRC: that chat protocol is so simple that it has stood the test of time FAR BETTER than any proprietary or extended protocol.

There's pretty much no easy, clean solution. One option is to stop using HTTP, and the other is to enforce good practice of HTTP (ironic how few people follow good practice because of how advertising has made the modern web terrible). Also, here's a game to try. Take a webpage that is obviously bloated and convert it into a website with HTML, some limited CSS, and the images that are useful, and see how much less bandwidth it uses compared to the original. What? It's 10x+ faster than the original? UNBELIEVABLE! SHOCKING! TERRIFIC! Now go and put those gains into practice on your live sites.

=> Link to post, and comments here

RSS ported to work again

Hey. I got the RSS feed to work with my Gemini blog now. This means that between my Gemini and HTTP pages, you should be able to subscribe to my feed. I will probably update my index with this information.

I will provide information on my script, even though it's shit. It's just chaining cat, echo, sed, tail and head commands.
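
The item-building step my shell pipeline does can be sketched in Python; the field values here are placeholders, not my real feed:

```python
from email.utils import formatdate
from xml.sax.saxutils import escape

def rss_item(title: str, link: str, description: str) -> str:
    """Build one RSS <item>. Escaping &, < and > is the part sed pipelines get wrong."""
    return (
        "<item>"
        f"<title>{escape(title)}</title>"
        f"<link>{escape(link)}</link>"
        f"<description>{escape(description)}</description>"
        f"<pubDate>{formatdate()}</pubDate>"
        "</item>"
    )

item = rss_item("Cats & seds", "https://risingthumb.xyz/example", "placeholder post")
```

Wrap a list of these in a `<channel>` inside `<rss version="2.0">` and most feed readers will take it.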

I'll post sources later on. For now, I will need to use it a bit to do some testing to understand what flaws it currently has.

Strange is the night where black stars rise,
And strange moons circle through the skies,
But stranger still is
     Lost Carcosa
- The King In Yellow
=> Link to post, and comments here

Site is now better on Mobile

Short blog post. I have slightly modified my Gemini to HTML script to include a meta tag that makes pages scale properly on mobile devices. I *think* I have also applied this same change to the UTC news archive so it should be easier to read on a mobile device.
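
For anyone doing the same to their own generated pages, the change amounts to injecting one standard meta tag. A sketch (my actual script is shell-based; this is just the idea):

```python
VIEWPORT = '<meta name="viewport" content="width=device-width, initial-scale=1">'

def add_viewport(html: str) -> str:
    """Insert the viewport tag right after <head>, unless the page already has one."""
    if 'name="viewport"' in html:
        return html
    return html.replace("<head>", "<head>" + VIEWPORT, 1)

page = add_viewport("<html><head><title>post</title></head><body></body></html>")
```

Without this tag, mobile browsers render pages at a fake desktop width and scale them down, which is why unstyled pages look tiny on phones.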

I'll also note at the bottom that my priorities with this site will be first and foremost towards being readable on desktop platforms, as that's where people who generally aren't braindead are; phones are effectively time-wasting novelties that contribute nothing of worth.

=> Link to post, and comments here

Richard Stallman Situation

Richard Stallman resigned from his posts at both MIT and the FSF. The basis of this was statements he made, in generally poor taste, regarding child sex. Poor taste because of the recent events of the Jeffrey Epstein sex trafficking scandal. Recipients of Epstein money and Epstein prostitutes included other people at MIT, but not Richard. As a result, MIT wanted this deal over as soon as possible.

Despite all this, Richard made points against Jeffrey that the public at large misconstrued as a defense of Epstein, due to his semantic discussion of child sex laws. I think Stallman's opinions on child sex laws are pretty stupid, but the full extent of the term is variable. A 17-year-old woman presenting herself as entirely willing for sex is labelled as sexual assault in much the same way that a savage raping a 10-year-old child is, and on those 2 points one can see that "sexual assault" covers a large umbrella of events and is in many ways a poor description. This same argument can be applied to many other umbrella terms, such as security, freedom and privacy.

Do not mistake my support of Richard Stallman in the current circumstances for support of his opinions. Although I see the issue presented, it's just legal jargon. The main discussions are on the expansion of the meaning of sexual assault, which is primarily a discussion of semantics and other legal pedantry, and on whether a child can be an entirely willing sex participant. I hold the belief that they can't, mainly because although they will typically have an understanding of what sex is, they won't have an understanding of the full consequences of sex in the form of STDs, psychological issues and mental issues. There's a reason a lot of prostitutes end up having "Daddy problems". Richard Stallman argued at one point that sex before the age of consent wouldn't cause issues if it wasn't coerced or imposed, although since his resignation he has withdrawn that opinion on the basis that he had personal discussions with others and learnt it does have an effect. I would take a moment to remind people that playing devil's advocate and discussing policy is not an endorsement of the actions a person discusses. Thus, claims of pedophilia simply don't hold.

Other points commonly brought up against him include him asking women out on dates straight up, without any preamble or other bullshit. Most declined, and some use this as ammunition against Stallman to claim he is a sexual deviant. Stallman didn't make repeated requests of anyone; he simply moved on, as he grasps consent. As both sides were entirely respectful of the other's consent or absent consent, I am completely uncaring of this point. Some claim it's sexism. I claim that's stupid, and a plainly false accusation of sexism.

Among other points are his opinions on eugenics. Stallman has gone on record saying that a fetus should be aborted if it has Down Syndrome. I agree. The majority of Doctors agree. Mild genetic illnesses, including deafness and blindness, are where arguments of eugenics are harder to justify, as technology has improved rapidly. In the extreme cases I agree, and in the less extreme cases I do not think the fetus should be aborted. Still, this is a question left to context. Women abort perfectly healthy fetuses simply because they cannot sustain another child in a stable family, and that's a perfectly respectable reasoning; as such, I see no reason why arguments of eugenics, given the appropriate context and absent politics or religion, shouldn't be discussed. I say absent, because most religions are pro-life, and because some politics place preference on particular genetics, such as the Nazis and Aryans. Personally, I'm indifferent, and it's especially a non-problem in the case of Stallman, as he has gone on record saying he won't have children. This is the basis of the claim that Stallman is ableist. It can be argued, but you would have to apply the same argument consistently to all abortion cases and all medical advice given by Doctors, and that is simply ridiculous to hold. Thus I think it doesn't hold.

The only other remarkable point brought up is apparent transphobia. This argument simply does not hold. Stallman has made a proposal for more gender-neutral use of Spanish (those who know Spanish will know there are masculine and feminine words). He has also applied his discussion of language to gender neutrality, making the claim that "they" shouldn't be used and that other words should, on the basis of ambiguity, as "they" is often used in a plural situation. The accusation of transphobia does not strike me as having any supporting evidence, so I see it as plainly false.

I will remark shortly on the reason for this post. Richard Stallman is back on the FSF board. There are people who would prefer to see him gone from the FSF despite his absence of a year and a half. I am not one of those people, as I am generally agreeable to policies, principles and ideals that lead to liberty and independence, and Richard Stallman fights for these ideals and principles within the software environment. From the basis above, I see him as entirely reasonable, well thought out, and deserving of respect for his unyielding principles. Sure, sometimes his arguments reach stupid conclusions, but stupid conclusions aren't endorsements or intentions, and stupid conclusions are part of every person's right to be wrong.

It's also why Richard Stallman is involved with software, and not with policy. That's completely ignoring that a lot of the participants in politics are either declawed animals, or two-faced actors serving corporate interests.

Of note, there are two letters. A support letter and an open letter against the FSF. The latter wants all the board members to resign or be cast out. The former asks for the current situation to stay the same.

I support Richard Stallman, and Libre Software; I most definitely do not support unchecked demonisations of reasonable people, or mob-defined digital witch hunts, spurred by the curt and mean-spirited twitter or other social media hurricane.

I have signed my name(Aaron Leonard) on the support letter, as an act of support for free software, and proper discussion form. I have linked below the two letters, one in support of RMS, and the other against him. Look into the discussion yourself if you would like.

=> Support RMS Letter
=> Open Letter against RMS

Do not mistake my support of RMS for support of sexism, transphobia, sexual assault, or pedophilia. It isn't, and I abhor all of them. I simply do not see how the accusations stand, and judge most of them false. The only thing Stallman is guilty of is playing Devil's Advocate too much, and making statements in poor taste.

=> Link to post, and comments here

Who games on Linux? #2 Loop Hero

Game info

- Name: Loop Hero

- Genre: Hybrid of card games, roguelite, city builder and idle clicker genres

- Demo: It has a demo on Steam

- Linux: It has a native Linux build. Works well

Thoughts

Earlier this week I picked up Loop Hero after hearing about it from HexDSL. This is an incredibly interesting game, both to play and from a design perspective. I'm rarely a fan of idle clicker games or city builder games, and don't play many card games, but this just hits the sweet spot. The way the game works is that there is a loop of tiles, and you go around it. There's a day timer and a count of completed loops; the latter counter scales equipment strength and enemy strength, and the former triggers tile effects.

When you fight enemies you have a chance to get a card (from a deck you build, so you can make card synergies), and you have a chance to get equipment. The equipment has plain statistics like health, defense, etc., but it also has more interesting statistics like vampirism, counter and evade, among others that can be used to make a character build stronger. The fights play out in an idle format, but you can't completely idle, as you need to be swapping out armour and also placing tiles.

When you place tiles, you usually get resources. Tiles can also change if specific tiles are next to other specific tiles, and as a result you need to think about what kinds of tiles you want on your board. They also have placement rules, so that has to be thought about- and they usually have an effect on neighbouring tiles.

It's a really interesting dynamic of building a character, placing tiles that spawn enemies you want, or create dynamics with other tiles you want.

The game also doesn't asphyxiate itself in its introduction of content, and interesting content at that. There's plenty here to play, and there's plenty of card synergies. This is one of the things that free-to-play games rightfully do well, as they do need to make an income.

I also haven't mentioned much about resources. Resources are taken to your home camp so you can build it up in a "clash of clans" style home base, where after each expedition, you gain more resources- or providing resources for your next expeditions. This plays into the roguelite aspect as you can gain new cards for your deck- however this resource management also plays into the gameplay in expeditions. Whenever you die, you take 30% of your resources back to base. Whenever you retreat back to base anywhere in the loop, you take 60% of your resources back to base. Whenever you're on your home camp, or the boss of that expedition has been summoned(bosses are summoned by placing enough tiles) you can take all of them back with you- so you have this dynamic of risk-reward... should I do one more loop, or make it back with what I've got?

It's a marvel in design showing what happens when you thoughtfully take the best elements of different genres and put them together to make an entertaining, relatively chill experience. The best bit is that it's not brain-numbingly boring as is a common experience I have with either city-builder or idle-clicker games.

On other notes, the art style is very nicely done using the Commodore-64 colour scheme and excellent art- giving the game a nice and bleak aesthetic fitting for the narrative. The narrative, as I've played so far, doesn't seem to be anything big or interesting. I guess it's like the classic Carmack saying "Story in a Game is like Story in a Porn Movie: It's expected to be there, but not very important". Music and Sound design isn't really anything spectacular. It loops nicely and doesn't grate on your ears, but that's after playing for 5 hours. I wonder how much longer it will be before the music gets annoying, as it is just looping music.

Anyway, this game runs well on Linux, and is a blast to play. Go check it out if you haven't already as there's a demo for it.

Final Information

- Played Version: V1.012

- Linux Compatibility: Native

- Hours Played: 5.6 hours

- Will I return to it: Yes(I haven't completed it at time of writing), and yes if more content is added

=> Link to post, and comments here

Who games on Linux? #3 NERTS! Online

Game Info

- Name: NERTS! Online

- Genre: Solitaire style Card Game

- Demo: N/A, it's a free game

- Linux: Native Linux build

Thoughts

Free game from Zachtronics. A lot of fun, like most of their games.

It runs well. The only 2 issues I faced are as follows. With DWM, if you use a border for your windows it doesn't fullscreen properly, leaving a blank window; easily fixed by toggling borders. The other issue is fundamental to the way the game does networking, as it seems to use TCP networking. Any lag experienced by you or the host will affect you negatively, resulting in late "clicks" and the game feeling unresponsive. If the players live in the same country this won't be an issue, but if they are abroad and your latency is around 300ms, you will feel it.

As for the gameplay, it is wonderful. Easily picked up, as most players know Solitaire. The UI isn't cluttered, so you can easily jump into a game or change settings. The main differences are in how it does scoring and in the NERTS pile. The NERTS pile is a selection of face-down cards that have to be revealed and used one by one in order to reveal the NERTS button. Hitting the NERTS button ends the round, and each player gets -2 points for every card left in their NERTS pile. You score +1 point for each card you place in the "points" stacks in the middle, which are very much like the 4 suit stacks in Solitaire.
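
The scoring described above fits in one line; a sketch (the function name is mine):

```python
def round_score(center_cards: int, nerts_cards_left: int) -> int:
    """+1 per card placed on the central 'points' stacks,
    -2 per card left in your NERTS pile when the round ends."""
    return center_cards - 2 * nerts_cards_left

# A player who banked 10 cards but had 3 left in their NERTS pile nets 4 points.
score = round_score(10, 3)
```

That -2 penalty is what makes racing to empty the NERTS pile worth the risk.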

As for your individual gameplay, it's like a more constrained Solitaire, where you only have 4 to 6 usable piles and a shuffle deck, and you play it like regular Solitaire. The difference is in how scoring is done, so hitting NERTS quickly is rewarded, but you can also play normally and optimise for score.

It's a lot of fun, I highly recommend it to anyone.

=> Link to post, and comments here

Mass Surveillance at scale has resulted in murder

=> I read about this originally here. Drew DeVault did an excellent piece about it.
=> I originally discovered this article on his Gemini page. Go read it with Gemini, since it's so much better!

But I just want to step back and ask: how did it get to this point in the first place? Advertising technologies are now used as intelligence for more than just psychological manipulation of the masses. They're used as an attack vector on individuals.

Sure you can make the claim "gotta keep bread on the table", but in doing so you are complicit in the murder to some degree. By your use of such technology you support a bad status quo, no matter if it's simply a more networkable option or more functionally useful option. You are complicit by supporting such a status quo, and providing power to that network.

Your software choices have moral and ethical consequences. If not on an individual level, at the very least on a systematic level. This can also be seen in the 2 choices that the "alt" political groups seem to take, when censored on mass centralised data collection mediums. They move to another mass centralised data collection medium that appeals specifically to that audience(*ahem* in what is effectively a self-masturbatory echo chamber) or move to federated choices where both censorship and network is more decentralised.

Regardless of your software choices, this rabbit hole goes deeper when we consider how thoroughly drenched in poison the current state of the modern Web is, rife with trackers, poorly sized autoplaying and bandwidth-hogging trash advertisements that pay for their existence by bribing morally and creatively bankrupt scum to abuse advertisements. No matter, it's the default financial language of the current sad state of social media probably supported by how powerful data is for targeting(as has resulted in the sad state of targeting individuals by purchasing data).

Data is a deadly and powerful financial weapon. If you own the data, you effectively own easily mined statistics that can be weaponised against your users: psychological operations via targeted advertisements, and physical operations that injure their way of life. It should not be treated as lightly as it currently is.

And yet developers continue to treat it lightly, in all these shit dead-end startups and dead-end companies that will go belly-up, leaving behind as collateral damage the needless data tables of uncaring "I just want to put bread on the table" developers.

Hell, you've probably already suffered collateral damage through leaked passwords, as a lot of sites ignore basic security notions such as storing salted hashes of passwords(plain unsalted hashes are unsuited due to the existence of rainbow tables).
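As a sketch of what those "basic security notions" look like in practice, here is salted password hashing with Python's standard library. I use PBKDF2 here; dedicated schemes like bcrypt or Argon2 are also common. The function names are my own:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest). Store both; never store the plain password."""
    salt = os.urandom(16)  # a fresh random salt per user defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

Because every user gets their own salt, two users with the same password end up with different digests, and a precomputed rainbow table is useless.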

It's an abysmal state of affairs. Naturally, I wouldn't be too surprised if anti-citizen military groups such as the NSA, CIA, GCHQ or SIS have already weaponised tools based on these mass data surveillance piggy banks. It'd be foolish not to think so, as only a decade ago, the Snowden leaks proved that the NSA and related "Glowies"(a satirical name for abbreviated security groups, originating from the schizophrenic mind of Terry Davis) were well ahead of the trend with their tools.

How far ahead of the trend are they? Well, the reason the WannaCry ransomware attack was such a big deal is that the NSA hoarded a stash of zero-days affecting most Windows operating systems under the sun, to thoroughly devastating effect.

I've rambled a bit. I doubt this is the first time data collection methods have resulted in actual death; it's part of the intelligence-gathering objective most spy agencies have, an objective made significantly easier by the devastating commercialisation of data gathering.

I will assume that you, dear reader, use data-gathering software and as a result are complicit in such activities. Windows, Twitter, Discord, Faceberg and a helluva lot of trash media. Want alternatives? Linux, Mastodon, Rocket.chat/Qtox/Mumble/IRC. I have no alternative for Faceberg as it's utterly useless, and if you use it, you should feel a deep contempt for your younger and current self for using it.

I would also say that in general *ANY* social media with like systems or similar should be seriously reconsidered, as such systems turn every argument into a poorly-thought-out sensationalist contest with prominent social indicators of which side is "winning" or "losing". That defeats the very point of arguing, which is to contrast and discuss ideas with a person of completely different experiences and different knowledge in their grey matter.

As this article holds some speculation I will leave a classy quote from the great Terry Pratchett. Just remember your hands are stained with the gore of your choices.

Some think this is paranoia, but it isn't. Paranoids only think everyone is out to get them. Wizards know it.
=> Link to post, and comments here

Perfume: The Story of a Murderer (Book)

This is a book I read recently. By recently, I mean I ordered a paperback of it, and when it arrived I consumed all 260 pages of this wonderful fiction in 2 days. The only fault I find is the ending, which feels quite rushed, although that only affects the last 20 pages or so. The rest was wonderful to read.

The story follows Jean-Baptiste Grenouille, a man born with no odour of his own but with the most capable nose for smelling scents, deconstructing them, and constructing new combinations of them. This is of note because very few books(fiction or non-fiction... not that non-fiction would even attempt it) invoke the sense of smell so thoroughly throughout. It's honestly very interesting and engaging simply because it's a masterclass in the old adage of literary writing: "Show, don't tell".

It covers the entire life of the man, from birth all the way to death, and all his actions in between, delivering a range of feelings from disgust at the warped eroticism present, to a misanthropy towards human behaviours and motives. It takes place in the middle of the 18th century, so you get an interesting interpretation of pre-revolution France. It also aligns relatively well with expectations, so immersion comes easily.

I won't spoil anything here, but Patrick Süskind delivers a truly brilliant narrative throughout, and I would urge anyone who enjoys fiction that truly "Shows" how a huge variety of objects and things smell, and how scent relates to human behaviour and the other senses, to read it. It was a wonderful 2 days I spent immersed in the book.

Also, as a note for people reading: I didn't watch the movie version. I may or may not watch it... if I do, it's unlikely I'll post back here about it unless it's something worth talking about. I find, personally, that the vast majority of movies don't do much for me. They have to cover too much ground in too little time, which is made painfully obvious by watching Blade Runner and comparing it against its source material "Do Androids Dream of Electric Sheep?", although Blade Runner stands on its own legs on its own merits.

I'll also note here that I read John E. Woods' English translation of the original German. It doesn't mean too much, but if you do read it or acquire it through some means, make sure you get a decent translation.

It's a good fiction and a good time. I recommend it.

=> Link to post, and comments here

The iterated Prisoner's Dilemma

Introduction to the Prisoner's Dilemma

If we are to discuss the iterated Prisoner's Dilemma, we must first discuss it with 1 iteration. The idea in a single iteration is that you and another convict each have 2 choices available, giving 4 possible outcomes. For simplicity, we will name the prisoners Cain and Abel, and the options Betray and Cooperate. If Cain betrays and Abel cooperates, Cain goes free and Abel serves 3 years(and vice versa). If both betray, they both serve 2 years. If both cooperate, they both serve a year. I will present this with the below tree.


Cain             Abel
--------------------------+
          /--> Cooperates | C=1, A=1
Cooperates                |
          \--> Betrays    | C=3, A=0
                          |
     /-------> Cooperates | C=0, A=3
Betray                    |
     \-------> Betrays    | C=2, A=2
--------------------------+

From the above, we can see that whatever Abel does, Cain serves less time by betraying(lower=better in this instance): if Abel cooperates, betraying gets Cain 0 years instead of 1; if Abel betrays, betraying gets Cain 2 years instead of 3. As betrayal is always the better option, it is what's referred to as a dominant strategy, taken from a purely rational perspective. Of course, usually the situation would be a more complex moral and ethical problem(like the trolley problem), and the context of who is who does come up, but that is beside the point of this discussion.
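The dominance argument can be checked mechanically. A tiny Python sketch of the payoff tree above(the dictionary encoding is mine):

```python
# Years served for each (cain_move, abel_move) pair; fewer years is better.
# "C" = cooperate, "B" = betray.
PAYOFF = {
    ("C", "C"): (1, 1),
    ("C", "B"): (3, 0),
    ("B", "C"): (0, 3),
    ("B", "B"): (2, 2),
}

# Whichever move Abel makes, Cain serves strictly less time by betraying,
# which is what makes betrayal a dominant strategy:
for abel in ("C", "B"):
    assert PAYOFF[("B", abel)][0] < PAYOFF[("C", abel)][0]
```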

Introduction to the iterated Prisoner's Dilemma

In the iterated prisoner's dilemma, we play this game multiple times, with both Cain and Abel able to remember all the outcomes from before. The final outcome is the sum of the scores across all iterations. If Cain and Abel always cooperate, this is how the scores will look. I will play 3 iterations(i = 0 to 2); you are free to extend it yourself:

i | Cain | Abel
--+------+-----
0 | 1    | 1
1 | 2    | 2
2 | 3    | 3

As we can see, they both end up with 3. Now let's compare this with always betraying.

i | Cain | Abel
--+------+-----
0 | 2    | 2
1 | 4    | 4
2 | 6    | 6

As we can clearly see(remembering that fewer years is better), in an iterated prisoner's dilemma it pays to be nice. However, note what happens when one always cooperates and one always betrays.

i | Cain | Abel
--+------+-----
0 | 0    | 3
1 | 0    | 6
2 | 0    | 9

It appears that blind optimism that your opponent will always cooperate is faulty. Thus we can conclude a few things about behaving in an iterated dilemma: it pays to be nice, but not blindly so.

There are several more complex strategies a person can apply: Tit-For-Tat(originally called "Tip-For-Tap"), Tit-For-Tat with forgiveness, Tit-For-Two-Tat and Grim Trigger. All 4 of these fall under the same umbrella of "trigger strategies", where if a certain trigger is observed, they change behaviour. The simplest to demonstrate is Grim Trigger: Cain will play it, Abel will betray on iteration 2, and Cain will then betray for the rest of the game even if Abel cooperates afterwards.

i | Cain | Abel
--+------+-----
0 | 1    | 1
1 | 2    | 2
2 | 5    | 2    <= Abel betrays; Cain, not yet knowing, still cooperates
3 | 7    | 4    <= Triggered, Cain now betrays; Abel betrays too
4 | 7    | 7    <= Abel cooperates, Note that Cain still betrays
5 | 9    | 9    <= Abel returns to betraying, Cain continues betraying

You will observe 2 things about Grim Trigger: once the trigger is pulled, cooperation never returns, and once it is pulled, the original betrayer has no reason to go back to cooperating either, because, if we recall, cooperating against a betrayer adds 3 to their score while betraying adds only 2. I will now demonstrate Tit-For-Tat. The strategy is simple: initially cooperate, then repeat your partner's last move. We will repeat the above game with both Cain and Abel playing Tit-For-Tat, except on the 3rd iteration(i=2) I will make Abel betray, to demonstrate its behaviour.

i | Cain | Abel
--+------+-----
0 | 1    | 1
1 | 2    | 2
2 | 5    | 2    <= Abel Betrays
3 | 5    | 5    <= Cain betrays, Abel Cooperates
4 | 8    | 5    <= Cain Cooperates, Abel Betrays

We will observe that a "sort-of" loop or pattern emerges when two Tit-For-Tat players face each other: the betrayal echoes back and forth. Whether it was an honest mistake or a genuine attack, it does neither side any good(each gains 3 score over every 2 iterations, compared with always cooperating, which would gain only 2 score over 2 iterations).
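The game above is easy to reproduce in code. Here is a small Python simulation of two Tit-For-Tat players, with Abel forced to betray on iteration 2 as in the table(the function names and structure are my own sketch):

```python
# Payoff in years served for (cain_move, abel_move); fewer is better.
PAYOFF = {("C", "C"): (1, 1), ("C", "B"): (3, 0),
          ("B", "C"): (0, 3), ("B", "B"): (2, 2)}

def tit_for_tat(opponent_history):
    # Cooperate first, then copy the opponent's previous move.
    return opponent_history[-1] if opponent_history else "C"

def play(rounds, abel_forced=None):
    """Both play Tit-For-Tat; abel_forced maps round -> a forced move for Abel."""
    abel_forced = abel_forced or {}
    cain_hist, abel_hist = [], []
    cain_total = abel_total = 0
    running = []
    for i in range(rounds):
        cain = tit_for_tat(abel_hist)
        abel = abel_forced.get(i, tit_for_tat(cain_hist))
        c, a = PAYOFF[(cain, abel)]
        cain_total += c
        abel_total += a
        cain_hist.append(cain)
        abel_hist.append(abel)
        running.append((cain_total, abel_total))
    return running

print(play(5, abel_forced={2: "B"}))
# [(1, 1), (2, 2), (5, 2), (5, 5), (8, 5)], matching the table above
```

With no forced betrayal, `play(3)` reproduces the always-cooperate table from earlier.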

This leads us to the 3rd principle in the iterated dilemma: be forgiving, so that a single betrayal doesn't echo forever.

The way we introduce this forgiveness can vary. Some model it as a random percent chance that a Tit-For-Tat player who would retaliate instead forgives and cooperates. Others model it as Tit-For-Two-Tat: 2 consecutive betrayals must be made before they begin betraying. These break the cycle of betrayal, replacing it with more cooperation, but they fare poorly against a player who always betrays.

There is a 4th principle: don't be the first to betray.

This can be demonstrated with Suspicious Tit-For-Tat, which starts out betraying. You can work through it yourself, but you will see that the single betrayal on the first move is enough to trigger bad responses: 2 of these against each other is bad, and against a plain Tit-For-Tat player it will loop.
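If you'd rather not work it through by hand, here is a sketch of Suspicious Tit-For-Tat against plain Tit-For-Tat(again, the names are mine): the opening betrayal echoes back and forth forever.

```python
def reactive(opponent_history, opening):
    # Open with `opening`, then copy the opponent's previous move.
    return opponent_history[-1] if opponent_history else opening

cain_hist, abel_hist = [], []
for _ in range(6):
    cain = reactive(abel_hist, opening="B")  # Suspicious Tit-For-Tat
    abel = reactive(cain_hist, opening="C")  # plain Tit-For-Tat
    cain_hist.append(cain)
    abel_hist.append(abel)

print(list(zip(cain_hist, abel_hist)))
# [('B', 'C'), ('C', 'B'), ('B', 'C'), ('C', 'B'), ('B', 'C'), ('C', 'B')]
```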

Damnant quod non intellegunt

From the above one would think that Tit-For-Tat is the optimal strategy. However, if we introduce a population of units each following their own strategy and facing off against each other, we will see it isn't always optimal. I won't cover this here because it's more in-depth and detailed; look at the references section if you want more information. In such a tournament, for the 20th anniversary, a Southampton team proved that although Tit-For-Tat works and is robust, it is not always the optimal strategy. How? They... colluded. The rules of the dilemma disallow communication, but the different strategies the team entered used an opening ten-turn song and dance as communication(one can think of this much like binary, where betrayals are 1 and cooperations are 0; you could probably embed other kinds of communication this way, now that I think about it). Regardless, despite it being an illegal action, by beating Tit-For-Tat they technically proved it's not always the optimal strategy. Thus I come to a 5th key principle: no single strategy is optimal in every population.

What I have listed above will probably strike a reader as relatively obvious, stemming from first principles, but I think it's worth remembering the benefits of Tit-For-Tat. I would argue that for robustness it tends to help to follow Tit-For-Tat in your actions, and to communicate, and that is exactly what we don't see in the modern world, unfortunately: the breakdown of communication, ironically enough, by communication devices. Perhaps that is a discussion for another time, about how social control from social media leads people to follow poor strategies. I would argue the strategies followed by the masses these days are based on envy triggers, with their measurements of how liked or disliked a person or their comments are.

Because this topic is non-trivial, there are many more strategies, such as Win-Stay, Lose-Shift, and Gradual Tit-For-Tat; I have thus provided references. I hope this brief look at Game Theory and the Prisoner's Dilemma has been of use to some people.

P.S.

It's also worth noting that the final iteration of a given game is just the single-iteration dilemma, which is worth considering(with a known end point, betraying on the last round is again dominant). There is also a continuous iterated prisoner's dilemma, where the choice is not a discrete yes or no but more complicated, as a person can contribute as much as they want.

References:

=> The Evolution of trust
=> The Prisoner's Dilemma on Stanford Encyclopedia
=> The Prisoner's Dilemma on Wikipedia
=> Trigger Strategies on Wikipedia
=> Link to post, and comments here

Browser choices

There are a lot of browsers available for browsing the modern web. You could probably take something deprecated and outdated, like Netscape Navigator, and still browse selected sites with simple HTML and probably no HTTPS. Now, in my time on the internet I've used a few browsers; let's roll and see 'em:

- Firefox

- Google Chrome

- Chromium

- Vivaldi

- Opera

- Microsoft Edge

- Internet Explorer

- Brave Browser

- Surf

- Lynx

- w3m

- Qutebrowser

That's quite a number of browsers. A bunch are unusable for the modern web as far as I'm concerned, due to significant flaws. Lynx and w3m are examples: they aren't graphical, they're terminal browsers(which would be fine if the same sites could be viewed in a screen reader... unfortunately the vast majority can't).

Now let's go through them one by one. Firefox. It does a lot well, but has become effectively controlled opposition in the pockets of the Google monopoly. Sure, it's "technically" not a monopoly, but the vast majority of browsers still use the Chromium engine for rendering. Firefox has its own rendering engine, Gecko, and a bunch of forks such as GNU IceCat. Unfortunately all the forks suffer from small development teams, development teams that don't support privacy or user choice, or being plain outdated(GNU IceCat, for example, is miles behind the current Firefox version).

Oh, also, Mozilla these days tends to squander money on social causes rather than on development. "Software developers" my ass.

=> They support censorship by going further than deplatforming. At this point it's not about maintaining autonomy of your platform. Although I agree with displaying who is paying for advertising.

Right, Firefox is pretty crap. Good, we've come to that conclusion. Next up: Google Chrome. I won't waste your time: Google is collecting your data to advertise more effectively. NEXT!

Chromium. Now this is interesting... well, not Chromium specifically, as it has the same problems as Google Chrome, but its fork "Ungoogled Chromium" is. Unfortunately, forks based on Ungoogled Chromium tend to have the same issues as Firefox forks. I won't elaborate too much, but it still implicitly supports the Google browser-engine monopoly.

As a quick tangent, let's ask "Why is there a browser engine monopoly?". It's quite simple really: the specification for HTML, HTTP/HTTPS and related technologies is insanely big. Sadly, a great amount of it is fueled exclusively by commercial interests. The faster the advertising internet is killed, the better our lives will be.

A couple of notable mentions. Opera used to be alright; it got acquired by a Chinese company and went to shit. Vivaldi is actually really good: it's feature-rich, and while it doesn't do quite enough for privacy, that can be forgiven for pushing the frontier. Unfortunately, it consumes a pretty silly amount of RAM.

Surf uses WebKit(via webkit2gtk, not QtWebEngine), and is somehow slower than Qutebrowser, which we'll come to later. It also has no tabs or adblocking out of the box. Pretty shit, but at least the codebase is small, so changes can be easily made.

Internet Explorer and Microsoft Edge... I won't even get into these. Needless to say, the former is deprecated and nobody targets it anymore, and the latter is just bootleg Google Chrome that also phones home to Microsoft.

This leaves me with Brave Browser to discuss. For the average person, I will generally say Brave Browser is the best option out of the box. It has financial incentives in its BAT token if you want to subject yourself to advertising trash, though that's just another way to gather personal information if you use it at all. Oh, and not all of your "Content Creators" will receive a cent of the BAT you "Tip" them; it'll probably all go right back to the developers. The financial incentives and privacy story seem a bit shady to me, but as the only browser with SANE defaults that requires no configuration out of the box, it's what I'd recommend to the "Normies".

This leaves the last browser to talk about: Qutebrowser. As the name suggests, it uses QtWebEngine, but it also has configuration files you can write, and is bindable, so custom behaviours can be made. These configurations mean you can have vertical tabs, custom per-site CSS, global CSS, and colour schemes, and you can also pipe the current URL, or an <a> element the user picks, into some script of yours. As a result, I can pipe YouTube videos into mpv, or into youtube-dl, or my own custom scripts, which I use to download entire channels, playlists or music without any issue. This extensibility makes it quite expressive for someone who supports at least the extensibility portion of the Unix philosophy.
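For the curious, here's a sketch of the kind of config.py I mean. The key sequences and exact external commands are illustrative, so adjust to taste:

```python
# ~/.config/qutebrowser/config.py -- illustrative bindings, adjust to taste
config.load_autoconfig()

# Hand the current page to mpv
config.bind(',m', 'spawn mpv {url}')

# Hint a link and pipe it into youtube-dl
config.bind(',d', 'hint links spawn youtube-dl {hint-url}')

# Vertical tabs
c.tabs.position = 'left'
```

The `spawn` command is what hands the page, or a hinted link, off to any external program, and that is where the real extensibility lies.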

Now someone will ask "What about privacy?". It doesn't phone home except when it crashes(in which case you'll be asked whether to send a bug report). There are other elements to privacy, such as trackers, and even without JavaScript tracking is possible(recently favicons have been used as a way to track people). However, the vast majority of trackers are in the same box as adverts, which is where the next point comes in.

"What about adblocking?". Qutebrowser used to be pretty lacking in this area, but recently it has improved leaps and bounds, because Brave released their adblocker for use in other browsers. It also blocks based on its own custom "/etc/hosts"-style lists, and with the new Brave adblocking it's actually usable in more dynamic situations. Combined, they become very useful and very powerful.

It also supports a great amount of the HTTPS/HTTP/HTML spec. WASM, for example, is included(though as a technology I still find WASM questionable, as it's very much like Flash). It's also very fast(not the fastest, but certainly not as slow or RAM-heavy as alternatives like Vivaldi).

Qutebrowser is very similar to Vim, with vim-like bindings and behaviours: 3 main modes, being normal mode for your commands, insert mode for entering text, and passthrough mode for when you need to pass all input through to the page(so no escaping out of passthrough except via a binding you set).

That all said, not everything about Qutebrowser is good. One point of contention is that it uses Python for quite a bit of the browser. How bad that is is debatable, as Python is interpreted and hilariously slow as a language(way worse than other interpreted languages like JavaScript).

Anyway, if you've read this far: I suggest Brave Browser as your browser of choice UNLESS you aren't afraid of configuration. If you aren't, then use Qutebrowser, as you get all the benefits of Brave with the benefits of extensibility and configuration.

=> Link to post, and comments here

Who games on Linux? #1: DOOM 2

Game info

- Name: DOOM 2

- Genre: Boomer Shooter

- Games most like it in my experience: DOOM, Blood

- Demo: None to my knowledge. Just pirate it(Yikes!), or buy it on Steam.

- Linux: Use a source port

Thoughts

It's a fun game, but some levels are a slog. My worst nightmare in this game is a line of chaingunners ready to ruin my day. The level I found particularly challenging was "The Catacombs", as it does exactly that: chaingunners and Revenants all around you. Another enemy that sucks is the Arch-Vile, mainly because it covers your face with fire. Lost Souls get more annoying with Pain Elementals spawning them, but it all adds up to the overall experience and the priorities you set.

On the note of modding, I did mod the game with Brutal Doom, which made it a bit more satisfying. I also played with hardware rendering at a low resolution, upscaled without texture filtering, for that crunchy pixelated feel. It's quite fun. Also, I played on Ultra-Violence difficulty(Nightmare, from what I hear, is a joke difficulty targeting the most extreme players).

There's not very much to say about it that hasn't been covered elsewhere before. Some levels are very clearly experimental. "Barrels o' Fun" is one experimental pain in the ass. Another is "The Chasm". Now, The Chasm isn't all bad, but with DOOM's fast movement, thin platforms to move on, and lots of Cacodemons, it's a bit of a pain in the ass.

Overall the game follows a similar downward trend to how I felt about Doom 1: the first episode was great fun, the second alright, the third okay, and the final one a bit of a slog. In this case, the third and fourth are swapped around. One level from the final stretch that really impressed me was "The Living End".

Some levels are a bit of a key-and-switch hunt. One example is "The Spirit World", where you have to shoot a specific uninteresting wall for progression to be possible.

As for difficulty, apart from the levels that use a lot of chaingunners, I didn't find it too hard. The first 2 levels of Doom's episode 4 were by far harder than anything in Doom 2.

My final verdict is that it's fun, but becomes a slog towards the end with less interesting levels. I imagine community-made maps are better; DOOM 1 had better levels overall, in my opinion. As with Doom 1, it'll entertain you for a good 5 or so hours, unless you try to 100% the game, or mod it to play more community-made maps.

Answering the question of "Who Games on Linux?" here, "I do".

Final information

- Played Version: N/A. Used GZDoom and Brutal Doom

- Linux Compatibility: Use a Doom source port with Linux support(GZDoom)

- Hours played: 5 hours

- Will I return to it: The ID Software levels? Unlikely. For mods, probably.

=> Link to post, and comments here

Impressionism and Games

This is a short blog post, that I hope may illuminate to some people why I prefer some games over others.

The objectivity and academic study involved in game design exist and are respectable, definitely, but I can't help feeling that something is lost in these objective outlooks, and that they are designed purely to get the most bang for your buck. Not every game needs to get the most bang for your buck- and in this article I hope to elaborate on impressionism and how it relates to games and their game feel.

One of my favourite games recently would have to be Teleglitch. I learnt about it some time ago watching Israel Blargh. It's a truly fascinating procedurally generated game, where the layouts are random but built from pre-made modules pieced together. I tried to explore what makes it good through my clone of it, Quiver. Naturally, what makes it good is a hard-to-describe emotion: the constant feeling of being cornered, low on ammunition, and with a very low likelihood of survival. A classic "you are all alone and fucked" scenario.

Now, impressionism is a genre of music that isn't focused on specific chord progressions or melodies, but rather on the feeling of the music. Clair de Lune is one such example, and it nails a hollow sort of lonely feeling. Impressionism doesn't just apply to music, though; there have been impressionistic classical art pieces where colours and shapes are used to convey a specific emotion.

Now, when I play games, or consume any art for that matter, one of the most important elements is that it can strum the strings of emotion. One of the most common emotions in games is feeling badass, and to be fair, this is common because so many games give out a power fantasy. It's usually easily achieved with mechanics that feel good, with extra polish to make the reward pay off. Of course, this has its pitfalls, as the wrong fantasies can be indulged- fantasies indulged out of financial greed, often promoted in modern games. I don't blame the developers for this; psychologically manipulating a person into feeling specific emotions is an appreciable artform in itself. I also don't blame the players: psychological manipulation is incredibly strong, especially against willing participants. I say psychological manipulation because it's used for both good and bad. The bad is covered in far too much detail by the far too many failings of modern AAA game developers(to some extent I think it's a coping mechanism created by a huge team: huge risks multiply across the development team, and bread comes first). The good, of course, is that a game can capture a specific feeling or emotion(and in some cases several of them, multiplying the effect of the game).

Two common examples of games that do this well are Cave Story and Undertale. Their emotional content is plucked from well-woven mechanics and narrative storytelling.

I'll give one more example: Space Beast Terror Fright, a game that almost perfectly captures the feeling of being doomed with all odds against you, like in the movie "Aliens", indulging both the power fantasy and fear. You don't know quite what you're going up against, but when you beat it, you feel incredible.

I'll tie this back around. A lot of the greatest games are simple, with a specific emotion they seek to capture, and an audience perceptive to that emotion will enjoy them thoroughly if they do it well. RPG players enjoy strings of emotions from mechanics woven with narratives. Boomer-shooter players enjoy feeling like a badass, with a light sprinkling of flavour emotions. Roguelite players like the random roll between badass and crushing defeat(this varies with how much variation exists in the power curve: Teleglitch has very little; The Binding of Isaac and Enter the Gungeon have a lot). Portal is also much like the RPGs, in that it mixes narrative and mechanical storytelling.

So I'll leave this with what I think should be done in most games. Polish is often called for; I would take that a step further and say polish should be prioritised. General polish doesn't matter as much(it will be experienced, but it won't be as distinct), but polish in the areas of greatest emotion will go a long way. You need to make the rewards and punishments truly distinct and aligned properly with the emotions of your game. For this reason I would also say you should focus on achieving one or two emotions really well; stringing more together is non-trivial and requires a lot of storytelling(as is the case in one-hit wonders like Undertale and Cave Story). There is also a huge variety of emotions, and many are mostly untapped. Confusion and comedy are 2 examples that aren't tapped very much: the former best demonstrated in dream-like games such as Yume Nikki, the latter in games like The Stanley Parable.

Hope this short article about impressionism in games was of help to some people. As usual, these are my opinions, yours are wrong, but mail me if you have some anyway.

=> Link to post, and comments here

Who games on Linux? #0: HROT

Game info

- Name: HROT

- Genre: Boomer Shooter

- Games most like it in my experience: Quake, Dusk

- Demo: It has a demo on Steam, go play it. It uses V0.2.9

- Linux: It works, but needs an additional launch option to get audio:

WINEDLLOVERRIDES="openal32=b" %command%

Thoughts

This weekend, I purchased and completed all of Episode 1 of HROT, a fascinating game that nails the boomer shooter genre pretty well, despite some issues. I will say firstly that it runs well under Linux, though with the software renderer you do run into framerate issues every now and then. I don't know if there will be a native Linux version in the future; I certainly hope so, as it would be very interesting.

Anyway, the issues I found with the game? Firstly, the sound effects: a great deal of them aren't suitably meaty. Secondly, the "Super Shotgun"(in quotes as I don't know its proper name in the game) is great, but its model clips through things in a way that violates the Pauli exclusion principle. It's also a bit too strong, which leaves the normal shotgun retired by the time you get it, unless you run low on ammunition(which doesn't happen for your bread-and-butter shotgun ammo).

Now that I've listed my gripes with the game, let's get into what's weird and may irritate some people. Firstly, it's very brown. Really brown. This isn't an issue to me, but it may be for people who don't find the constant monotone brownness of everything interesting. Some may think this brownness is a parody of Quake, but I think it's for the purposes of aesthetic and ambience, setting the game as very brutalist, much like the architecture of the same name. Oh, and god-damn! That's a lot of sewer levels, or levels with sewers.

In fact, the 2 games this reminds me most of are STALKER and Quake, though the movement is not at all Quake-like: there's no bunny hopping or rocket jumping or anything hyper-fast of the sort; it's all relatively toned down. The STALKER influence definitely comes from its Russian influences. I say Russian, though from what I'm aware the game is set in 1980s socialist Czechoslovakia. The tones throughout hint at some external event that changed things within the Slavic sphere of influence(likely Chernobyl in this game's case), but it certainly makes a good parody of the Czech politics of the 1980s.

Moving on: the secrets are good. I didn't 100% each level, but there'll be plenty here for people to find. Some enemies, in my opinion, saw too little use or didn't seem to do much in the way of attacks. The Pedros, little walking pig-like spitters, weren't seen very often but seemed pretty cool(I think there are more rats than Pedros in the entire game). There's also the gas-mask horse, which makes only 3 appearances: 2 in the castle and one in the final level, and one of those in the castle will literally drown itself, so really there are just 2. Regardless, this is just a minor gripe about some enemies being underused and others overused. 3 are pretty simple: you've got shotgunners, SMG/pistol users and a fatass police officer. The latter is almost definitely a copy of the Ogre from Quake, but the rest are pretty good.

The maps are also very well made. I only had to go door-hunting a few times after acquiring a key to find where it was used; that's the only issue there. It's certainly not like some maps in Blood or DOOM 2, where it devolves into a switch/key hunt.

There's also a motorbike. I'd like to see a motorbike sequence, as in its current state it's not really usable. Maybe we'll get one in E2 or E3, who knows.

As a software developer myself, I will also note something of interest about this game. It uses its own engine written in Pascal, with both software rendering and OpenGL rendering available as options (though the OpenGL option looks ugly at the moment due to texture filtering). This is a fun curiosity, as you don't see many games using Pascal or any non-C-based engine these days.

The selection of weapons is alright, nothing spectacular or lame. It's the arsenal you'd expect from any boomer shooter, to be fair. You have a sickle (not as a Dusk throwback, but because of the hammer and sickle). You have a pistol which you can upgrade to be akimbo. You have an SMG, pump-action shotgun, double-barrelled shotgun, rocket launcher, grenades, land mines, lightning gun and some other weird BFG-like gun. The latter I didn't get until the last boss and didn't use. The lightning gun is like Quake's lightning gun and is suitably badass. Land mines are pretty mediocre; I didn't use them much, which mirrors my use of similar mines in Half-Life. Also, grenades are bound to right click. Weird change, and I did accidentally trigger it a few times, but it's alright.

My final verdict on this is that it's a fun game. Certainly a change of pace from your classic, highly polished boomer shooters like Ion Fury or Amid Evil. It's quite inspiring actually, making me want to return to my game Quiver and rework the entire thing to be a Quake-style variant of Teleglitch. It will certainly keep you entertained in its current state for a few hours. Give it a year or less, and it'll probably entertain you for a good 10 or so hours. I suppose Teleglitch also has a similarly brown aesthetic.

I'll leave this short issue of "Who Games on Linux?" here, answering that question with "I do".

Final information

- Played Version: V0.3.0

- Linux Compatibility: Proton-5.21-GE-1

- Hours played: 2.6 hours

- Will I return to it: Very likely if I don't forget

=> Link to post, and comments here

2007, the year the internet went to shit

2007 was 14 years ago now. In those 14 years, things have gotten so much worse. One could go so far as to say that humans are a treacherous race, betraying themselves every day...

There's a relatively common joke floating around that 2007 is the year the Internet went to shit. I don't personally know what year it went to shit, but I will list the events of 2007 that mark why this could be seen to be the case:

- 1st Generation of iPhone

- FoxNews does its first report on 4chan

- Tumblr is founded

- Facebook allows non-college emails to register

- Twitter explodes in popularity after SXSW Festival marketing campaign

- Big Bang Theory debuts

I'll cover each in some detail. Firstly, the 1st generation iPhone. Relatively speaking, from a hardware standpoint it was revolutionary, as it pushed smartphones to the frontier of media consumption and made them incredibly simple to use. As a result, it opened the floodgates for idiots to access the internet. There's also the fact that Apple does this all off the back of child labour, but don't worry too much about that. Any competitive technology company violates human rights in innumerable ways, while advocating for them in western marketing scams like the eternal corporate hypocrites they are.

Secondly, Fox News doing its report on 4chan. I am relatively surprised they even did reports on what is effectively a generic internet forum. Regardless, this just opened the floodgates for edgy kids to pollute the threads and posts. I would generally say this wasn't as bad as 2016, and the few years leading up to it, when a lot of message-sharing sites became vitriolic with political nonsense. Left vs Right and "Us vs Them"-isms are labels deserved only by people who have no sense of nuance.

Thirdly, Tumblr is founded. I think this explains itself, but it set itself up as an area for degeneracy. This can quite simply be seen from the Tumblr porn ban. I am not pro-censorship, I am certainly pro-autonomy; as such I believe it's Tumblr's right to determine what content they allow and disallow on their site. The fallout of that decision shows just how degenerate the Tumblr userbase is. Naturally, a great deal of these degenerates migrated to Twitter, which is arguably worse due to the use of curt messages and the spamming of "^" messages in the form of retweets and likes. Remember that "^" was considered bad practice and disrespectful on old forums, as it would both clog up the thread and contribute nothing.

Fourthly, Facebook allowing non-college emails to register. Read into Cambridge Analytica yourself. I won't elaborate here.

Fifthly, Twitter exploding in popularity. This one might need some description. Firstly, 140 characters is far too few to communicate adequately. Human communication is by nature incredibly messy and hard to get right, and no matter what string of words you put together, somebody else will pattern-match that string to mean something you didn't intend. There's also the non-linear nature of threads, which makes it very hard to follow a discussion through to its conclusion. And the fact that it rewards "slam dunking" people with witty, curt responses makes that a standard behaviour on Twitter.

Finally, The Big Bang Theory debuts. I put this one here as I would generally say this is what normalised and popularised "geek" culture. Geek culture is quite distinct from nerd culture, in that it tends to focus on shallow elements, commonly revolving around some brand a person consumes. Nerd culture is quite simply a love of learning taken to a relatively autistic extreme. One can see that the first is a hedonistic answer. The other is a poetic worldview (poetry originates from the Greek word for "to create/make/produce"), so one creates. I will also mention that there is a 3rd answer, the ascetic one, which is by nature a reaction to hedonism. Oftentimes though it's a circlejerk; one ascetic culture is that of veganism.

Anyway, I leave this short article off as just me looking back at 2007 as possibly the year the internet went wrong. In general though, it has gone wrong mainly due to commercial interests. A decade ago you could find hobbyists using Google Search, but it has gotten harder with time, as the commercial and advertised answer is more important. The bottom line is more important.

Think about it for a moment: the medium of information exchange that so many people rely upon is built on obtuse AI algorithms that, in the case of neural-network algorithms, even the designers don't fully understand, and that are trained to push commercial interests and commercial information over useful information. (I would also say this goes along with the concept of Intellectual Property being a scam. A lot of great works today exist only because people disrespect the idea of "Intellectual Property", because knowledge is not a zero-sum game. Sharing knowledge is in human interests.) There are a number of other events that could be pointed to: the use of Wikipedia as a de-facto expert over actual experts, for example; the use of advertisement in search engines; the numerous privacy violations by the NSA, GCHQ and other "National Defense" groups. (This is ignoring that privacy is by nature a pretty vacuous term on its own, and needs elaboration to be useful. The strong form of the "Nothing to Hide" argument shows this.)

"Some people think this is paranoia, but it isn't. Paranoids only think everyone is out to get them. Wizards know it." - Terry Pratchett

=> Link to post, and comments here

Unix philosophy and writing scripts

What is the Unix Philosophy?

The Unix philosophy is a design philosophy for programming and scripting. Its core rules are commonly summarised as follows (one can do more reading on this if you want):

- Write programs that do one thing and do it well.
- Write programs that work together.
- Write programs to handle text streams, because that is a universal interface.

The reason the Unix philosophy is even relevant today is that it shapes the standard scripting experience on any GNU/Linux system (Linux being a kernel, GNU standing for GNU's Not Unix, and both being inspired heavily by Unix).

Some of these are obvious, some not so obvious, and some are rather specific. It's worth noting that these principles can be applied to systems that aren't Unix-like, though it may be harder. Anyway, the focus of this blog will be on simplicity, clarity and modularity: I will discuss how to write a decent shell script.

Writing shell scripts

Your shell scripts will begin with the line

#!/bin/sh

This shebang tells the kernel which interpreter to run the script with. sh is usually a symbolic link to another shell, often dash or bash depending on the system. It's good practice to target sh rather than bash, unless you use bash-specific features, in which case you would use the shebang #!/bin/bash. Other shebangs are available for other scripting languages, but today I will focus on shell scripting.
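To make this concrete, here is a minimal sketch of a script under the portable sh shebang; the comment notes one bash-only feature that would force the bash shebang instead:

```shell
#!/bin/sh
# Portable POSIX sh: this runs the same under dash, bash, or any sh.
name="world"
greeting="Hello, $name"
echo "$greeting"
# A bash-only feature such as arrays, e.g. names=("a" "b"),
# would fail under plain sh and require #!/bin/bash instead.
```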

To help demonstrate this, I will be using my blog writing script to show how I write a shell script.

# Variables
blogdir="content/blog/";
rssfile="../html/content/rss.xml"
year="2021"

Lines beginning with a hash (#) are comments. Lines with a name followed by an equals sign (with NO SPACE around the =) assign a variable. You will observe I used a semicolon. In shell scripting, a semicolon at the end of a line is optional; it has a useful meaning when separating multiple commands on the same line, which you will see later.
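A quick sketch of what does and doesn't work when assigning variables:

```shell
#!/bin/sh
# Correct: no spaces around the equals sign.
year="2021"
blogdir="content/blog/"

# Wrong: `year = "2021"` would make the shell look for a command
# named "year" with the arguments = and "2021".

# Variables expand inside double quotes, so paths compose easily:
path="$blogdir$year"
echo "$path"
```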

echo "Enter a blog title"; read -r blogtitle;
blogtitle=`echo "$blogtitle" | sed "s/ /_/g"`
blogtitle="$blogtitle.gmi"
st -e nvim "$blogdir$year/$blogtitle";

Here I call the program echo. echo can take a number of flags, but I don't use them; you can look these flags up in a terminal by typing man echo. On this line, I echo a prompt telling the user to enter a title, then I call the program read with the -r flag and the name of the variable that will hold the input. This lets me get user input straight from the terminal. I could use dmenu or some other program if I wanted to, but I chose not to.

The second line introduces two new concepts: the pipe and the backticks (`). I will cover pipes and variables first.

echo "$blogtitle" | sed "s/ /_/g"

First, I call echo. The $ marks a variable reference; you type a name after the $ to mean a specific variable, in this case blogtitle, which was previously set by our read. The standard output of this program is piped into sed. Piping is a way of passing the output of one program as the input to another. The program sed is a stream editor: it lets me modify streams of data and output them. The string I pass to it describes how I want to modify the stream, using a syntax close to regex.

"s/ /_/g": the s means substitute, and the / characters delimit the parts of the expression. Between the first and second / is the pattern, a space, so I will be substituting spaces. Between the second and final / is the replacement, _. The g after the final / says to apply the substitution globally across the stream, not just to the first match. As a result, this replaces all space characters with underscores (useful for writing files without pesky spaces!). sed is a pretty useful and complex program; look at its man page by typing man sed to find out what it can do.
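A few quick substitutions make the role of the g flag concrete:

```shell
#!/bin/sh
# Without the g flag, sed replaces only the first match per line.
first=$(echo "a b c" | sed "s/ /_/")
# With g, every match on the line is replaced.
all=$(echo "a b c" | sed "s/ /_/g")
echo "$first"
echo "$all"
```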

Just from this small demonstration you can begin to see how the rules of composition, modularity and separation are all being used here. Each of these programs handles its own logic as a black box, and we care only about the output, composing the outputs together into a program.

Now let's look at the backticks. They cause the contents between them to be evaluated as a command, with the command's output assigned to the variable. As such, blogtitle has been reassigned to itself, but with spaces substituted for underscores.

We can then give it a file extension by just using the variable in a string, as "$blogtitle.gmi". I then call st, which is my terminal emulator, to open a new terminal with neovim so I can write the article. A note about quotation marks: backticks (`) evaluate command-line expressions; double quotes (") let you mix variables and text and assign the result accordingly; and single quotes (') do not expand variables and are pure text. These 3 quotation styles make it relatively easy to use variables appropriately.
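A small sketch contrasting the three quoting styles side by side:

```shell
#!/bin/sh
blogtitle="my post"
double="Title: $blogtitle"      # double quotes: variables expand
single='Title: $blogtitle'      # single quotes: pure literal text
evaluated=`echo "$blogtitle" | sed "s/ /_/g"`   # backticks: command output
echo "$double"
echo "$single"
echo "$evaluated"
```

As an aside, the $( ) form does the same job as backticks and nests more easily, which is why many newer scripts prefer it.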

I won't go through in detail the rest, but you can get the idea of how it is useful. I will write the rest of the program below and point out other noteworthy lines.


#!/bin/sh

# Variables
blogdir="content/blog/";
rssfile="../html/content/rss.xml"
year="2021"

# Read the name of the blog and make a file and write it
echo "Enter a blog title"; read -r blogtitle;
blogtitle=`echo "$blogtitle" | sed "s/ /_/g"`
blogtitle="$blogtitle.gmi"
st -e nvim "$blogdir$year/$blogtitle";

# Recreate the blog directory
yourTitle=`head "$blogdir$year/$blogtitle" -n 1 | sed "s/# //g"`
head "$blogdir/blog.gmi" -n 6 > "head.t"
echo "=>$year/$blogtitle $yourTitle" > "link.t"
tail "$blogdir/blog.gmi" -n +7 > "tail.t"
cat "head.t" "link.t" "tail.t" > $blogdir/blog.gmi

# Now we set it up for RSS

cat "$blogdir$year/$blogtitle" | sed -z "s/\n/<br \/>/g" > "rss.txt"
echo "]]></description></item>" >> "rss.txt"
echo "<item><title>$yourTitle</title><description><![CDATA[" > "rssA.txt"
cat "rss.txt" >> "rssA.txt"
rm "rss.txt";
mv "rssA.txt" "rss.txt"

head -n 10 "$rssfile" > "head.t"
tail -n +10 "$rssfile" > "tail.t"
cat "head.t" "rss.txt" "tail.t" > "$rssfile"

# Clean up temp files
rm "head.t" "link.t" "tail.t" "rss.txt"

The comments explain it well. The > operator writes to a file, overwriting everything in it. The >> operator appends to a file, writing at the end. All of the above is me writing to temporary files, composing a structure to use, and then putting the pieces back together.
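The difference between the two operators in miniature:

```shell
#!/bin/sh
echo "first" > demo.t     # > truncates: demo.t now contains only "first"
echo "second" >> demo.t   # >> appends: "second" lands on the next line
lines=$(cat demo.t)
rm demo.t                 # clean up the temporary file
echo "$lines"
```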

You will observe a number of programs not mentioned before. First is cat. cat takes a number of flags, but essentially it outputs the contents of a file. (Some programs have flags for taking file input directly, though.)

rm removes files or directories, useful for deleting these temporary files. mv moves a file or directory to a new location, or simply renames it. head takes the first n lines of a file; tail takes the last n lines (or, with -n +k, everything from line k onward, as used above).
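Composed together, head and tail can slice out any region of a file, which is exactly the trick the script above uses to splice new entries into blog.gmi and the RSS feed:

```shell
#!/bin/sh
printf 'one\ntwo\nthree\nfour\n' > slice.t
top=$(head -n 2 slice.t)                  # first 2 lines
rest=$(tail -n +3 slice.t)                # everything from line 3 onward
middle=$(head -n 3 slice.t | tail -n 1)   # exactly line 3
rm slice.t
echo "$middle"
```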

All the above programs can be understood better by READING THE MANUAL. There is a common saying, RTFM, which stands for Read The Fucking Manual. Quite simply, a lot of the flags you need for a program are listed there.

This has been a brief look at shell scripting. If you understand variable assignment, command execution and the programs you use in it, it's very easy to compose complex nuanced scripts and behaviours from simple and dumb programs.

Thanks for reading!

P.S. A lot of the Unix Philosophy is mainly relevant to writing small C programs to be used in shell scripts, as I've demonstrated above. This is just us taking advantage of a system built on the Unix philosophy. A system like Windows can't take advantage of these ideas as much because it wasn't designed with this philosophy in mind. For large or highly coupled programs it also tends to fall apart, so it's not always applicable.

And remember, there's never one true way to write scripts or programs universally. It's up to you to determine the best way from your current knowledge and experience.

=> Link to post, and comments here

Stories page

I have now added a stories page to my website. My intention is to use it as a page to host stories... obviously? Anyway, the content you will find there is content I have the rights to put there, either directly from the author, by nature of it being Creative Commons or Public Domain... or by nature of me holding the rights to it.

If you want to submit your stories, feel free to, but keep in mind that by doing so you grant me permission to use them under CC BY-NC-SA 3.0. That's not for all people, just for me, so that I have the rights to put them up on this site without any commercial application, and so that I can format them appropriately, as they will be kept relatively close to plain text.

So far I have uploaded one of my own stories, as well as one of the short stories I have enjoyed by H.P. Lovecraft.

P.S. I will start signing these articles with the date they were written. This is more so I have a semblance of time passing between each article.

=> Link to post, and comments here

Automating Tmux for fun and profit

What is Tmux?

First, what is a MUX? MUX is shorthand for multiplexer, a device for selecting data, commonly in electronics. tmux is a Terminal MUltiplexer. It selects terminals to act on, arranges them into panes so you can easily access them, and saves them in sessions; a session can be thought of as a workspace. This means it can be used for a lot of things. I use it for hosting servers without forking the process, or for having processes run in the background in a tmux session rather than in a window (typically ncurses-based applications, as they display in your terminal). Another advantage is you can use it with vim or neovim (probably emacs too), so you can create a workspace for your text editing activities.

In this post I will provide examples of some scripts I have written using tmux, as well as some advantages. I won't be going over common keybinds and how to use tmux as a user, so if you're a newb to tmux, take some time to look for examples of using it. If you use tiling window managers but haven't used tmux, you will feel right at home.

=> This blog is a very good jumping off point into using tmux

Running programs in the background

One of the TUI (Terminal User Interface) applications I enjoy thoroughly is cmus (a music player written in C and ncurses). There's a lot to like about cmus: it's easy to use, it can be used over a network, and it can be controlled remotely with cmus-remote (which I use with my keybindings). However, for all these advantages, it's not a daemon, so it doesn't run in the background. A different music player like mpd might be suitable for this, but I chose to stick with cmus and write a script with tmux. Below is the script.

tmux has-session -t cmus 2>/dev/null
if [ $? != 0 ]; then
        tmux new-session -d -s cmus \; \
                send-keys 'cmus' C-m\;
fi
tmux attach-session -t cmus

What this does is first check whether a session named cmus exists, discarding the error output to /dev/null. I then check the result of the last command ($? is a special variable holding the exit status of the last command). If it's not 0 (a 0 would mean the session exists and I should attach to it), I create the session: the -d flag detaches after creating it, and the -s flag gives the session name. You will observe I split this over multiple lines. When a tmux session is made, I can send it keystrokes with send-keys, so I tell it to start cmus, followed by C-m (which presses Enter). This starts a tmux session with cmus running in it, detached. The final line attaches to the cmus session (you can omit this if you want to attach manually; I keep it because I use this with a keybind to bring cmus up when I want to listen to music).
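As a side note, tmux has-session reports through its exit status, so the check can sit directly in the if with no $? juggling. A minimal sketch of that variant, guarded so it does nothing on systems without tmux, and with the attach left commented out since attaching needs a real terminal:

```shell
#!/bin/sh
session="cmus"
if command -v tmux >/dev/null 2>&1; then
    # has-session exits non-zero when the session is missing,
    # so `if ! tmux has-session` replaces the explicit $? check.
    if ! tmux has-session -t "$session" 2>/dev/null; then
        tmux new-session -d -s "$session" \; send-keys 'cmus' C-m
    fi
    # tmux attach-session -t "$session"
fi
echo "session checked: $session"
```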

Read the above example and you will learn the first advantage of tmux: being able to run, detach from and attach to sessions, as well as send arbitrary key input to run commands. I would use this to run scripts in your $PATH.

Managing windows

But what if you want multiple windows? I will ignore this for vim, which has windows by default (you can look that up yourself), but for some applications you want tmux to manage the windows. For a short time I used a task manager called taskwarrior for doing todo lists (I don't anymore, I use a physical journal, but the code I wrote is still applicable here).

tmux has-session -t tasks 2>/dev/null
if [ $? != 0 ]; then
        tmux new-session -d -s tasks \; \
                send-keys 'while true;do clear; task priority limit:3;sleep 5;done' C-m\; \
                split-window -v -p 75 \; \
                split-window -h -p 30 \; \
                send-keys 'while true;do clear; task burndown.daily;sleep 5;done' C-m\; \
                select-pane -t 1 \;
fi
tmux attach-session -t tasks

What I did in the cmus example is repeated here and built upon. I run some commands in a loop (these could be replaced with a script or executable of your choice; maybe a crypto price watcher by curling rate.sx?). I split the window vertically at 75%, then split the window horizontally at 30%. Selecting pane 1 makes it the pane where you handle input, while panes 0 and 2 run your scripts. What this allows is running a lot of different applications in a very specific window layout (almost like a tiling window manager...?).

I have exhausted the supply of tmux examples I wish to demonstrate, but you can see that a script with a case statement for different tmux layouts could be useful for some people. I personally recommend looking at the man page for tmux to see more ways it can be used, because I have barely scratched the surface.
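The case-statement idea might look something like the sketch below. The layout names are hypothetical, and it is shown as a dry run that prints the tmux commands it would execute; replace the echo lines with real tmux calls to use it:

```shell
#!/bin/sh
# Pick a tmux layout by name; defaults to "tasks" if no argument given.
layout="${1:-tasks}"

start_layout() {
    case "$layout" in
        cmus)  echo "tmux new-session -d -s cmus" ;;
        tasks) echo "tmux new-session -d -s tasks ; split-window -v -p 75" ;;
        *)     echo "unknown layout: $layout" ;;
    esac
}

start_layout
```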

Anyway, another use is when you ssh into a remote server. Why? Because if you lose connection, your terminal state isn't normally saved; if you lose connection inside a tmux session, however, you can just reconnect and reattach to the session to pick up where you were.
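This can be rolled into a single command; a sketch, where "myserver" is a hypothetical host name:

```shell
#!/bin/sh
# -t forces a terminal allocation on the remote end, and
# `tmux new-session -A -s main` attaches to the session "main"
# if it exists, creating it otherwise.
host="myserver"
cmd="ssh -t $host tmux new-session -A -s main"
echo "$cmd"
```

Running that after a dropped connection lands you right back in the same session.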

I hope these examples have illuminated why I would suggest using tmux for automating your workspaces. In some cases it's not worth this effort, but for others it is. I will let you dear reader, determine what is and isn't worth the effort to apply tmux in.

=> Link to post, and comments here

Fully replaced with gemini

Hi all reading this. I have fully replaced my HTTP website with its Gemini equivalent. I still use my normal Stylesheets, but what you will read on this site is now almost perfectly mirrored with the Gemini site(except for the plaintext being a little bit obtuse due to using HTML). I have also got a mini-log thing, where I will post small logs or updates. I still need to think through how to set up my RSS feeds. I will probably set one up for the blog and one for the mini log. I may also include a log for the UTC News archive, but I am not sure if I will or not yet. I say that because the shell scripting for that is messy. It works!... but it's messy.

I may do a video or two on Gemini. How to set up your Gemini Pod... Client options... how to mirror it to HTTP as well. I have included a link so you can see how I went about porting it to HTML. It's really just a script looping over all files, and awking the hell out of it. These scripts are licensed under MIT License... so go nuts. Cheers for sharing them in the Gemini Mailing List Martin.

I am also considering doing some tutorial videos on Godot. Mainly videos where the purpose is breadth and speed. We shall see though.

Ultimately, I am very happy with this new system I have created. Pretty much as I said before, I only need to set up an RSS system. I think the way I may do it, is just to write a script which puts it at the top of the RSS feed... Perhaps... We shall see.

=> Here is the source for how I ported it.
=> Link to post, and comments here

Gemini: Antidote to the modern advertiser web

=> You can find the video I made about it here.

In short, I think Gemini is a better alternative to HTTPS for document exchange. For web applications I hold that HTTPS is better, but web applications are usually not about document or information exchange, at least not useful documents or information.

I may do some more videos on Gemini, I don't know.

=> Link to post, and comments here

Happy New Year!

This is just going to be a short blog post detailing what I'm up to. I'm currently in my 3rd and final year of Computer Science at university. My intention going forward is to continue putting a lot of effort towards completing this with a relatively good grade. I'm aiming for a First Class degree, but if I fall short, it's relatively likely I'll just get a 2.1/2.2. Regardless, I have been working on my dissertation, which will hopefully end with a playable game for people to try. I am also hoping to make another short playable game at the start of February, made in a week or so, but you know well enough by now how my plans go, seeing the current status of Quiver. There is a very real possibility that I repurpose a lot of what I have in that to make a "Boomer Shooter" interpretation of the game Teleglitch. We will see though.

I am also considering creating some tutorials focused on the Godot game engine. I suspect I could probably create a fair amount of useful tutorials for Godot, as it's relatively easy to use, and I feel it is better for a lot of game development than Unity or Unreal which are both typically overkill unless you have a very good reason to use them.

Anyway, here's hoping this year is a better year than this previous year. I don't have anything interesting to say about lockdown or Coronavirus.

=> Link to post, and comments here
