
The tyranny of algorithms

The American author and IT professor Ian Bogost proposed an interesting thought experiment last year: the next time you hear the term "algorithm", simply replace it with "God" and see whether the meaning of the sentence changes significantly. Bogost's point: nowadays algorithms have replaced fate and abolished chance. They shape our digital existence. In 2016 we are living in a computer theocracy.

The filter algorithms of social networks and search engines determine what billions of users worldwide see on their screens. Based on our user profiles, they recommend what we buy (Amazon), which route we take (Google Maps), whom we go out with (Tinder) and how we dispel our boredom (Netflix). With every click of the mouse, every newly installed smartphone app and every signed end user license, we pave the way for a future in which software, sensors and computers supposedly make our lives better and more efficient.
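To make the idea of profile-based recommendation a little more concrete, here is a deliberately naive sketch. It is not the mechanism of any of the services named above; the profile, catalog and scoring rule are invented for illustration: items are simply ranked by how many tags they share with a user's recorded interests.

```python
# Naive, illustrative recommender: rank items by how many tags they share
# with a user's recorded interests. Real services use far richer signals.

user_profile = {"interests": {"science", "cooking", "jazz"}}

catalog = [
    {"title": "Fruit Fly Genetics", "tags": {"science", "biology"}},
    {"title": "Late Night Jazz",    "tags": {"jazz", "music"}},
    {"title": "Action Blockbuster", "tags": {"explosions"}},
]

def recommend(profile, items):
    """Return items ordered by overlap between item tags and user interests."""
    def score(item):
        return len(item["tags"] & profile["interests"])
    return sorted(items, key=score, reverse=True)

for item in recommend(user_profile, catalog):
    print(item["title"])
```

Even this toy version shows the essential point: the system never asks what the user wants today; it only extrapolates from what the profile already records.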

Algorithms control stock trading; they compose music, paint pictures, write newspaper articles and decide on loans. Soon they will be driving cars on our streets, and in the meantime some algorithms are already programming the next generation of algorithms. Human action? Hardly necessary anymore. Are algorithms now in charge of our world?

Transformation and visualization of the source code of the website www.google.com © Christian Riekhoff / Science Photo Library

Almost mythical power

So Ian Bogost has a point. Nowadays the algorithm is accorded an almost mythical power. To the ordinary user it is opaque, and it is invoked to explain the wonders of the digital sphere. Yet the concept of an algorithm is actually trivial: it means nothing more than a sequence of predetermined steps. Very much simplified, you can imagine an algorithm as a cooking recipe: take this and that, cut and chop it, heat it up, let it simmer, voilà!
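Translated into code, such a "recipe" really is just a fixed sequence of steps applied to its ingredients. The following toy example is entirely illustrative and stands in for no real system:

```python
# A toy "recipe" algorithm: a fixed sequence of predetermined steps
# applied to whatever ingredients it is given.

def prepare_dish(ingredients):
    chopped  = [item.lower() for item in ingredients]   # cut and chop
    heated   = [f"heated {item}" for item in chopped]   # heat it up
    simmered = ", ".join(heated)                         # let it simmer
    return f"Dish ready: {simmered}. Voila!"

print(prepare_dish(["Tomatoes", "Onions", "Basil"]))
```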

In this analogy, the ingredients are data: the name of a user, for example, their location, their habits, online and offline. Computer algorithms determine which calculations are carried out and when, or they convert images, videos and news articles into data packets that are sent at high speed to their destinations on the Internet.
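As a very rough illustration of that last point, the sketch below cuts a piece of content into fixed-size "packets". The chunk size and the message are arbitrary assumptions; real network protocols such as TCP/IP additionally add headers, sequencing, routing and error handling.

```python
# Rough illustration: split a piece of content into fixed-size "packets".
# Real protocols add headers, sequencing, routing and error correction.

def packetize(data: bytes, packet_size: int = 8):
    """Yield consecutive chunks of at most packet_size bytes."""
    for offset in range(0, len(data), packet_size):
        yield data[offset:offset + packet_size]

message = "A news article on its way across the Internet".encode("utf-8")
for number, packet in enumerate(packetize(message)):
    print(f"packet {number}: {packet!r}")
```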

The problem is that an imbalance of power has arisen. At some point in the last few years we handed over ever larger parts of our autonomy to supposedly life-enriching algorithms that make decisions for us. The Internet companies know more and more about us, while we understand their methods less and less.

So it is no wonder that one often reads that we live in an “algorithm culture” or even suffer under a “tyranny of algorithms”. That sounds almost like the well-known dystopias of science fiction, with their topos of a dark, autonomous force inherent in machines that at some point comes to light with the terrifying intention of subjugating humanity.

Judging machines

But even in reality, writes the Internet sociologist Zeynep Tufekci, we have long since reached the point where algorithms can instill fear in us. The world is at the beginning of an era of “judging machines”: machines that not only calculate the fastest way to sort a database or solve a mathematical equation, but also decide what is good, relevant, appropriate or harmful.

In 2016, algorithms lead a life of their own on the Internet and are seldom consciously noticed by the users whose lives they shape. They only become noticeable when they do not work as intended. The result can be absurd, as in 2011, when two automated pricing algorithms on Amazon repeatedly outbid each other in search of the most profitable price for a book. In the end, the programs had priced the book, which deals with evolutionary biology and fruit flies, at just under $24 million. There are many such examples of algorithms inadvertently going wrong.

And sometimes they are heartbreaking. Last year, for example, Facebook's automated year in review presented the web designer Eric Meyer with a photo of his daughter, who had recently died of cancer, under the caption: “It was a great year! Thank you for being there.” The algorithm selected the image because it had received many likes; the tragic context was simply not relevant to the system's selection. More resigned than angry, Meyer wrote at the time: “By definition, algorithms are thoughtless. They merely model decision-making processes; once you set them running, no further thought occurs. And yet we let these thoughtless processes loose on our lives.” Algorithms can be cruel in unintended ways, too.
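The Amazon episode is easy to reproduce in a few lines. The multipliers below are assumptions chosen only to mimic the reported behaviour (one bot pricing just below its competitor, the other well above); iterated day after day, the feedback loop escalates the price exponentially.

```python
# Illustrative sketch of the 2011 Amazon price spiral: two repricing bots,
# each setting its price relative to the other's. The starting price and
# multipliers are assumptions, not the actual values used by the sellers.

def simulate_price_war(price_a=30.0, price_b=30.0, days=60):
    for day in range(1, days + 1):
        price_a = 0.998 * price_b   # bot A: stay just below the competitor
        price_b = 1.27 * price_a    # bot B: price well above the competitor
        if day % 10 == 0:
            print(f"day {day:2d}: A = ${price_a:,.2f}   B = ${price_b:,.2f}")

simulate_price_war()
```

Neither bot ever checks whether the resulting price still makes sense, which is precisely the thoughtlessness Meyer describes.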

Is Resistance Still Possible?

So what can be done? How can we defend ourselves? Or rather: is resistance even still possible? “Nobody should become the object of an algorithm,” wrote Justice Minister Heiko Maas in a guest article in “ZEIT” last December. He continued: “Every algorithm is based on assumptions that can be false or even discriminatory. We therefore need an algorithm TÜV that guarantees the integrity of the programming and also ensures that our freedom of action and decision-making is not manipulated.”

Maas is neither the only one nor the first to make such demands. Viktor Mayer-Schönberger, legal scholar at the Oxford Internet Institute and author of the book “Big Data: A Revolution That Will Transform How We Live, Work, and Think”, calls for an “environmental impact assessment” for new algorithms. Just as the construction of a new power plant must ensure that its effects on the environment and local residents remain within tolerable limits, the same should apply to software. Except that in the case of algorithms distributed globally via the Internet, the environment is the entire world, and the residents are some three billion users.

Against this backdrop, the desire for a central supervisory authority is all the more understandable. That is how states and communities have long regulated, or at least managed, unwieldy problems. There are just three problems with it.

First, their algorithms are the Internet companies' best-kept secret. With them and through them they earn their money. IT corporations such as Google, Apple or Facebook will fight by every means against having to disclose their secret recipes, let alone have them regulated.

Second, the algorithms are by now far too complex for laypeople to understand at all. Up to 100,000 variables influence which content appears where in the Facebook news feed. “People overestimate the extent to which IT companies understand how their own systems work,” says Andrew Moore, dean of the school of computer science at Carnegie Mellon University and, until a year ago, a vice president at Google.

Third, a computer algorithm is unfortunately not a power plant that, once built and put into operation, sits in a green field and quietly emits away. Like all software, algorithms are volatile: constantly being improved, constantly changing. Google alone changes its search algorithm several hundred times a year - without users ever noticing.

And what's the alternative?

If Heiko Maas had his way, a cumbersome authority would step in every time and demand revisions. Anyone who calls for something like that has not understood the Internet. Participating in the global information network nowadays inevitably means exposing yourself to the rule of algorithms.

Ultimately, as the science fiction author Lee Konstantinou quite aptly wrote, the tyranny of algorithms is above all a tyranny of the past over the present. Reduced to bits and bytes and often taken out of context, yesterday dictates what happens today and tomorrow. The only alternative is to switch off.

The article was first published in the supplement to “taz. Die Tageszeitung” on February 27, 2016.

MaerzMusik - Festival for Time Issues 2016 will take place from March 11th to 20th, 2016.

On March 12, 2016, the performance of Annie Dorsen's “Yesterday Tomorrow”, the durational performance “The News Blues” by Nicholas Bussmann and the program “Algorithmic Composition” will be devoted to compositional collaboration between human and machine.
