
Technology in service of world-scale human-based information renormalization

April 3, 2021

One of the thoughts that has been going around in my head recently is that "technology/science has made faster progress than our values". It's a fun thought (I think I heard it in one of Jordan Peterson's lectures, and he was actually quoting somebody else). Changes have happened faster than we've had the chance to really understand them and properly scope their consequences. For example, that's why a lot of people are trying to scope AI/ML before it too gets out of hand.

There's also another interesting thought, that "life exists because the law of increasing entropy drives matter to acquire lifelike physical properties" (most likely I started thinking about it after reading the article A New Physics Theory of Life).

We are constantly battling against an ever-increasing rate of entropy growth. And since literally all of our endeavors are about parsing information, extracting useful bits, modelling processes and optimizing for those, i.e. they are all optimization processes working with information as the primary substrate (or one of the potentially many equivalent substrates), it's not illogical that one of our main goals is simply to be able to work with more information.

Networks are great at extracting important bits of information, because their non-linear effects multiply importance and help relevant bits be spotted more easily. As an optimization algorithm, social networks are a global search over the space of information, one that converges to a weak, unfair version of a local search algorithm at a later stage: attention eventually concentrates around whatever is already popular.
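Here's a minimal sketch, in Python, of how I picture that two-phase dynamic. Everything in it is invented for illustration (the item counts, the quadratic vote weighting, the phase split); it's a toy model, not how any real network works:

```python
import random

random.seed(42)

# Toy model: N items with a hidden "usefulness"; agents vote over T rounds.
# First quarter: global exploration (uniform random sampling of items).
# Rest: local exploitation (preferential attention to popular items),
# the non-linear feedback that multiplies whatever is already visible.
N, T = 500, 5000
usefulness = [random.random() for _ in range(N)]
votes = [1] * N  # seed every item with one vote so weights stay positive

for t in range(T):
    if t < T // 4:
        i = random.randrange(N)  # global phase: look anywhere
    else:
        # weight ~ votes^2: popularity feeds on itself
        i = random.choices(range(N), weights=[v * v for v in votes])[0]
    if random.random() < usefulness[i]:  # agents upvote useful items more often
        votes[i] += 1

top = sorted(range(N), key=lambda i: votes[i], reverse=True)[:5]
print("top items (votes, hidden usefulness):")
for i in top:
    print(f"  item {i:3d}: {votes[i]:4d} votes, usefulness {usefulness[i]:.2f}")
```

The quadratic weighting is the non-linearity: once the exploitation phase starts, votes attract votes, so the search "localizes" around early winners whether or not they were actually the most useful items. That's the weak, unfair part.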

It's interesting to think about technology as the mechanism that is supposed to help us parse all the information being generated. All the social networks, suggestion/recommendation algorithms, timelines and feeds, likes and votes are there to help us narrow the information down to what we consider useful.
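As a concrete (if oversimplified) example of that narrowing, here's a tiny feed ranker. The scoring function is loosely modeled on the often-quoted Hacker News gravity formula, and the items, numbers and cutoff are all made up:

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    likes: int
    age_hours: float

def feed_score(item: Item) -> float:
    # Recent, liked items bubble up; everything else sinks out of view.
    return item.likes / (item.age_hours + 2) ** 1.5

def build_feed(items: list[Item], k: int = 2) -> list[Item]:
    # The feed IS the filter: only the top-k items ever reach a human.
    return sorted(items, key=feed_score, reverse=True)[:k]

items = [
    Item("entropy and life", likes=120, age_hours=5),
    Item("yet another framework", likes=40, age_hours=1),
    Item("old classic", likes=900, age_hours=72),
]
for item in build_feed(items):
    print(f"{feed_score(item):7.2f}  {item.title}")
```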

We humans are the main part of the renormalization and filtering that we're executing over the world's data, but technology is the glue for all of it, amplifying our ability to process larger amounts of data.

Every piece of software performs a step in a multi-level renormalization filter that lossily compresses the overall data and removes supposedly unimportant bits, leaving us with what is supposed to help us make progress the fastest and most efficiently.
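In code, I imagine that multi-level filter as a composition of lossy stages, something like the sketch below. The stages and their rules are invented purely for illustration; real systems are obviously messier:

```python
from typing import Callable

# Each stage keeps some of what it receives and throws the rest away;
# composing stages gives the multi-level, lossy filter described above.
Stage = Callable[[list[str]], list[str]]

def keep_if(pred: Callable[[str], bool]) -> Stage:
    return lambda items: [x for x in items if pred(x)]

pipeline: list[Stage] = [
    keep_if(lambda s: len(s) > 3),        # spam filter: drop obvious noise
    keep_if(lambda s: "entropy" in s),    # topic filter: drop off-topic bits
    lambda items: items[:2],              # feed cutoff: drop the long tail
]

data = ["ad", "entropy and life", "cat pics", "entropy of feeds", "entropy 101"]
for stage in pipeline:
    data = stage(data)  # each pass is lossy: information is discarded for good
print(data)  # -> ['entropy and life', 'entropy of feeds']
```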

Tech also has an interesting property of helping us find new tech that will further improve our renormalization algorithm, i.e. tech is helping us improve and find new tech that will help us parse all of the data.

It's fun to think about this kind of system on a global level. Technology, as a (primarily software-based) combination of the applied scientific method and engineering, in service of helping us grasp the world around us before it eats us. Different people and communities, different companies and countries, each with their own implementation of renormalization for information processing, interconnected in a worldwide network of information filters racing against entropy.

Maybe the goal we're after is to develop technology so much that we're not just able to keep up with the new information being generated, but to get ahead of it. Only then will we have enough time and opportunity to really understand the consequences of our potential choices. But so far, it seems that we're being consumed by the data, by our inability to process all of it or even to process the technological changes we're creating in order to process more data. It's as if we've invented electricity but are still far from utilizing it for our own good; we're still at the stage of zapping ourselves over and over again and being masochistically excited when that happens.