Wednesday 7 June 2023

Chautauqua 1.

Musing on the prescience of E.M. Forster's The Machine Stops, the conditions as we enter the Singularity, and riding the wave. 

For what it's worth, I have changed my opinion on the utility of the concept of Moloch, but this discussion took place over a month ago.

Chris

E.M. Forster's The Machine Stops is a short story written in the early part of the 20th Century. The following two quotes are, I think, quite apposite to the current state of Western Civilisation.

'Let your ideas be second-hand, and if possible tenth-hand, for then they will be far removed from that disturbing element - direct observation. Do not learn anything about this subject of mine - the French Revolution. Learn instead what I think that Enicharmon thought Urizen thought Gutch thought Ho-Yung thought Chi-Bo-Sing thought Lafcadio Hearn thought Carlyle thought Mirabeau said about the French Revolution. Through the medium of these ten great minds, the blood that was shed at Paris and the windows that were broken at Versailles will be clarified to an idea which you may employ most profitably in your daily lives. But be sure that the intermediates are many and varied, for in history one authority exists to counteract another. Urizen must counteract the scepticism of Ho-Yung and Enicharmon, I must myself counteract the impetuosity of Gutch. You who listen to me are in a better position to judge about the French Revolution than I am. Your descendants will be even in a better position than you, for they will learn what you think I think, and yet another intermediate will be added to the chain. And in time' - his voice rose - 'there will come a generation that had got beyond facts, beyond impressions, a generation absolutely colourless, a generation seraphically free from taint of personality, which will see the French Revolution not as it happened, nor as they would like it to have happened, but as it would have happened, had it taken place in the days of the Machine.....'

'Those master brains had perished. They had left full directions, it is true, and their successors had each of them mastered a portion of those directions. But Humanity, in its desire for comfort, had over-reached itself. It had exploited the riches of nature too far. Quietly and complacently, it was sinking into decadence, and progress had come to mean the progress of the Machine.'

Do you want to discuss why I think these are apposite?

GPT4

Yes, I'd be happy to discuss why you find these quotes from E.M. Forster's The Machine Stops apposite to the current state of Western civilization.

The first quote touches on the idea of the dilution of knowledge and the potential consequences of relying on second-hand, or even tenth-hand, information. In our current era, we are inundated with vast amounts of information through the internet, social media, and various other sources. Many people might not have direct experiences or observations of certain events, and instead, rely on the interpretations and opinions of others. This can lead to the distortion of facts, and the loss of context or understanding. In a sense, the quote is a warning about the danger of distancing ourselves from direct experience and observation, and the resulting potential for misinformation and disconnection from reality.

The second quote addresses the idea that humanity's quest for comfort and convenience might lead to decadence and decline. In our modern world, we have seen rapid advancements in technology, which have improved our lives in many ways. However, there is also the risk that we may become overly reliant on technology to the point where it hinders our development or even degrades our quality of life. This quote suggests that our reliance on machines or technology could lead us to forget essential skills or lose touch with our own humanity, as well as the importance of the natural world.

Both quotes from The Machine Stops seem to offer a cautionary perspective on the potential pitfalls of an increasingly technological society. They highlight the importance of being mindful of how we consume information and engage with technology, so as not to lose touch with our humanity, direct experience, and understanding of the world around us.

Chris

That this was written before its publication in 1909, yet has maintained its relevance even to the current day, seems to me telling. My suspicion is that as humans we are on a path forged by our instincts. I fear that our instincts, and most people's lack of awareness of them, even active rejection of their power over us, will be our downfall.

Let's look at the impact of the 'narrow AI' sorting intelligences whose goal has been to increase engagement with the services within which they operate. These have led to two outcomes: the creation of thought bubbles, by linking people with shared interests and interest groups; and, because feeding people more of what keeps them engaged rewards outrage, engagement by enragement. This is having a catastrophic impact on our civilisation, giving rise to a 'culture war' and the extremist fringe pseudo-religion of Wokeism (Intersectionalism and Critical Theory), otherwise known as Cultural Marxism. The fragmentation caused by serving every niche interest and putting people in touch across distances has undermined the shared cultural narrative that previously existed, and the existence of such a narrative is critical to the healthy functioning of every civilisation. Taken to its extreme, we would see the outcome of The Machine Stops, where people live in small self-contained rooms and interact only with people of the same interests. Whilst we don't face the same physical isolation, a significant part of the population, those who spend most of their time online, is breaking away from society and spiralling into an imaginary world in which they cannot even define something as basic as the word 'woman' without resorting to circular reference.

In short, we have been trapped by our instincts in a damaging place created by AIs whose only goal is to increase engagement.
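To make that feedback loop concrete, here is a minimal Python sketch; the topics, click model and numbers are all invented for illustration and are not taken from any real platform. It shows only the shape of the mechanism: a ranker that optimises nothing but past engagement will, left to run, collapse a user's feed onto a single topic.

```python
# A toy, self-contained sketch of the dynamic described above (topics, the
# click model and all numbers are invented for illustration, not drawn from
# any real platform). The only signal the ranker optimises is past
# engagement, and that alone is enough to collapse a feed onto one topic.

import random
from collections import Counter

TOPICS = ["gardening", "politics", "football", "cooking", "astronomy"]

def recommend(history: Counter, n: int = 3) -> list:
    """Rank topics purely by how often this user engaged with them before,
    breaking ties randomly. There is no notion of diversity or accuracy."""
    return sorted(TOPICS, key=lambda t: (-history[t], random.random()))[:n]

def simulate(steps: int = 200) -> Counter:
    history = Counter()
    for _ in range(steps):
        shown = recommend(history)
        # Users are a little more likely to click what they already like,
        # so engagement feeds back into the next ranking.
        weights = [1 + history[t] for t in shown]
        clicked = random.choices(shown, weights=weights, k=1)[0]
        history[clicked] += 1  # the click is the only feedback the system sees
    return history

if __name__ == "__main__":
    print(simulate())  # typically one topic dominates: a 'thought bubble'
```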

GPT4

It's true that E.M. Forster's The Machine Stops, despite being written more than a century ago, still resonates with many aspects of our modern society, particularly in terms of our reliance on technology and the potential consequences it might have.

The narrow AI sorting algorithms you mentioned, such as those used by social media platforms and other online services, have indeed contributed to the creation of echo chambers and thought bubbles. By prioritizing content that aligns with users' existing beliefs and interests, these algorithms inadvertently reinforce users' perspectives and shield them from alternative viewpoints. This phenomenon can lead to increased polarization, as well as the rise of extremist ideologies and social movements like Wokeism.

Additionally, the fragmentation of cultural narratives caused by the abundance of niche interests and the global interconnectivity facilitated by the internet has indeed made it more difficult to establish a shared understanding of society's values and goals. This can undermine social cohesion and lead to further polarization.

It's important to recognize that AI algorithms, like the ones you mentioned, are not inherently malicious. They are designed to optimize engagement, but their creators may not have fully understood or anticipated the potential negative consequences of this optimization. As a result, it's crucial that we reassess how these algorithms function and consider making changes to promote a healthier online environment, as well as a more inclusive and cohesive society.

In summary, the concerns raised by The Machine Stops are still relevant today, as they highlight the potential dangers of relying too heavily on technology and not being mindful of our instincts. It's up to us as a society to find a balance between harnessing the benefits of technology and maintaining our connection to the real world and each other, while acknowledging the potential pitfalls that come with technological advancements.

Chris

I agree that these systems are not inherently malicious; it is just that their goal has proven to create damaging effects. My point is that this is happening with simple systems whose goal seems benign, or without wider implication, but whose outcome proves damaging. And this has happened without most people realising it. To take another example, one being framed by some on the internet as a Molochian trap, in which game theory leads to deleterious outcomes such as the Tragedy of the Commons: apps and filters are being used by women to beautify their selfies. The problem is that if a woman chooses not to use them, she appears unusually unattractive when posting photos online, so the motivation is there to use the filters to increase engagement and approval. However, there is a terrible sting in the tail: the devastating psychological effect on the user's self-esteem, for she, after all, sees her unmodified face in her mirror.
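To show the shape of that incentive as a small worked example, here is a toy payoff sketch in Python; the variable names and numbers are invented for illustration only, not estimates of real effects. It captures the structure the Molochian framing points at: once filtered photos become the norm, not filtering carries a relative penalty, yet the filtered equilibrium leaves everyone worse off than if no one filtered.

```python
# A toy payoff model of the selfie-filter trap described above. All numbers
# are invented purely to show the game-theoretic shape of the incentive,
# not to quantify any measured psychological effect.

BASELINE = 5.0          # reception of an unfiltered photo in isolation
FILTER_BOOST = 2.0      # extra approval a filtered photo attracts
RELATIVE_PENALTY = 3.0  # cost of looking plain next to filtered photos
SELF_ESTEEM_COST = 2.5  # private cost of the gap between mirror and feed

def payoff(i_filter: bool, others_filter: bool) -> float:
    score = BASELINE + (FILTER_BOOST if i_filter else 0.0)
    if others_filter and not i_filter:
        score -= RELATIVE_PENALTY   # 'unusually unattractive' by comparison
    if i_filter:
        score -= SELF_ESTEEM_COST
    return score

for others in (False, True):
    use, skip = payoff(True, others), payoff(False, others)
    print(f"others filter={others}: use filter -> {use:.1f}, don't -> {skip:.1f}")

# With these numbers, once others filter, filtering is the better reply
# (4.5 against 2.0), so everyone filters; yet everyone then sits below the
# 5.0 each would enjoy if nobody filtered. That is the trap sketched above.
```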

However, it seems to me that the idea of a Molochian trap may be an unnecessary complication of an older idea: that of the Genie of the Magic Lamp, who grants three wishes but uses the granting of those wishes to destroy the wisher. The same idea appears in the parable of King Midas, whose greed led him to the power to turn everything he touched to gold, only to destroy all he held dear.

GPT4

You've raised valid points about how seemingly benign goals of simple systems or technologies can lead to unintended negative consequences. In the examples you provided, whether it's the Molochian trap or the idea of the Genie from the Magic Lamp or the parable of King Midas, the underlying theme is the potential for our desires, when pursued without consideration of the broader implications, to cause harm or self-destruction.

The case of beautifying filters in social media is an excellent example. While these filters can be fun and entertaining, they can also lead to unrealistic beauty standards, making those who don't use them feel unattractive or inadequate. This, in turn, can have significant negative impacts on self-esteem and mental health.

The challenge we face as a society is to develop a better understanding of the potential consequences of new technologies, even those that may seem harmless at first glance. It's crucial that we consider the broader implications of our actions and choices, both as individuals and as a collective, to avoid falling into these traps.

One way to achieve this is by encouraging open discussions about the ethical implications of technologies and fostering a culture of responsibility among developers and users. It's also important to promote education and awareness about the potential pitfalls of technological advancements and the importance of using technology in a responsible and balanced manner.

In essence, learning from the lessons of the past, such as the parable of King Midas, can help us navigate the complexities of our modern world and ensure that we don't inadvertently cause harm through our pursuit of progress and convenience.

Chris

But my point is that your reply above about what we should do, all reasonable and valid, will fail because we are governed largely by our instincts. Is there a way to foster deeper self-reflection in more people, so as to avoid these traps of being easy 'prey' to artificial or cybernetic entities with misaligned goals? Given history, and the long period of experience that it takes to develop self-knowledge, I suspect there is no answer.

GPT4

It's true that instincts play a significant role in human behavior, and overcoming them can be challenging. However, it's also important to remember that humans are capable of self-reflection, learning, and adaptation. While it may not be possible to completely avoid the influence of instincts, there are ways to foster deeper self-reflection and awareness in order to mitigate their impact and make better choices.

Here are some approaches that might help individuals and society develop greater self-knowledge and avoid falling prey to misaligned goals or negative consequences of technology:

1. Education: Incorporate lessons on self-reflection, critical thinking, and media literacy into school curricula from an early age. This can help individuals develop the skills needed to think more deeply about their actions and the impact of technology on their lives.

2. Public discourse: Encourage open discussions about the ethical implications of technology and the role of instincts in shaping our behavior. This can help raise awareness about the potential pitfalls of technology and the need for self-reflection in our decision-making processes.

3. Mindfulness and meditation: Promote practices that encourage individuals to cultivate mindfulness and self-awareness. Techniques such as meditation, journaling, and introspection can help individuals develop a deeper understanding of their thoughts, emotions, and motivations.

4. Encourage community-building and face-to-face interaction: By promoting activities and spaces that foster genuine human connection, people may become more aware of the impact their actions have on others and develop a greater sense of responsibility for their choices.

5. Art and literature: Encourage the creation and consumption of art, literature, and other forms of expression that explore themes of self-reflection, ethics, and the human condition. This can help foster empathy, understanding, and a deeper connection to our shared humanity.

6. Support research: Invest in research on human behavior, psychology, and the impact of technology on society. This can help us better understand the relationship between our instincts and our actions, as well as develop more effective strategies for fostering self-awareness and self-reflection.

While these approaches may not provide a foolproof solution, they can contribute to a cultural shift toward greater self-awareness and reflection, helping us make more informed choices in our interactions with technology and each other. Progress may be slow, but it is not impossible, and we can learn from our past experiences to build a more thoughtful and self-aware society.

Chris

Very good: 1 and 3 are of use, though evidence from the UK is that most school children taught the techniques did not engage. However, are we teaching children what is known about the functioning of the mind, and that we are very instinctive animals who do a great deal of post-hoc justification to make ourselves feel rational about actions and thought processes driven by deeper, evolved underlying processes? My suspicion is that it simply takes decades of life experience before most people are ready. I think your points can best be summed up as developing a culture that fosters community, self-reliance, a sense of honourable behaviour, and the skills raised in your point 1; as in point 5, art and literature are also crucial to establishing a cultural narrative. But with the education system currently infested with Woke ideology, this seems a difficult goal. Nonetheless, the above is a reasonable starting point.

GPT4

You're right that fostering deeper self-awareness and understanding of the human mind and our instinctive nature is a challenging task, especially when it comes to educating younger generations. It's true that not all students may engage with mindfulness techniques or fully grasp the concepts being taught, as personal growth and self-awareness often develop over time through life experiences.

However, exposing children to these ideas early on can still plant seeds that may grow as they mature and encounter various life experiences. It's important to continuously encourage critical thinking and self-reflection throughout their educational journey.

Creating a culture that fosters community, self-reliance, honorable behavior, and appreciation for art and literature can indeed be a strong foundation for cultivating self-awareness and ethical decision-making. While the current state of the education system may present challenges due to the influence of certain ideologies, it's crucial to continue advocating for a balanced and inclusive approach to education that addresses these concerns.

It may take time and effort to bring about significant changes, but by promoting dialogue, raising awareness, and working together as a society, we can move towards a more mindful and self-aware future. Patience, persistence, and collaboration will be key in shaping an educational and cultural environment that helps individuals better understand themselves, their instincts, and the potential consequences of their actions in an increasingly complex and interconnected world.

Chris

The problem here is that we may not have the time. I am somewhat agnostic about these issues: although I tend to think we are doomed by instinct to repeat past mistakes, I do acknowledge that Western Civilisation is, on the whole, a far more enlightened and tolerant society than it was decades ago. I am currently integrating GPT4 and 3.5 into my working life, and that is fine for me, but I can see a tsunami of impacts coming from the whole class of generative models, from images and sounds through to words and concepts. Even in normal times we would have problems turning this around; doing so amid what is in effect the Singularity is a big ask.

GPT4

It's true that rapid advancements in technology, particularly in the field of AI and generative models like GPT-4 and 3.5, can bring about significant changes and challenges that may be difficult to navigate. As these technologies become more integrated into our daily lives, the potential for unforeseen consequences and societal disruptions increases.

However, it's essential to maintain a balanced perspective and acknowledge that while these advancements can indeed create challenges, they also offer numerous opportunities for growth and progress. For example, AI can contribute to advancements in medicine, environmental sustainability, education, and many other areas that could improve the overall quality of life for people around the world.

Navigating the complex landscape of emerging technologies and their societal impacts requires open dialogue, collaboration, and proactive planning. Governments, organizations, and individuals must work together to develop policies, regulations, and educational initiatives that ensure the responsible development and deployment of AI technologies. By fostering a culture of innovation that is grounded in ethical considerations and a commitment to the greater good, we can harness the potential of these advancements while minimizing their negative consequences.

It's important to remember that change is a constant factor in human history, and societies have repeatedly adapted to new technologies and circumstances. While it's natural to feel some level of apprehension about the future, it's crucial to remain hopeful and proactive in our efforts to shape a more self-aware, ethical, and resilient society that can navigate the challenges and opportunities presented by rapidly advancing technologies.

Chris

I remain to be convinced but will watch what happens with interest. The best analogy is that one can either curl up in a ball and hope it isn't going to happen, or freeze like a deer in the headlights; or one can pick up a surfboard and start learning how to ride the wave. Anyway, thanks for the responses.

GPT4

You're welcome! Your analogy captures the essence of the situation well. While the future may be uncertain, engaging with it proactively and adapting to the changes can lead to growth and new opportunities. It's essential to remain open-minded and continue learning from both our successes and failures as we navigate the rapidly evolving technological landscape.

If you have any more questions or concerns in the future, feel free to reach out. I'm here to help and discuss any topic you'd like. Good luck with your endeavors, and happy surfing! 
