Jamie:
Welcome to the list -- so let me mention a few things that we've
recently been discussing that you might find interesting.
One of them is the shift from an ELECTRIC "paradigm" to a DIGITAL one
(c. 2000) -- which aligns with Gregg's vision that we are on some sort
of a "precipice" and that it is this change that will allow his
"unified theory" to gain wider acceptance.
These *paradigms* come from the "technium" (a Kevin Kelly term) and
they "shape our behavior and and attitudes." Kelly made Marshall
McLuhan the "Patron Saint" of Wired Magazine, and my Center is based
on McLuhan's work, so I wonder if you've had a chance to look at any
of what he said?
The problem of "authority" (or, if you will, "totalizing systems") is
one that we are going to face -- big time. Throughout our lives, we
have been told that we are "free" (i.e. anti-authority) but, as many
suspect, that was largely an "engineered" fantasy (underpinning the
Cold War &c.)
In 1941, Gregory Bateson commented on a presentation by his then-wife,
Margaret Mead, about what was needed in "psychological warfare" terms.
He suggested a "maze in which the anthropomorphic rats have the
illusion of free-will" and, right on schedule, much of cognitive
psychology (and philosophy) came to the conclusion that we really
don't have anything of that sort (but we'll pretend that we do anyway,
leading to "compatibilism" &c.)
You seem to be trying to figure out what happens under DIGITAL
conditions (which is indeed what we all need to do), while using the
same language that was current under ELECTRIC conditions (i.e. where
most of your references come from.)
Have you considered that those folks you've been reading were trying
to "solve" a different *paradigm* (which is now obsolete) and that we
need a new "language game" for our new circumstances . . . ??
Mark
Quoting Mathew Jamie Dunbaugh <[log in to unmask]>:
> Hi Chance,
>
> A Singleton doesn't have to be an autocrat. The single decision-making
> agency could emerge out of the shared intentions of the world, such as a
> collective intelligence manifesting on the technium. The Moral Apex is the
> unified body of knowledge, norms, and purpose or intention, along with the
> unification of humanity. This could go along with a centralized
> intelligence mediating everything, but I'm more inclined to think it will
> simply be the evolution of the technium/web.
>
> It sure seems that divisive tribalism is the norm right now, but I suspect
> that it's merely a resistance to a larger trend towards cosmopolitanism and
> globalization. We aren't fighting any major wars and there aren't any
> serious conflicts between groups. I suspect that the Technium is slowly
> gathering us all together to participate in global decision-making.
>
> Consider how self-driving cars have to decide whom to hit if they are forced to
> drive through a group of people. Ultimately, we have to build absolute
> values into the technium. This might seem terrible, and it could be, but I
> think it's forcing us to think very hard about what constitutes a
> just society. Moral relativism has nowhere to go. So because we are
> building this techno-social system that's gradually reprogramming society,
> I think we're more likely to program a techno-social system that works in
> the most universal interests. As long as a totalitarian surveillance system
> doesn't threaten those who resist it, the system will evolve along
> the path of least resistance. But in the process, we have to build in
> absolute values and our collective intentions (the meaning of life).
>
> I don't think we'll ever become so totalitarian that we lose free
> speech; that would be the cause of a downfall. Every trend shows
> exponential growth towards complexity and integration. I think that the
> technium, and the moral apex, will be made out of shared intentions. There
> will be a great deal of social engineering by people at the top, and it's
> shocking how fast people can be socially engineered when you consider
> how many Republicans like Putin now. I'm just inclined to believe
> that things will continue to get better as they have so far. At the same
> time, I am worried about hyper-Orwellianism, but I don't think it will turn
> out that way.
>
> Max Tegmark has a great essay on how a company will likely end up taking
> over the world with an AGI, by controlling the media, in his new book Life
> 3.0. You can read it here:
> http://nautil.us/issue/53/monsters/the-last-invention-of-man
>
> It seems plausible to me.
>
############################
To unsubscribe from the TOK-SOCIETY-L list:
write to: mailto:[log in to unmask]
or click the following link:
http://listserv.jmu.edu/cgi-bin/wa?SUBED1=TOK-SOCIETY-L&A=1