Social Singularities

Manwë was known for many things, but wisdom and power are two that lead the rest of his attributes. Join the Councils and discuss the more weighty matters of Tolkien Fandom.

Social Singularities

Postby Jnyusa » Sun Jan 24, 2016 4:50 pm

This is the mental equivalent of going for a run in the park, a non-consequential exercise for the fun of it. The topic - Singularity in the Social Sciences - is one that I discuss with my students when I feel they need a break from equations. Allow me to simply copy the definition/explanation of Singularitarianism - i.e. the optimistic view of the singularity - from Wikipedia rather than paraphrasing it:

Wikipedia wrote: Inventor and futurist Ray Kurzweil, who predicts that the Singularity will occur circa 2045, greatly contributed to popularizing Singularitarianism with his 2005 book, The Singularity Is Near: When Humans Transcend Biology.

What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future. To truly understand it inherently changes one's view of life in general and one's particular life. I regard someone who understands the Singularity and who has reflected on its implications for his or her own life as a “singularitarian.”


The concept of singularity is usually associated with artificial intelligence. Machines will become so human that human life as we know it will no longer be possible. This was the way it was presented by John von Neumann, one of the founding figures of modern computing. Emphasis is on the words 'as we know it,' implying not that humans would be terminated but that their culture would be changed so profoundly that future generations would be unrecognizable to past generations. In the slightly broader approach taken by Kurzweil, one does not have to assume the actual achievement of 'humanoid' machines; any technological change, particularly change experienced as abrupt transformation, could have an irreversible, recognition-threatening impact on human culture. As expected, not all futurists agree that this is a good thing.

Then, you have folks like Michio Kaku, physicist, who, while liberal in his estimate that amazing things such as 'Beam Me Up, Scotty' will achieve prototype form possibly within the lifetime of our younger posters, is quite conservative in his predictions about artificial intelligence. Hundreds of years away, is his estimate. Certainly not decades.

I am inclined to agree with Kaku and go one further by saying that it will never happen, not in the form that von Neumann envisioned, unless there is a fundamental taxonomic change in the way we perceive, generate, and manipulate information. But in the broader sense that Kurzweil allows, I would say that it has already happened several times within my lifetime, and that is what I ask my students to discuss; because I grew up on one side of several singularities, and they grew up on the other side. Are we incomprehensible to one another? If yes, how wide is the chasm? If no, what is it that has allowed us to span the chasm together?

Let me take on the strict construction of singularity first: can machines become human? Or become so like humans that the human-machine relationship and the corresponding human-human relationship will be fundamentally changed?

Well, some of you here enjoy bashing science, so let me bash science for a moment. Because science is so specialized it is admittedly vulnerable to tunnel vision. Computer science, information science, is particularly vulnerable, I think, because there are so few physical constraints in the virtual world. But the consequence of information always takes physical form - if it does not, then it is by definition inconsequential - and the minute we have physical form we have physical limitation. During the dot.com bubble/bust of 2000, the main reason for business failure was that the dot.com entrepreneurs forgot that they had to deliver physical inventory, and there are rules that go with physical inventory. All that boring, managerial, logistic stuff that they were told didn't matter any longer in the information age turned out to matter quite a lot.

So what are information scientists overlooking that makes me suspect their conclusions? Depreciation.

Machines, whatever their self-learning capability, are made out of non-living resources. They don't regenerate naturally. They depreciate. Hybridization? - the wet-wiring of humans? - possibly that will happen, but it will always be the organic component that allows mutation and natural selection to be the drivers of evolution. And the great, superlative advantage of mutation and natural selection is that they are not structured as binary systems. They rely, as near as we can figure, on randomness. And randomness is infinitely richer than binary choice. As long as information is reduced to binary code the top-most command will be an either-or command. That is not true for organic change. Organic change is Chaotic (per Robert May, biologist). Nature organizes information in a way that is taxonomically different from the way human deduction organizes information (per Gregory Bateson). Bateson, also a cyberneticist, went so far as to say that information transmittal in nature more resembles the free association of schizophrenia than it resembles ordinary human thought.
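If you want to see what May meant by Chaotic, here is a toy illustration of my own (the parameter values are arbitrary): his logistic map, x(n+1) = r*x(n)*(1-x(n)), a single deterministic line of arithmetic in which two starting points differing by one part in a billion end up bearing no resemblance to each other.

Code: Select all
# Robert May's logistic map: deterministic, yet effectively
# unpredictable in the long run for r near 3.9.

def logistic_orbit(r, x0, n):
    """Iterate x -> r*x*(1-x) n times and return the final value."""
    for _ in range(n):
        x0 = r * x0 * (1 - x0)
    return x0

# Two starting points differing by one part in a billion:
print(logistic_orbit(3.9, 0.200000000, 60))
print(logistic_orbit(3.9, 0.200000001, 60))
# After 60 iterations the two trajectories have completely diverged.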

Paradoxically, what randomness allows is endless failure. And for that to result ultimately in success, you need lots and lots of time. We could, probably, maybe?, program a non-regenerating physical system to self-teach and evolve chaotically, but neither depreciation nor human investment parameters could withstand the millions of trials and waiting time, with no outcome certain, required for a chaotic system to come up with a 'desirable' outcome. The fundamental teleology of human economy demands determinism, and I do not see determinism ever producing the kind of singularity von Neumann suspected. I do not think we will ever get machines to approximate human thought. That's point number one.
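Just to put a number on the waiting-time problem - a back-of-envelope calculation of my own, with deliberately toy figures:

Code: Select all
# Blind trial-and-error over even a tiny design space: a 20-symbol
# 'design' drawn from a 27-symbol alphabet, sampled uniformly at random.

alphabet, length = 27, 20
expected_trials = alphabet ** length      # ~4.2e28 trials to hit one target
seconds_per_year = 3.15e7
years = expected_trials / 1e6 / seconds_per_year
print(f"{expected_trials:.1e} trials; {years:.1e} years at a million trials per second")
# Roughly 1.3e15 years - about a hundred thousand times the age of the universe.

No depreciation schedule, and no investor, survives that kind of arithmetic.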

Point number two bends in the opposite direction. The fact that singularity would not take the form of artificial intelligence does not mean that certain technological changes cannot alter our self-perception so radically that communication across the cultural rift is strained nearly to impossibility.

So I relate to my students the three events in my lifetime that I believe qualify as singularities, events that so profoundly influence one's frame of reference that we are no longer, in a social sense, the same species - and by 'we' I mean my students and myself, since my perception-forming years were lived before the singularities.

1. Being able to see the Earth from outer space: That we lived on a rock orbiting the sun was known to my generation as a theoretical construct, but their generation was born seeing it on TV. I believe that the human-planet relationship, the human-habitat relationship is fundamentally different for them than it is for me. Exactly how it is different is something we can explore together, must explore together in fact, because I can't take their world-view for granted.

2. Computers that speak human language: When I was in college and graduate school I had to learn three foreign languages in order to use a computer. Then, in 1984, the computers began to speak our language. The only computers my students have ever known spoke a human language. They have never had to learn computer language, unless they wanted to become programmers. The computational and information power that this put into the hands of ordinary people is inestimable, imo. So their relationship to information is very different from mine. They are more trusting of digitally stored and transmitted information, for one thing, I think. And they are more vulnerable to forgetting that their computers rely on physical infrastructure that they cannot see, e.g. electricity. And it is hard to imagine information being scarce, from within their frame of reference.

3. The Millennium: whether or not you amuse yourself with petty arguments over when the new millennium truly began, institutional recognition is given to January 1, 2000, and that day is the first time in human history that every human culture on earth observed the same calendar date as being significant, as being the start of an era. The fact that tribal calendars continue to exist is irrelevant. We sat and watched our TVs as midnight arrived on the island of Kiribati and said to ourselves as a planet that a new millennium had begun. The perception that changed as a result of this is not our perception of calendars but our perception of our species as being "the same" everywhere, subject to the same forces of time and mortality, inhabiting a geography calibrated to a single metric (the time zone), unified by our participation in an era. There were people on Earth who may have been excluded from the celebration of this event, but nowhere were people excluded from the knowledge of this event. I submit that by virtue of being born shortly before the millennium so that this framework is their framework for understanding humanity-in-time, my students conceive of the human race as an a priori unity, whereas for me our oneness will always be a chosen perspective that is not shared by all my contemporaries.

These were iconic, culture-transforming events, global, species-wide in their impact. But the consequences have been very different from what futurists predict for singularities. They have disrupted human culture but they have not created an unbridgeable chasm between past and future.

And you can't define a singularity tautologically. You can't just say, well, if it didn't create an unbridgeable chasm then it wasn't a singularity. The question really is whether there is any event, short of sudden annihilation of the human species, that would create such a terminus. And I think that the answer to this question is also "no." And again it is the status of humans as living organisms that prevents this. There are always some of us who live before and after the event, who bridge the two worlds and can translate between them. The only requirement, I think, is that we realize that it is not just factual knowledge that has changed but frame of reference.

I've been thinking a lot recently about communication across cultural boundaries and that's where this came from. Any and all opinions, readings, personal experiences are welcome.

Re: Social Singularities

Postby LleuLlewGyffes » Tue Jan 26, 2016 5:24 pm

Beautifully stated, irrespective of quibbles. However, I feel the most recent singularity that affects us all is 9/11. There was a world phase shift after that event, and the consequences are still reverberating. It ushered in a new narrative in which state conflict was usurped by non-state actors. It set the stage for a new reality of borderless conflict, of intrusion irrespective of "international law". To be exact, it was not the act itself that demolished the "ancien regime", but the response. It was a casus belli for wars of aggression that, previously, would have been deemed "illegal".

Deserving of note, at least, I think.

Re: Social Singularities

Postby Storyteller » Wed Jan 27, 2016 1:34 am

I think what is often being missed in discussions of technology's development direction is the fact that technology - and the science behind it - does not develop spontaneously and - more importantly - does not get adopted spontaneously. It gets created and adopted as an answer to social demand. Singularity in the "strict" sense will not happen because human society does not want it, simple as that.

Once upon a time, in discussions with a fellow gadget aficionado, I predicted the failure of Google Glass and the rise of the smartwatch. The reasoning was threefold. First, fashion and social acceptability. When it comes to wearable tech, low visibility beats high visibility every time for social reasons. Anything you wear is a fashion accessory and is subject to the same social perception as conventional glasses, watches, handbags etc. Smartwatches conformed to and could be customized for fashion demands far more easily than Google Glass, and were less visible and in-your-face.

Second, there was a clear demand for a smartphone accessory that would enhance people's interaction with phones to ease notification fatigue. (I didn't think back then of fitness tracking, but in hindsight demand for that was also in place; electronic pedometers had been produced since the 1980s, and integration of pedometer functionality into MP3 players and phones began around 2004.) There was no clear demand for a HUD in one's day-to-day life; Google Glass was an attempt to create a demand, not to answer an existing one.

Third, convenience. A device worn on your eyes is a hassle when it isn't a help; a device worn on your wrist is out of the way until you need it.

From that viewpoint, I don't see much use in the near future for computers that understand and speak natural language. It's a sideshow feature. Once the "coolness" factor wears off, natural language does not actually ease interaction with computers; it represents more of a step back from graphical user interface towards command line interface. That's why Siri, Cortana and Google Now remain little more than toys that get forgotten by device owners after a short while; unless the circumstances force hands-free operation, it is both faster and more convenient for people to thumb away at app icons and menu choices.

I think technology changes our societies more subtly and more unpredictably than we expect it to, and its impact differs widely between cultures. Smartphones - the defining technology of the 2010s - had a vastly different impact in African societies that had little to no landline phone infrastructure and little to no personal computer ownership than they did in Western Europe.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939

Re: Social Singularities

Postby hamlet » Wed Jan 27, 2016 8:16 am

Storyteller wrote:I think technology changes our societies more subtly and more unpredictably than we expect it to, and its impact differs widely between cultures. Smartphones - the defining technology of the 2010s - had a vastly different impact in African societies that had little to no landline phone infrastructure and little to no personal computer ownership than they did in Western Europe.


It's probably equally important to realize just how much technological development is driven by business models, fashion trends, and marketability. New smartphones are released every year - or every 6-8 months if you're an iCultist - and people are absolutely convinced that they have to have it even if any developments in the new units are purely cosmetic or, at best, marginally incremental (e.g., my iPhone 5s that Verizon foisted upon me is just as capable as the new iPhone 6+ and differs only in size and appearance for the most part, but mine was free and the other was $250 after a several-hundred-dollar discount).

People want the latest and greatest regardless of what actual utility it brings to them. It's cool. It's trendy. Actual innovation be damned, get me my new toy! Hell, I've seen people wearing two smartwatches, one on each wrist, to go with both their iPhones or, in one particularly puzzling case, to display different data from the same iPhone. And Apple (and Samsung and Motorola etc.) will continually cater to that.

Touch screen technology is also the new trend. It's great for a smart phone, where you don't have or want additional input devices floating around to be lost or damaged. It's good for tablets for pretty much the same reason. But it's hugely obnoxious for business machines, where 99% of the interaction with your computer at a fixed desk (excluding folks who travel from job site to job site and for whom a tablet does make sense) is via keyboard and mouse, the most efficient and unbeatable means of inputting data and interacting with it on screen that exists. Touch screen business machines are horrifying (trust me, I know, they "gifted" me one last year and I had to demand my old machine back), but they get pushed anyway because the concept of touching your screen seems newer than relying on technology that, while still extremely efficient, is 40 years old or more.

Technology development, at least in a non-military environment (and maybe a scientific environment), is driven largely by convincing people to buy your crap rather than filling a need. I would actually argue that very few folks in a First World nation actually need a smartphone, let alone a new one every year. Why do we need a computing device with us at all times? When did we become so busy that whatever it is we wanted to do couldn't wait till we got back to our computer, which is dramatically more capable than most cell phones nowadays anyway? Why? Because the commercials and trendiness told us we did, and that's what's driving the development of new gadgets.

Which is, of course, a great simplification. Naturally, somebody who manages to fill an actual need will see his invention adopted quickly and broadly, but it will, at the end of the day, go hand in hand with folks wanting to buy the newest toy.

Re: Social Singularities

Postby Storyteller » Wed Jan 27, 2016 12:31 pm

hamlet wrote:It's probably equally important to realize just how much technological development is driven by business models, fashion trends, and marketability. New smartphones are released every year - or every 6-8 months if you're an iCultist - and people are absolutely convinced that they have to have it even if any developments in the new units are purely cosmetic or, at best, marginally incremental (e.g., my iPhone 5s that Verizon foisted upon me is just as capable as the new iPhone 6+ and differs only in size and appearance for the most part, but mine was free and the other was $250 after a several-hundred-dollar discount).

"Free" it wasn't. And that other one wasn't $250 either. Factor in the hidden costs of the plan your phone comes with.

I would agree that technology development is largely driven by marketability and fashion trends, but ultimately whether a new device class gets adopted or fails is a matter of whether or not it finds a usability niche. Otherwise sales tank no matter how much one invests in marketing. Looking back, it's actually surprising just how many technologies die a quick death after a glorious start. The aforementioned Google Glass is one example.

hamlet wrote:People want the latest and greatest regardless of what actual utility it brings to them. It's cool. It's trendy. Actual innovation be damned, get me my new toy! Hell, I've seen people wearing two smartwatches, one on each wrist, to go with both their iPhones or, in one particularly puzzling case, to display different data from the same iPhone. And Apple (and Samsung and Motorola etc.) will continually cater to that.

Well yes, people do want the latest and greatest when lower-tier devices could suffice. iPhones are vanity items, like Michael Kors handbags. (I do disagree about Motorola; they're a different and interesting case in the phone world.)

hamlet wrote:I would actually argue that very few folks in a First World nation actually need a smartphone, let alone a new one every year. Why do we need a computing device with us at all times? When did we become so busy that whatever it is we wanted to do couldn't wait till we got back to our computer, which is dramatically more capable than most cell phones nowadays anyway? Why? Because the commercials and trendiness told us we did, and that's what's driving the development of new gadgets.

That is not true at all. A smartphone is so necessary in the modern world that even refugees have a hard time without one.

(For an awful lot of people, including in the First World, their smartphones are their main - and even only - gateway to the Internet and the possibilities it offers).

P.S. Back to the subject of technological singularity, there are two technologies that have the potential for a spectacular social impact in the short term. One is the research into direct brain-computer interfaces. Direct neural connection between the human brain and computer chips could have a wide-ranging spectrum of effects - it could help the blind see, the deaf hear, and it could ultimately allow the direct exchange of information between human brains - images, thoughts, emotions - via electronic bridges. The other technology is automated translation, like the recently launched Skype Translator. It is pretty amazing to think of the language barrier coming down PERMANENTLY and UNIVERSALLY.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939
User avatar
Storyteller
Mariner
 
Posts: 7056
Joined: Sat Aug 31, 2002 7:46 am
Top

Re: Social Singularities

Postby hamlet » Wed Jan 27, 2016 1:30 pm

Storyteller wrote:
hamlet wrote:It's probably equally important to realize just how much technological development is driven by business models, fashion trends, and marketability. New smartphones are released every year - or every 6-8 months if you're an iCultist - and people are absolutely convinced that they have to have it even if any developments in the new units are purely cosmetic or, at best, marginally incremental (e.g., my iPhone 5s that Verizon foisted upon me is just as capable as the new iPhone 6+ and differs only in size and appearance for the most part, but mine was free and the other was $250 after a several-hundred-dollar discount).

"Free" it wasn't. And that other one wasn't $250 either. Factor in the hidden costs of the plan your phone comes with.


Yes, I am quite aware, and when I said "free" I did indeed mean "at no additional cost to myself beyond what I'm already paying anyway." And even that wasn't quite correct, since Verizon still charged me a $40 upgrade fee this month and then, because I was forced into an iPhone (which I hate), an $11/month insurance fee in case I were to, say, throw the damned thing against a wall in frustration and rage.
hamlet wrote:I would actually argue that very few folks in a First World nation actually need a smartphone, let alone a new one every year. Why do we need a computing device with us at all times? When did we become so busy that whatever it is we wanted to do couldn't wait till we got back to our computer, which is dramatically more capable than most cell phones nowadays anyway? Why? Because the commercials and trendiness told us we did, and that's what's driving the development of new gadgets.

Storyteller wrote:That is not true at all. A smartphone is so necessary in the modern world that even refugees have a hard time without one.

(For an awful lot of people, including in the First World, their smartphones are their main - and even only - gateway to the Internet and the possibilities it offers).


I have actually yet to meet anybody who has their smartphone (or their tablet actually) as their main or only internet portal. Most everybody I am at all aware of has either a desktop or a laptop and, in terms of money, it's actually less cost effective to pay for a smartphone than it is to just get an inexpensive laptop from Walmart and internet service that includes phone service most times. Granted, that's a small sample size and all, but still, I've not seen that at all here.

Storyteller wrote:P.S. Back to the subject of technological singularity, there are two technologies that have the potential for a spectacular social impact in the short term. One is the research into direct brain-computer interfaces. Direct neural connection between the human brain and computer chips could have a wide-ranging spectrum of effects - it could help the blind see, the deaf hear, and it could ultimately allow the direct exchange of information between human brains - images, thoughts, emotions - via electronic bridges. The other technology is automated translation, like the recently launched Skype Translator. It is pretty amazing to think of the language barrier coming down PERMANENTLY and UNIVERSALLY.


Maybe. I would say AI is more potentially game changing. True AI that is.



And I'm interested in singularities that are not technological. Actual social singularities, rather than technology affecting society.

Re: Social Singularities

Postby Storyteller » Fri Jan 29, 2016 12:25 pm

hamlet wrote:I have actually yet to meet anybody who has their smartphone (or their tablet actually) as their main or only internet portal. Most everybody I am at all aware of has either a desktop or a laptop and, in terms of money, it's actually less cost effective to pay for a smartphone than it is to just get an inexpensive laptop from Walmart and internet service that includes phone service most times. Granted, that's a small sample size and all, but still, I've not seen that at all here.

I have. My very own dad has learned to use his Android tablet for everything, from watching TV to creating Excel files. I know young people who don't bother getting WiFi at home so long as their cellular data allowance is large enough. In fact, when my trusty Packard Bell gives up the ghost one day, I'm not at all sure I'll be replacing it with another Windows machine. At this point, when phones and tablets come with 3-4 GB RAM and octa-core processors, unless one is into serious gaming and needs high graphics performance, it is entirely possible to make do with a phone and/or a tablet.

There are reasons why PC sales have been consistently declining for years, ever since the rise of the smartphone and the tablet. What is a typical home computer used for? Email, social networking, instant messaging/video chatting, media consumption, Word/Excel processing, gaming. A mid-range 2015 phone with a quad-core processor and 2GB RAM can handle all of that except graphics-intensive high-end gaming. With OTG, WiFi and Bluetooth you can hook it up to peripherals including a TV or a monitor. That setup won't be as powerful as a 2015 computer, but it'll be good enough, especially if you must, for financial reasons, limit yourself to one device; a Snapdragon 801 has more processing power than an Intel Pentium 4 from the early 2000s and beats mid-90s SUPERCOMPUTERS. (Let's also not forget that modern-day budget laptops, tablets and phones often run on the same class of Intel Atom processors.)

hamlet wrote:Maybe. I would say AI is more potentially game changing. True AI that is.

True AI... is an incoherent concept. I don't think they'll ever make a true AI in the sense of human-level intelligence. You could replicate every nuance of the human brain and still not get there, because human intelligence does not boil down to reasoning and processing; in fact, more than anything else it is our irrationality and the non-linearity of our thinking that define human intelligence and creativity.

hamlet wrote:And I'm interested in singularities that are not technological. Actual social singularities, rather than technology affecting society.

This is precisely why I think that real-time universal translation is so significant. It is one such "social singularity", tearing down an age-old barrier between human groups.

One famous science fiction writer (I don't remember off the top of my head which one) once observed that the reason The Matrix-style cyberpunk is rarely written in Russia and Eastern Europe is that it's hard to fear the rebellion of machines when one can't find a single working phone booth in one's city.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939
User avatar
Storyteller
Mariner
 
Posts: 7056
Joined: Sat Aug 31, 2002 7:46 am
Top

Re: Social Singularities

Postby Jnyusa » Sat Jan 30, 2016 1:32 pm

Thanks for the interesting replies, everyone. I'm stuck correcting papers but will try to get back here by Monday night.

Re: Social Singularities

Postby Jnyusa » Sun Jan 31, 2016 4:08 pm

Storyteller and Hamlet,
Interesting comments on technology drift. I agree with both of you that markets drive technology for the most part, and that product by its own virtue is unlikely to cause a social singularity. There's Corningware in every US kitchen now because of the space program, but Corningware is not what caused the space program to create a singularity (if you agree with me that it did create one).

The social singularity that I described was, I think, an unanticipated consequence of the space program; and perhaps that element of surprise is one of the things we should incorporate into our definition of a singularity. It can't be foreseen ... not in the sense that AI people usually mean this, i.e. "you can't imagine now what tech discovery we are going to make that will change the whole society," because although the three things I've mentioned did result from technological developments they were not the aim of the technological development, nor did those who developed the technology foresee those particular outcomes. (I think not, anyway.) Storyteller, you said that we won't get that kind of change because people don't want it, and I'm pretty sure I agree with you that we won't get that kind of change purposively, but I don't want to rule out getting it by accident while we are pursuing something else.

LleuLlewGyffes,
That's also an interesting thought, that 9/11 created a singularity by changing our perception of warfare. I think you are right that a sea change has taken place in our perception of warfare, but I also think that 9/11 was a continuation of a process that began earlier. And then, if we look at recent history I would have to say that the splitting of the atom had a more profound effect on our perception of warfare than 9/11 did. So did the several systematic genocides of the 18th through 20th centuries, altering our perception of what was thinkable. I might nominate the atom bomb for singularity status, and suggest that its impact is much like that of the space program, installing in our collective gut a realization that we are vulnerable as a species entire, that we are erasable from the planet.

But I've been thinking along a different direction, actually, because what intrigues me about these events is that their disruptive influence did not create the unbridgeable social chasm that was predicted for them. As startling as they were, they were woven into our collective history. We absorbed them. And this led me to think about unbridgeable social chasms in a different way as well, because I think that we do have such things in our collective history, and they happen when annihilation is thorough enough to prevent one generation from weaving its experience into the next generation's memory. For most of the Amerindian cultures of North and South America it is truly the case that, if not in one generation then in a very small number of generations the ability to communicate intergenerationally was lost, or nearly lost, because the language itself was lost. The people who spoke it were annihilated. Surely for them the coming of Europeans to American shores was a singularity that fulfilled all of its awful promise.

This is ... hmm ... well, one wants to say that radical extinction is the norm on planet earth. 99% of all the species that have ever lived are today extinct, and I suspect that a similar proportion would be true for all cultures and languages that have ever existed among humans. We are ourselves the last surviving subspecies of a multitude of hominid species. But Eohippus did not exactly disappear. It evolved into something new. It didn't vanish but became different enough over time to require a new label. And most of our extinct cultures probably met the same end. Through migration they became different enough from their forebears to require a new label. That is not the same thing as being annihilated abruptly ... as, say, the dinosaurs were, or the thousands of Amerindian languages that once existed on this continent.

For those changes that result from human action, I think we can learn the rules for preservation. That's what interests me, really, the rules for preservation. Not really the same thing as the rules for stasis. We know we can't have stasis, but we can have effective bridges, because we do see them at work some of the time. That's what I'm thinking about.

Re: Social Singularities

Postby Storyteller » Mon Feb 01, 2016 12:35 pm

Jnyusa wrote:Storyteller and Hamlet,
Interesting comments on technology drift. I agree with both of you that markets drive technology for the most part, and that product by its own virtue is unlikely to cause a social singularity. There's Corningware in every US kitchen now because of the space program, but Corningware is not what caused the space program to create a singularity (if you agree with me that it did create one).

If Corningware counts as a "social singularity" then just about any new technology does. It was an incremental improvement, not a revolution, and it did not alter society to the point of being unrecognizable to the generation before Corningware. It is not on par with technologies that ushered in a true social revolution - gunpowder, antibiotics, radio, the railroad, or photography. I believe you're using the term "singularity" far too broadly.

The social singularity that I described was, I think, an unanticipated consequence of the space program; and perhaps that element of surprise is one of the things we should incorporate into our definition of a singularity. It can't be foreseen ... not in the sense that AI people usually mean this, i.e. "you can't imagine now what tech discovery we are going to make that will change the whole society," because although the three things I've mentioned did result from technological developments they were not the aim of the technological development, nor did those who developed the technology foresee those particular outcomes. (I think not, anyway.) Storyteller, you said that we won't get that kind of change because people don't want it, and I'm pretty sure I agree with you that we won't get that kind of change purposively, but I don't want to rule out getting it by accident while we are pursuing something else.


Again, social demand. New technologies that are developed but not adopted due to lack of demand do not create social revolutions. A LOT of revolutionary technologies have had to wait decades or centuries until they found a practical application that changed the face of human society. Just as an example - and mainly because I'm writing from the kitchen - the first induction-based cooking stove patents were issued in the 1900s, the first functional induction cooktop was demonstrated in the 1950s, mass production did not begin until the mid-70s, and widespread adoption of a technology obviously superior to the "normal" electrical cooktop is only really starting now.

Regarding artificial intelligence, one big problem is the view of the human brain as an algorithmic computer that can be reverse-engineered on the hardware and software level. If you think about it, the human brain AND human cognition both work exactly opposite to their technological imitations. The human brain is not composed of standardized parts; it's a collection of highly specialized non-standard parts. Human experience goes from general to particular; when we're infants we gain a general understanding of the world and later hone our skills and knowledge in specific areas where we are faced with challenges. Attempts to create AI software, on the other hand, start off narrowly specialized and then attempt to branch out. The computer designed to beat humans at Jeopardy and the computer that can beat Garry Kasparov at chess are not using the same algorithms and cannot independently learn to do each other's jobs.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939
User avatar
Storyteller
Mariner
 
Posts: 7056
Joined: Sat Aug 31, 2002 7:46 am
Top

Re: Social Singularities

Postby Jnyusa » Mon Feb 01, 2016 9:03 pm

Storyteller wrote:If Corningware counts as a "social singularity" ...


No, I said it does NOT count as a social singularity.

Storyteller wrote:New technologies that are developed but not adopted due to lack of demand do not create social revolutions.


A singularity is not the same thing as a revolution. It is not a change in what we are able to do; rather, it changes the way we understand our relationship to technology and to other humans and (in some cases) to the planet we live on.

I had the opportunity about 20 years ago to attend a conference where Kenneth Boulding was the guest of honor. He made the startling statement that the information revolution was insignificant by comparison with the invention of railroads and telegraphs. We sort of laughed at his perspective then, because Apple had at that point changed so radically what we were able to do. But now in retrospect I think there is truth in what he said, if we are talking about do-ability alone. Farther-faster is what railroads and telegraphs enabled us to do, and that's what the internet enables us to do.

But you have to remember that at the beginning of the information age you had to learn Basic or Fortran or one of the other computer languages in order to use a computer. In the early 1980s we could already communicate with faraway computers - send them our data and wait for them to process it - but we had to learn a foreign language to do that. The thing that was singular about the Apple computer was that now everyone could do what only specialized people had been able to do before. By comparison one never needed special skills or training to talk on a telephone or ride on a railway. As soon as they came into existence they could be used by anyone. Creating them took special skill but using them did not. But with the computer age, this technology that had been building in power and scope for several decades but was available to the very few was abruptly turned loose on the whole population. It would be as if you woke up one morning and there were flying cars everywhere. Everyone's relation to the world changed simultaneously. And it changed the individual's power in the world ... that's a significant aspect too, I think. Well ... anyway.

Storyteller wrote:If you think about it, the human brain AND human cognition both work exactly opposite to their technological imitations.


Yes, I think that's exactly right. That's my take on it too.

Re: Social Singularities

Postby hamlet » Tue Feb 02, 2016 9:13 am

Storyteller wrote:Regarding artificial intelligence, one big problem is the view of the human brain as an algorithmic computer that can be reverse-engineered on the hardware and software level. If you think about it, the human brain AND human cognition both work exactly opposite to their technological imitations. The human brain is not composed of standardized parts; it's a collection of highly specialized non-standard parts. Human experience goes from general to particular; when we're infants we gain a general understanding of the world and later hone our skills and knowledge in specific areas where we are faced with challenges. Attempts to create AI software, on the other hand, start off narrowly specialized and then attempt to branch out. The computer designed to beat humans at Jeopardy and the computer that can beat Garry Kasparov at chess are not using the same algorithms and cannot independently learn to do each other's jobs.


Trying not to dive too far down this particular rabbit hole, but . . .

AI is not, should not, and cannot be thought of as replicating a human like "mind" on a computer. That's simply not plausible short of transferring the contents of a human mind into a computer, and even then, we have no idea what would happen.

Really, though, the idea of creating a computer with a set of algorithms and programming complex enough to learn, adapt, and, at some point, act upon its own initiative and in its own interests is not that far-fetched. It would be fairly alien to us even though, conceivably, we'd create it, but that's what AI would end up being.

I would argue that my computer here at work is that since it often does things that I do not tell it to do, but that's probably just Windows . . .


As for the "information age" being nothing compared to railroads, rather I think it was incremental and inevitable upon the creation of faster reliable transport. Eventually, the idea of "NOW" changes to more closely converge with our technology. The speed in which we can get something - whether it be a bit of information or a package we've ordered from Amazon or lunch - is always playing catch-up with our technology, but it has certainly continually changed how our society works. Trains made travel so much easier that people no longer spent so much of their lives within 10 miles of their own homes and made getting goods from across the country child's play compared with the ships that used to be required. Then, we started to have ways of sending information (more and more of it really) virtually instantaneously and now the idea of having to wait even to get home to find out some bit of trivia is alien to certain young folks nowadays. It's actually changed the way we think and retain information. it's changed the manner in which we read.

It doesn't quite fit the definition of a singularity simply because of the time scale, but I think that the idea of how fast one can get something is actually all built upon previous advancements. These things were inevitable, really. We've yet to see something that would change the world virtually overnight. After all, my grandfather used a cellular phone, or at least understood them well enough not to be mystified by them or what they could do, even if he wasn't particularly good at or interested in working them.

Singularities are, by definition, difficult or impossible to conceive of beforehand, so we're reduced to speaking in very broad terms here. We might actually get a bit more traction talking about past singularities, such as the creation of a written language and, then, the creation of printing presses. Etc.

Re: Social Singularities

Postby Jnyusa » Thu Feb 04, 2016 7:24 am

Hamlet wrote:Eventually, the idea of "NOW" changes to more closely converge with our technology.


That's a good point, Hamlet. The human relationship to Time is one of our culture-defining perspectives, I think. Anything that changes that abruptly would probably create a singularity. (Maybe that's why people are so interested in time travel.)

Re: Social Singularities

Postby hamlet » Thu Feb 04, 2016 8:56 am

Jnyusa wrote:
Hamlet wrote:Eventually, the idea of "NOW" changes to more closely converge with our technology.


That's a good point, Hamlet. The human relationship to Time is one of our culture-defining perspectives, I think. Anything that changes that abruptly would probably create a singularity. (Maybe that's why people are so interested in time travel.)


That and the Christopher Lloyd movies maybe.

Humanity's relationship to time is a really curious one. I mean that. It's at once wildly artificial and strongly grounded in the physical facts of our reality. The calendar we use today is a ridiculous kludge of fixes to a semi-broken system that, in the end, is about the best we can manage at the moment short of tearing it down and starting again from the ground up.

You want to really get messed up, think about what happens when/if humans start to colonize space. Time takes on whole new dimensions.

Also, for gits and shiggles, look up "Metric Time." It's truly screwed up.

Re: Social Singularities

Postby Jnyusa » Sat Feb 13, 2016 3:48 pm

May I take this conversation onto a tangent?

This is not something that I discuss with my students but it is something that I think about when I think about AI.

Asimov's three laws of robotic meta-ethical behavior (a toy sketch of their priority ordering follows the list):
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
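Read as a specification, the laws form a strict priority ordering: each law binds only insofar as it does not conflict with the laws above it. Here is a toy sketch of my own - nothing Asimov ever wrote, and the action descriptions are purely hypothetical - of the laws as a filter over a robot's available actions, and of what an irresolvable dilemma does to such a filter:

Code: Select all
def lawful_actions(options):
    # First Law: discard anything that injures a human being.
    survivors = [o for o in options if not o["harms_human"]]
    # Second Law: among the remainder, prefer obedience to human orders.
    obedient = [o for o in survivors if not o["disobeys_order"]]
    if obedient:
        survivors = obedient
    # Third Law: among the remainder, prefer self-preservation.
    safe = [o for o in survivors if not o["endangers_self"]]
    return safe if safe else survivors

# A dilemma in which every available option harms someone:
dilemma = [
    {"name": "act", "harms_human": True, "disobeys_order": False, "endangers_self": True},
    {"name": "do nothing", "harms_human": True, "disobeys_order": False, "endangers_self": False},
]
print(lawful_actions(dilemma))  # [] - no lawful action remains

The empty result - no lawful action at all - is worth keeping in mind.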

The term 'singularitarianism' connotes a positive attitude toward the singularity that would ostensibly be created by the AI breakthrough. But I am wondering whether it is even remotely possible for such a breakthrough to have a positive impact on humankind. Asimov was an optimist, he really was. He championed MENSA believing that a group of geniuses could achieve world peace ... that, as he saw it, was what such a group was for ... and he took for granted the idea that geniuses would want to achieve world peace. That they wouldn't be more attracted by the option of enriching themselves in the arms trade, or just indifferent to the fate of everyone else. I don't think MENSA has ever had a task force for achieving world peace. Their mission diverged very quickly into selling puzzle books and hosting weekend bashes for that portion of the 0.4% who are underachievers.

So how optimistic is Asimov's first law? Is it the MENSA of belief in an AI singularity?

Have we ever generated a technology about which we said, this must be built with unbreachable safeguards so that it will never be used to harm a human being? Hasn't the reverse been true? Hasn't the first question asked of every new technology, every new idea in fact, been: How can we use this to harm our enemies?

I am wondering whether we should not dread a fully-realized AI the way we would have dreaded the smashing of the atom if we'd known it was coming, because the outcome is necessarily a new and unimaginable phase in the war of survival among humans.

Now, I use the word 'dread' provisionally, because as I explained in the first post I don't think it is possible to achieve this with the way we currently conceptualize information. But it is conceivable that we will learn to conceptualize information differently, and though the depreciation problem will always be there, it is possible to imagine the world existing for a short period of time with physical capital (machines) being the only thing left that approximates human consciousness.

So allow me to ask what is the gut feeling of other posters about the ultimate outcome of a fully-realized AI. Are you optimistic about its impact on humans?

Re: Social Singularities

Postby Storyteller » Sun Feb 14, 2016 11:02 am

The First Law is unrealistic, as was recently demonstrated by the ethical dilemma encountered by the developers of driverless cars - in the event of an unavoidable crash, should the car's AI prioritize the survival of its own car's occupant at all costs, or should it make a calculation that takes into account the danger to the other car's passengers - which could theoretically lead it to sacrifice one person to save others?

In Asimov's later works, specifically the various sequels and prequels to the Foundation series, he actually acknowledged that the First Law is self-contradictory, and a kind of "reformation" occurs. It is revealed that the robots have evolved different interpretations of the First Law, and some have generalized it to establish the Zeroth Law, which superseded the First - a robot may not harm humanity or, through inaction, allow it to come to harm. It followed that they could harm individuals to protect the society as a whole. At which point they realized that the only way to implement the Laws of Robotics in full was to take control of humanity, either as a benevolent dictatorship or through subtle aggregate manipulation. One robot is shown further extrapolating a Minus One Law - a robot may not harm sentience or, by inaction, allow it to come to harm.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939
User avatar
Storyteller
Mariner
 
Posts: 7056
Joined: Sat Aug 31, 2002 7:46 am
Top

Re: Social Singularities

Postby Jnyusa » Mon Feb 15, 2016 9:07 pm

Yes, and in the last book by Asimov that I read, which was part of some later series about robots, a robot actually commits suicide ... well, his circuits shut down ... because he can't resolve two competing human claims.

Just more evidence that humans occupy the throne of paradox in nature. :|

Re: Social Singularities

Postby hamlet » Tue Feb 16, 2016 7:44 am

Jnyusa wrote:Yes, and in the last book by Asimov that I read, which was part of some later series about robots, a robot actually commits suicide ... well, his circuits shut down ... because he can't resolve two competing human claims.

Just more evidence that humans occupy the throne of paradox in nature. :|


Asimov is kind of a double-edged sword in that regard. As I recall, his opinion was that robots were just like any other tool humanity had ever developed: safe and useful as far as they go, but dangerous when applied/used incorrectly, both to people and to the tool itself. One of his big points, I think, was that a set of absolutist, simplistic laws, no matter how complete we think they are, can't cover all situations and can cause very dangerous conflicts. Another was that the greatest tool we have is our own intellect/ability, and that when we give over that tool to modern conveniences, we are damaging ourselves. Those were major points throughout many of his stories and series.

And, one might argue, he was rather prescient. One need only look at how many people can do so very much on their phone/tablet/computer, but the minute you ask them to do it without the aid of the computer, they're utterly lost and helpless. You ever want good comedy, go to the local library one day when the computer system is not working and watch all the helpless folks, including half the librarians, who can't find any books because nobody actually knows how to use the card catalog system anymore. I recently had this honor and it was enough to make me laugh to the point of nearly hurting myself.

Re: Social Singularities

Postby Jnyusa » Wed Feb 17, 2016 4:17 pm

hamlet wrote: Asimov is kind of a double-edged sword in that regard. As I recall, his opinion was that robots were just like any other tool humanity had ever developed: safe and useful as far as they go, but dangerous when applied/used incorrectly, both to people and to the tool itself. One of his big points, I think, was that a set of absolutist, simplistic laws, no matter how complete we think they are, can't cover all situations and can cause very dangerous conflicts. Another was that the greatest tool we have is our own intellect/ability, and that when we give over that tool to modern conveniences, we are damaging ourselves. Those were major points throughout many of his stories and series.


Yes, my sense of his written journey is that he was talking himself through the implications of tech advance in that direction, toward robotics and AI, and that his books reflect the evolution of his own thinking about the issue. He was certainly one of the living treasures of the 20th century.

hamlet wrote:One need only look at how many people can do so very much on their phone/tablet/computer, but the minute you ask them to do it without the aid of the computer, they're utterly lost and helpless.


Well, and that's true for every technology that has been superseded. How many people today know how to churn butter or build a cold cellar or fletch an arrow? Within the social sciences there should probably be a discipline that devotes itself to discerning which skills must not be lost, and I don't know of anyone who is looking at that empirically. (There are plenty of alarmists but no theorists to hold hands with them!)

Re: Social Singularities

Postby Storyteller » Wed Feb 17, 2016 11:57 pm

Jnyusa wrote:Well, and that's true for every technology that has been superseded. How many people today know how to churn butter or build a cold cellar or fletch an arrow? Within the social sciences there should probably be a discipline that devotes itself to discerning which skills must not be lost, and I don't know of anyone who is looking at that empirically. (There are plenty of alarmists but no theorists to hold hands with them!)

One word: writing.

Writing by hand is about to become one of the disappearing "superseded" skills of the increasingly paperless digital world.

I was recently surprised by the realization of just how rarely I use writing - as opposed to typing - compared to the previous 5-10 years of my life. My sticky notes are not hanging on my fridge, they're in Google Keep. My phonebook is in my phone contacts. I send over 1000 emails per day but have not sent a single handwritten letter in years. The only things I still write by hand are credit card signatures and addresses on envelopes. Sometimes it takes me a couple of moments to dig into my memory and remember how to hand-write in Russian; it's a matter of time and a couple more small steps of technology distribution before Hebrew follows.

And I'm discovering lately that using a calculator is becoming a forgotten skill, too.

I think one "social singularity" is happening as we speak, by the way and that is technology vastly outpacing people's ability to adapt to it. Usability of new tech is about to become an escalating problem because technology is getting ever-smarter and people are not.

I could give plenty of examples from my work experience, because the vast majority of the work of our customer service department is due to people's technological inadequacy. My guesstimate is that approximately two-thirds of the call volume to our call center could be eliminated if various computer and smartphone owners knew the bare basics of operating them. And it's not just, or mainly, the older generation; kids born with iPhones in hand are just as helpless. I once spent 15 minutes on a call with a 20-something girl, teaching her to open a PDF file attached to an email. I talk every single day to people who are willing to suffer waiting on the line with our advertising song playing over and over, just to ask us to send their airline ticket to "their other email," because the one they gave us is connected to their phone but not to their home computer (in other words, people who know neither how to add a second email account to their computer nor how to forward an email with an attachment to another address). We've had people miss their flight because they didn't open their internet browser full-screen and got their flight time wrong. We've had people beg us to send their tickets by WhatsApp or by SMS because they never knew they could connect their email account to their iPhone.

And it really isn't a matter of age. My dad tracks his sugar levels with Excel graphs on his Android tablet, which has replaced his TV, his alarm clock, his laptop and pretty much everything else. He literally knows every corner of Android. And he is a 65-year-old man who didn't have a personal computer until the age of 50. But the 25-year-old co-worker sitting next to me had to take her phone to the Cellcom lab just to mass-delete videos and photos (the phone ran out of memory, with 30MB of storage left out of 16GB). She had no idea that she had Dropbox pre-installed and set up on her phone, or that she could hook it up to WiFi. My boss, who is an ace at tech work on a computer, doesn't know how to use email on his iPhone.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939

Re: Social Singularities

Postby hamlet » Thu Feb 18, 2016 7:38 am

Jnyusa wrote:
hamlet wrote:
One need only look at how many people can do so very much on their phone/tablet/computer, but the minute you ask them to do it without the aid of the computer, they're utterly lost and helpless.


Well, and that's true for every technology that has been superseded. How many people today know how to churn butter or build a cold cellar or fletch an arrow? Within the social sciences there should probably be a discipline that devotes itself to discerning which skills must not be lost, and I don't know of anyone who is looking at that empirically. (There are plenty of alarmists but no theorists to hold hands with them!)


Well, first off: I do know how to churn butter and dig a root cellar, and have done both. The only reason I can't fletch an arrow is simply a lack of motor skills - mine always end up looking like horribly mutated chicken legs for some reason.

Second, I wasn't referring to skills that were superseded by technological advance. After all, root/cold cellars are just not needed in modern Western society and, in fact, can be a drawback in some ways. What I am talking about is not knowing what it is you're actually doing and handing the entire task over to a machine to do it for you. My example of the library folks is my best one. I'm not saying that everybody should know inherently how to read the card system, but being utterly clueless about the cataloguing system used in a modern library, and upon which the computer relies, is . . . kind of dangerous on some level. Knowing that system is not a skill that's been surpassed by technology. The system is still there, still in use, still entirely current. It's just that it's been shuffled quietly into the background because it's easier to expect folks to look something up on their computer/phone/tablet and find directions that way. The same goes for basic mathematical skills. Most folks I know (myself included, to a certain degree) couldn't perform even basic math without the aid of a calculator or an Excel spreadsheet: I'm talking about figuring out an appropriate tip at a restaurant, not algebra or trigonometry. Hell, my wife recently got a lot of strange looks (virtually speaking) from family members because she hand-wrote all the thank-you notes for our wedding and sent them out the old-fashioned way (we were actually unaware that there was any other way). Most of the family asked why we didn't just send an email or a text instead! At least three of them complained that the "funny writing" was hard to read. "Funny writing" = cursive script.

Essentially, we're handing away our actual skill and knowledge to computers and trusting that they will simply do it for us. We're infantilizing ourselves on some level, and when that electronic crutch is kicked out from under us, it's going to be a very rude surprise. This is not a Luddite tract, just a comment that we're changing how we do things and that there are inherent dangers that come along with the benefits.


Storyteller: I agree with you to a large extent, though perhaps I'm old-fashioned, since my office is still utterly plastered in sticky notes and even my home grocery lists are always handwritten. I think somewhere along the line I jumped off the technology train and said "that's it, I ain't goin' no further." I still hate my damned Kindle. Vaporware.

Re: Social Singularities

Postby Jnyusa » Thu Feb 18, 2016 10:55 pm

Storyteller wrote:Writing by hand is about to become one of the disappearing "superseded" skills of the increasingly paperless digital world. <snip> And I'm discovering lately that using a calculator is becoming a forgotten skill, too.


But I think there are some subtle distinctions here that need to be recalled. The important underlying skill involved in writing is not the fine motor skills that allow you to have nice penmanship but the mental capability to express sounds and ideas by means of abstract symbols.

As long as a language continues to be known and examples of the writing system used to express that language continue to be available for inspection, the penmanship could be relearned with little effort. Those written documents that are undecipherable (and there are only a few of them) are so because no one today knows what language was being used and the exemplars are not numerous enough or lengthy enough for us to decode them. So there are instances where the ability to 'write' in that broad sense could be lost, but the conditions for that happening are pretty narrow.

Calculators are just one form of machine-aided arithmetic and not a necessary form in themselves. The more important question is whether we would lose the ability to enumerate. And, like language, I think that is part of our hard-wiring and could not be lost unless we ourselves were extinct.

But you still have this question of bridging the gap between social states ... which can be seen pretty clearly in the case of lost languages. Many peoples can disappear and yet speak to us out of the dead past, but only if knowledge of their language is not completely lost. There do not seem to have been any peoples during human history who did not employ a symbol system of some kind, so in theory there is no inevitable forgetting, but in practice there are circumstances that can destroy the necessary bridges.

Storyteller wrote:I think one "social singularity" is happening as we speak, by the way, and that is technology vastly outpacing people's ability to adapt to it. Usability of new tech is about to become an escalating problem because technology is getting ever smarter and people are not. <snip> ... the vast majority of the work of our customer service department is due to people's technological inadequacy.


Well, yes, I think the pace of technological advance versus the proportion of the population that can keep up with its effective use is a gap that might increase, even for the common products that you mention, Storyteller. However, the other subtle distinction that I think is important here is that you go on in your post to describe your own job as one that bridges that gap. So the fact that society recognizes the gap and provides for a bridge means, again, that in theory there is no inevitable loss of control for humans in general; there are only pockets of discontinuity for particular people. The pockets may get bigger, of course, but they might also get smaller, depending on how determined the producers are to make everyone feel comfortable and satisfied and in control of their product.

The gap between tech advance and broad human understanding is much wider in highly technical fields, and it becomes concerning when it impinges on our social standards or our ethics. It's imaginable that those who understand and control advanced technologies could 'run away' from the rest of us and create things that we don't want but don't understand well enough to get rid of.

hamlet wrote: Second, I wasn't referring to skills that were superseded by technological advance. After all, root/cold cellars are just not needed in modern Western society and, in fact, can be a drawback in some ways.


Mmm, hamlet, I think this misses the point that I had tried to make, in the same way that Storyteller's example sort of misses the point. The reason that we no longer need cold cellars is that we have refrigerators. But we still need to keep food cold in the summer. We need to put meat and other foods on the table. We need to keep warm in the winter. We need to dispose of our waste in a non-infectious way, etc. The question I was asking is whether we would be able to provide for those needs under any and all circumstances. What older methods are necessary to retain, in other words, because the task they accomplished will always be relevant but our current technology might not always be available?

I wasn't proposing an exhaustive list but simply observing that there is currently no empirical approach to making this determination (that I know of). And it wouldn't begin with a list of technologies; it would begin with a list of needs, and then compare those needs to resources that would be available 'always' in order to determine which technologies are the most potentially useful to preserve even if they are out of date. I was simply observing that no one other than the paranoid survivalists seems to be asking these questions, but they are not unimportant questions and they probably deserve to be addressed empirically.

To go back to Storyteller's example of writing, if humans mutated in such a way that language was no longer one of our capabilities, that would create an obvious unbridgeable singularity. Or, going in the other direction, if humans mutated to have ESP so that future generations communicated by means of thought alone, that would also be a singularity because those of us living now would not be able to communicate with them. No bridge. But these are extreme cases and don't involve technology in any event. These are not the things we are worried about, and there's nothing we could do to cause or prevent them anyway.

But we have other proposed singularities that arise from our own technological development and that impact our relationship to other humans, to our common or intersecting histories, to our own social past, to our planet, and so on, and sometimes these result in passable bridges and sometimes they do not. What things need to be preserved in order to survive such singularities? It's sort of like asking: What has to be in the library? As opposed to asking what kind of card catalog we would use.

Re: Social Singularities

Postby hamlet » Fri Feb 19, 2016 6:48 am

JN: You're missing my point about the libraries. It's not that we've suddenly gotten past the cataloging system; it's that it's the system we still use all but universally . . . and without the aid of computers to hold our hands, so many of us are now helpless in the face of it, to the point of utter dysfunction. It's not a tool we've gotten past, like cold cellars; it's just one we still use and don't actually fathom any longer. The computer adaptation of it has absolved us, in most circumstances, of ever having to think about it. We use it without understanding its underlying logic.

It's a bit like knowing how to add using a calculator, but not being able to perform even the most basic arithmetic without one.

Re: Social Singularities

Postby Jnyusa » Fri Feb 19, 2016 2:48 pm

Hamlet, I do get what you are saying. (I think so, anyway). I am asking a different question about the same evolution.

It might be unimportant in the big scheme of things that people know how to use a card catalog system. I don't know if it is or if it isn't, not without some investigation. But I can be certain that if the library disappears, knowing how to use the card catalog will not have any importance whatsoever.

We can think of the human knowledge base as a library of all sorts of things, some more important than others. Is it more important to know how to use a card catalog or how to do basic math? My guess is that basic math is more important, so if there were a capability that I wanted to preserve, and preservation was not cost-free, I would put my resources into our math capability before I would put resources into retaining knowledge about card catalogs. You can disagree, of course, and say that card catalogs are more important. But that's the discussion that needs to take place, I think. That's the investigation I was trying to describe. Before we care about the loss of a capability, and commit resources to avoiding that loss, we need to have some idea of the relative importance of the capability.

Re: Social Singularities

Postby Storyteller » Sat Feb 20, 2016 12:27 pm

Jnyusa wrote:
Storyteller wrote:Writing by hand is about to become one of the disappearing "superseded" skills of the increasingly paperless digital world. <snip> And I'm discovering lately that using a calculator is becoming a forgotten skill, too.


But I think there are some subtle distinctions here that need to be recalled. The important underlying skill involved in writing is not the fine motor skills that allow you to have nice penmanship but the mental capability to express sounds and ideas by means of abstract symbols.

As long as a language continues to be known and examples of the writing system used to express that language continue to be available for inspection, the penmanship could be relearned with little effort. Those written documents that are undecipherable (and there are only a few of them) are so because no one today knows what language was being used and the exemplars are not numerous enough or lengthy enough for us to decode them. So there are instances where the ability to 'write' in that broad sense could be lost, but the conditions for that happening are pretty narrow.

I wouldn't be so sure.

I believe there are studies to back this up: handwriting is more than just expressing ideas by means of abstract symbols. There is brainwork involved that isn't involved in typing. Handwriting impacts idea composition, memory and cognition in general. There's something about expressing ideas in symbols made of sequential strokes, as opposed to thumbing buttons, that makes you smarter.

Enumerating... One of the mathematical tasks that frequently crops up in the work of an Israeli travel agent is calculating the cancellation fee under the Consumer Protection law. The fee is either NIS 100 per person per product or 5% of the total, whichever is lower. It's a simple and straightforward middle-school math exercise: 100 is 5% of 2000, so for any product with a value above NIS 2000 per person the fee is NIS 100; if the value is lower than NIS 2000, the fee is 5%. It takes me a single glance at a reservation to know which fee to charge. Most people in my department struggle with it and take the long route by calculating 5% and comparing. Some can't do calculations as simple as 1735 - 200 = ? How these people manage their finances I have no idea.
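The whole rule, ironically, fits in a couple of lines of code; here is a minimal sketch in Python of the fee as described above (the function name is purely illustrative):

```python
# Cancellation fee as described above: NIS 100 per person per product,
# or 5% of the product's value, whichever is lower.
def cancellation_fee_per_person(product_value_nis: float) -> float:
    return min(100.0, 0.05 * product_value_nis)

# The break-even point is NIS 2000, since 100 is 5% of 2000:
print(cancellation_fee_per_person(3000))  # 100.0 -- above 2000, the flat fee is lower
print(cancellation_fee_per_person(1500))  # 75.0  -- below 2000, 5% is lower
```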


Jnyusa wrote:Well, yes, I think the pace of technological advance versus the proportion of the population that can keep up with its effective use is a gap that might increase, even for the common products that you mention, Storyteller. However, the other subtle distinction that I think is important here is that you go on in your post to describe your own job as one that bridges that gap. So the fact that society recognizes the gap and provides for a bridge means, again, that in theory there is no inevitable loss of control for humans in general; there are only pockets of discontinuity for particular people. The pockets may get bigger, of course, but they might also get smaller, depending on how determined the producers are to make everyone feel comfortable and satisfied and in control of their product.

I think you underestimate the pace at which the gap is growing.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939

Re: Social Singularities

Postby hamlet » Mon Feb 22, 2016 11:17 am

Jnyusa wrote:Hamlet, I do get what you are saying. (I think so, anyway). I am asking a different question about the same evolution.

It might be unimportant in the big scheme of things that people know how to use a card catalog system. I don't know if it is or if it isn't, not without some investigation. But I can be certain that if the library disappears, knowing how to use the card catalog will not have any importance whatsoever.

We can think of the human knowledge base as a library of all sorts of things, some more important than others. Is it more important to know how to use a card catalog or how to do basic math? My guess is that basic math is more important, so if there were a capability that I wanted to preserve, and preservation was not cost-free, I would put my resources into our math capability before I would put resources into retaining knowledge about card catalogs. You can disagree, of course, and say that card catalogs are more important. But that's the discussion that needs to take place, I think. That's the investigation I was trying to describe. Before we care about the loss of a capability, and commit resources to avoiding that loss, we need to have some idea of the relative importance of the capability.


The thing isn't necessarily which is more valuable, basic math or the card catalog. Yes, given the choice between the two, I'd rather work to preserve basic arithmetic skills (which, really, we are not doing).

However, it's more along the lines of actually making use of a system without understanding how it works in the first place, in even the slightest way. A bit like how a store can and will come to a screeching halt without power to run the cash registers, because nobody is capable of doing the basic math required to complete a transaction (ignoring for just a moment issues of stock tracking, etc.). Or trying to get your desktop computer to work by smacking the monitor (a thing I'm sure many of us have been guilty of at some point, even knowing for a fact that the monitor does only one thing and isn't responsible for whatever's actually wrong with the computer).

I would also talk about vanishing skills a little more broadly. Like cooking. No, seriously. My wife doesn't know how to cook. At all. In fact, I've forbidden her to ever touch my kitchen knives for fear that she'll sever a finger or open a gash in herself that would be fatal. But, seriously, how in the world do you graduate from college without having picked up at least some very rudimentary cooking skills, to the point where you can heat up a can of chicken soup without setting something on fire? Isn't that something you simply learn by observing a parent or guardian in the kitchen? Or by simple necessity? But I'm finding out through various conversations that it's actually a growing phenomenon: folks who are otherwise highly capable professionals have never EVER cooked in their lives and wouldn't even know how to begin. And it's something that's entirely supported by a modern, especially urban, environment. If you can afford it, you never have to cook, simply because takeout is ubiquitous, and you never even have to glance at your kitchen. Hell, I know a certain New Yorker who has a state-of-the-art kitchen equipped with "to die for" tools . . . that he never uses; all of it is simply for show. :?

The same, I suppose, goes for sewing - another one of those basic life skills from the long era when clothes were too expensive to just replace out of hand. Basic home repair tasks - of which I am rather guilty myself. Being able to maintain your own vehicle - which I will equally attribute to the vastly increased complexity of such things. Etc.

This is a long, roundabout way of saying a couple of things.

1) That the nature of our current society means that a lot of what we used to call essential skills are no longer essential, or even common, or necessary at all.

2) That the changes involved have as much or more to do with basic social density and culture than they do with actual technology. Putting several million people into a space only 10 or so miles across - and even less in some dimensions - suddenly and drastically changes basic life skill requirements.

Re: Social Singularities

Postby Storyteller » Mon Feb 22, 2016 1:14 pm

hamlet wrote:However, it's more along the lines of actually making use of a system without understanding how it works in the first place, in even the slightest way.

Which is increasingly a problem everywhere. And the disappearance of seemingly irrelevant skills like using library catalogs leaves us ill-equipped to deal with more complex systems, for which library catalogs make good analogies (it's much easier to understand file management on a computer when you are familiar with a file management system for actual paper files; visualization works like nothing else).

hamlet wrote:I would also talk about vanishing skills a little more broadly. Like cooking. No, seriously. My wife doesn't know how to cook. At all. <snip> But I'm finding out through various conversations that it's actually a growing phenomenon: folks who are otherwise highly capable professionals have never EVER cooked in their lives and wouldn't even know how to begin.

That has to do with the decline of the family as the basic social unit. You don't learn to cook just by observing others cook; your parents take the time to teach you to cook, and pass family recipes on to you. And that's not a given in modern families accustomed to takeout, restaurant meals and microwaved food. My mom can only make a handful of my grandmother's signature recipes; she was extremely proud of herself recently when she pulled off the amazing, airy-textured cake of my childhood that is known in our family as "pooh" (don't ask). I have a reputation in my family as a great cook simply because I have the patience to work with recipes that demand pre-cooking some ingredients; it boggles my mom's mind that cooking can be done in stages.

hamlet wrote:The same, I suppose, goes for sewing - another one of those basic life skills from the long era when clothes were too expensive to just replace out of hand.

Modern materials and fabrics lend themselves more and more poorly to improvised repairs.
"...Their aim in war with Germany is nothing more, nothing less than extermination of Hitlerism... There is absolutely no justification for this kind of war. The ideology of Hitlerism, just like any other ideological system, can be accepted or rejected, this is a matter of political views. But everyone grasps, that an ideology can not be exterminated by force, must not be finished off with a war.” - Vyacheslav Molotov, ""On the Foreign Policy of the Soviet Union", 31 October 1939

Re: Social Singularities

Postby hamlet » Mon Feb 22, 2016 1:37 pm

Storyteller wrote:That has to do with the decline of the family as the basic social unit. You don't learn to cook just by observing others cook; your parents take the time to teach you to cook, and pass family recipes on to you. <snip>



I was never actually taught. I watched my mother cook for many years. Then, one day, she pointed at me and said "I'm tired, you cook supper," and ever since, I've just made it happen. Eventually I got better at some aspects of cooking than she is, though she can still cook me under the table in some ways. My technical skill probably ranks very low, but the willingness and ability to just DO IT, and to try stuff, makes up for a lot in daily cooking.

That, and watching Alton Brown. Dude taught me a lot.

Re: Social Singularities

Postby Faramond » Thu Mar 03, 2016 11:08 am

This is an interesting subject, and I hope to have time to come back to it later, but I'm afraid now I'm going to be THAT person who just comes in to correct something.

Well, to me it's a pretty important correction to make, because the error gives a completely wrong view of who Isaac Asimov was.

Jnyusa wrote:Asimov was an optimist, he really was. He founded MENSA believing that a group of geniuses could achieve world peace ... that was in fact his purpose in creating MENSA ... and he took for granted the idea that geniuses would want to achieve world peace. That they wouldn't be more attracted by the option of enriching themselves in the arms trade, or just indifferent to the fate of everyone else. I don't think MENSA has ever had a task force for achieving world peace. Their mission diverged very quickly into selling puzzle books and hosting weekend bashes for that portion of the 0.4% who are underachievers.


Mensa was not founded by Asimov! I'm sort of baffled as to why you think this, or where you're getting this stuff about world peace.

wikipedia wrote:Roland Berrill, an Australian barrister, and Dr. Lancelot Ware, a British scientist and lawyer, founded Mensa at Lincoln College, in Oxford, England, in 1946.


Just for the record, Mensa is not an acronym, and I don't think they usually write it in all caps. Mensa is the Latin word for 'table', apparently.

Asimov was a member of Mensa, that is true. Here is a quote I found in cyberspace about it:

...[But] I took the test, scored high, and became a member of Mensa.
It was not on the whole, a happy experience. I met a number of wonderful Mensans, but there were other Mensans who were brain-proud and aggressive about their IQs, who, one got the impression, would like, on being introduced, to be able to say, ’I’m Joe Doakes, and my IQ is 172,’ or, perhaps, have the figure tattooed on their forehead. They were, as I had been in my youth, forcing their intelligence on unwilling victims. In general, too, they felt underappreciated and undersuccessful. As a result, they had soured on the Universe and tended to be disagreeable.

What’s more, they were constantly jousting with each other, testing their intelligence on each other, and that sort of thing becomes wearing after awhile.

Furthermore, I became uncomfortably aware that Mensans, however high their paper IQ might be, were likely to be as irrational as anyone else. Many of them believed themselves to be part of a ‘superior’ group that ought to rule the world, and despised non-Mensans as inferiors. Naturally, they tended to be right-wing conservatives, and I generally feel terribly out of sympathy with such views.

Worse yet, there were groups among them, I found out eventually, who accepted astrology and many other pseudoscientific beliefs, and who formed ‘SIGs’ (‘special interest groups’) devoted to different varieties of intellectual trash. Where was the credit of being associated with that sort of thing, even tangentially?

... I stayed on in Mensa for years, getting more and more tired of it. ... Eventually, after both Marvin and Margot [two of the New York Mensans he acknowledged as "delightful and intelligent"] had died, I did resign.

- Isaac Asimov, I. ASIMOV: A MEMOIR

Re: Social Singularities

Postby Faramond » Thu Mar 03, 2016 3:59 pm

One more post of being argumentative, and then I'll try something more cooperative.

Jnyusa wrote:And you can't define a singularity tautologically. You can't just say, well, if it didn't create an unbridgeable chasm then it wasn't a singularity.


But that is exactly the definition of a singularity! No information can get across. None of these things being talked about are true singularities. Look, singularity comes from physics, right? A black hole? No information survives that. The mass (and charge and spin) survives, but that's it. The way it was constructed is lost.

To me the essential part of any kind of singularity is that what is on the other side is essentially unknowable.

Jnyusa wrote:Machines, whatever their self-learning capability, are made out of non-living resources. They don't regenerate naturally. They depreciate.


Living organic creatures depreciate as well. How do they get around this problem? Replication. The same might well hold for AI creatures in the future. What if they learn to recreate themselves out of raw materials?

Jnyusa wrote:And the great, superlative advantage of mutation and natural selection is that they are not structured as binary systems. They rely, as near as we can figure, on randomness. And randomness is infinitely richer than binary choice.


But organic mutation and natural selection are built upon the foundation of DNA, which is, essentially, digital. You can write out any DNA sequence as a binary string easily, since there are only four nucleotides - two bits each.

The randomness you're talking about comes in, I suppose, through imperfections in the replication of DNA. It's random which part of the DNA isn't copied properly. The new copy may lead to junk results most of the time, but it might, in some circumstances, lead to a beneficial change. You could certainly set up an AI system where the behavior of the *creature* is governed by a binary string (like its DNA) and then, when it replicated, random bits were flipped to simulate mutations.
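A minimal sketch of that setup in Python (the genome length, mutation rate, and toy fitness function are all arbitrary illustrations, not anyone's actual proposal):

```python
import random

GENOME_LENGTH = 32    # bits per "creature" (arbitrary)
MUTATION_RATE = 0.02  # chance that each bit flips during replication (arbitrary)
POPULATION_SIZE = 100

def encode_dna(seq):
    """Two bits per nucleotide, since there are only four of them."""
    bits = {'A': '00', 'C': '01', 'G': '10', 'T': '11'}
    return ''.join(bits[n] for n in seq)

def replicate(genome):
    """Copy a genome, flipping random bits to simulate mutation."""
    return ''.join(b if random.random() > MUTATION_RATE else str(1 - int(b))
                   for b in genome)

def fitness(genome):
    """Toy objective: the count of 1-bits. Stands in for any measure of success."""
    return genome.count('1')

# Evolve: every generation, each genome replicates (with mutation) and
# selection keeps only the fittest half of parents plus offspring.
population = [''.join(random.choice('01') for _ in range(GENOME_LENGTH))
              for _ in range(POPULATION_SIZE)]
for generation in range(200):
    offspring = [replicate(g) for g in population]
    population = sorted(population + offspring, key=fitness,
                        reverse=True)[:POPULATION_SIZE]

print(encode_dna('GATTACA'))                 # '10001111000100'
print(max(fitness(g) for g in population))  # climbs toward GENOME_LENGTH
```

Most of the random flips here are junk, in exactly the sense above - they lower the bit count - but selection keeps the rare beneficial ones, and given enough generations the population converges. That is the "lots and lots of time" trade-off in miniature.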

Jnyusa wrote:Paradoxically, what randomness allows is endless failure. And for that to result ultimately in success, you need lots and lots of time. We could, probably, maybe?, program a non-regenerating physical system to self-teach and evolve chaotically, but neither depreciation nor human investment parameters could withstand the millions of trials and waiting time, with no outcome certain, required for a chaotic system to come up with a 'desirable' outcome.


You need lots and lots of *computing* time. Computers are getting faster all the time. Evolution can happen at a much faster pace in a computer than it can in the real world.
