independent and unofficial
Prince fan community site
Tue 12th Dec 2017 7:44am
Forums > Politics & Religion > The impeachment of Donald Trump: how likely is it?
Page 9 of 15
Reply #240 posted 06/10/17 10:44pm

QueenofCardboard



John McCain must have had a stroke.

"I could stand in the middle of 5th Avenue and shoot somebody and I wouldn't lose voters," Donald Trump
Reply #241 posted 06/11/17 2:40am

2elijah


OnlyNDaUsa said:



Horsefeathers said:


Oh, only. Never change. lol



I will never change going my own way and coming up with my own opinions and will not start getting my views fed to me by anyone.
I will never change telling the truth
I will never change pointing out the double standards
I will never change being willing to call you and others out for making up or repeating lies about me...


Yet you try to twist others' opinions to fit yours, when they don't agree with you. Lol.
'Trump voters got Hoodwinked by Trump'
Reply #242 posted 06/11/17 4:36am

SuperFurryAnimal


How likely is it Trump is eating peach jelly for breakfast?

You better Watch Out! I'm a WAR MACHINE!
Reply #243 posted 06/11/17 10:56am

QueenofCardboard



OnlyNDaUsa said:


Horsefeathers said:


Oh, only.

Never change. lol



I will never change going my own way and coming up with my own opinions and will not start getting my views fed to me by anyone.

I will never change telling the truth

I will never change pointing out the double standards

I will never change being willing to call you and others out for making up or repeating lies about me...


Hi ONLY

Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason.

By Elizabeth Kolbert

The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight.


In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide.

They were presented with pairs of suicide notes.

In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life.

The students were then asked to distinguish between the genuine notes and the fake ones.


Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times.

Others discovered that they were hopeless.

They identified the real note in only ten instances.


As is often the case with psychological studies, the whole setup was a put-on.

Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious.

The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.


In the second phase of the study, the deception was revealed.

The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong.

(This, it turned out, was also a deception.)

Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right.

At this point, something curious happened.

The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this.

Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.


“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”


A few years later, a new set of Stanford students was recruited for a related study.

The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive.

George had a small son and played golf.

The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test.

According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option.

In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times.

Once again, midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious.

The students were then asked to describe their own beliefs.

What sort of attitude toward risk did they think a successful firefighter would have?

The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.


Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted.

In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.


The Stanford studies became famous.

Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking.

It isn’t any longer.

Thousands of subsequent experiments have confirmed (and elaborated on) this finding.

As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational.

Rarely has this insight seemed more relevant than it does right now.

Still, an essential puzzle remains: How did we come to be this way?


In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question.

Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision.

It emerged on the savannas of Africa, and has to be understood in that context.


Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate.

Coöperation is difficult to establish and almost as difficult to sustain.

For any individual, freeloading is always the best course of action.

Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.


“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write.

Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.


Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.

Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments.

One of the most famous of these was conducted, again, at Stanford.

For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment.

Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.


The students were asked to respond to two studies.

One provided data in support of the deterrence argument, and the other provided data that called it into question.

Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics.

The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse.

At the end of the experiment, the students were asked once again about their views.

Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.


If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias.

Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do.

Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner.

To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against.

The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”


Mercier and Sperber prefer the term “myside bias.”

Humans, they point out, aren’t randomly credulous.

Presented with someone else’s argument, we’re quite adept at spotting the weaknesses.

Almost invariably, the positions we’re blind about are our own.


A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry.

Participants were asked to answer a series of simple reasoning problems.

They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes.

The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.


In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who’d come to a different conclusion.

Once again, they were given the chance to change their responses.

But a trick had been played: the answers presented to them as someone else’s were actually their own, and vice versa.

About half the participants realized what was going on.

Among the other half, suddenly people became a lot more critical.

Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with.




This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group.

Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave.

There was little advantage in reasoning clearly, while much was to be gained from winning arguments.


Among the many, many issues our forebears didn’t worry about were the deterrent effects of capital punishment and the ideal attributes of a firefighter.

Nor did they have to contend with fabricated studies, or fake news, or Twitter.

It’s no wonder, then, that today reason often seems to fail us.

As Mercier and Sperber write, “This is one of many cases in which the environment changed too quickly for natural selection to catch up.”


Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists.

They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions.

They begin their book, “The Knowledge Illusion: Why We Never Think Alone” (Riverhead), with a look at toilets.


Virtually everyone in the United States, and indeed throughout the developed world, is familiar with toilets.

A typical flush toilet has a ceramic bowl filled with water.

When the handle is depressed, or the button pushed, the water—and everything that’s been deposited in it—gets sucked into a pipe and from there into the sewage system.

But how does this actually happen?


In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks.

They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again.

Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped.

(Toilets, it turns out, are more complicated than they appear.)


Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere.

People believe that they know way more than they actually do. What allows us to persist in this belief is other people.

In the case of my toilet, someone else designed it so that I can operate it easily.

This is something humans are very good at.

We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history.

So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins.


“One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.


This borderlessness, or, if you prefer, confusion, is also crucial to what we consider progress.

As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much.

When it comes to new technologies, incomplete understanding is empowering.


Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain.

It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about.

Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea.

Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map.

The farther off base they were about the geography, the more likely they were to favor military intervention.

(Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)


Surveys on many other issues have yielded similarly dismaying results.

“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write.

And here our dependence on other minds reinforces the problem.

If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless.

When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views.

If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.


“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe.

The two have performed their own version of the toilet experiment, substituting public policy for household gadgets.

In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system?

Or merit-based pay for teachers?

Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals.

Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one.

Most people at this point ran into trouble.

Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.


Sloman and Fernbach see in this result a little candle for a dark world.

If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views.

This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”


One way to look at science is as a system that corrects for people’s natural inclinations.

In a well-run laboratory, there’s no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them.

And this, it could be argued, is why the system has proved so successful.

At any given moment, a field may be dominated by squabbles, but, in the end, the methodology prevails.

Science moves forward, even as we remain stuck in place.


In “Denying to the Grave: Why We Ignore the Facts That Will Save Us” (Oxford), Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves.

Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous.

Of course, what’s hazardous is not being vaccinated; that’s why vaccines were created in the first place.

“Immunization is one of the triumphs of modern medicine,” the Gormans note.

But no matter how many scientific studies conclude that vaccines are safe, and that there’s no link between immunizations and autism, anti-vaxxers remain unmoved.

(They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)


The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive.

And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component.

They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs.

“It feels good to ‘stick to our guns’ even if we are wrong,” they observe.


The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them.

There must be some way, they maintain, to convince people that vaccines are good for kids, and handguns are dangerous.

(Another widespread but statistically insupportable belief they’d like to discredit is that owning a gun makes you safer.)

But here they encounter the very problems they have enumerated.

Providing people with accurate information doesn’t seem to help; they simply discount it.

Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science.

“The challenge that remains,” they write toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.”


“The Enigma of Reason,” “The Knowledge Illusion,” and “Denying to the Grave” were all written before the November election.

And yet they anticipate Kellyanne Conway and the rise of “alternative facts.”

These days, it can feel as if the entire country has been given over to a vast psychological experiment being run either by no one or by Steve Bannon.

Rational agents would be able to think their way to a solution.

But, on this matter, the literature is not reassuring.

Elizabeth Kolbert has been a staff writer at The New Yorker since 1999.

She won the 2015 Pulitzer Prize for general nonfiction for “The Sixth Extinction: ...istory.”
[Edited 6/11/17 11:03am]

"I could stand in the middle of 5th Avenue and shoot somebody and I wouldn't lose voters," Donald Trump
Reply #244 posted 06/11/17 11:13am

2freaky4church1


We doth need to be careful.

DJ is da man
"2freaky is very down." 2Elijah.
"2freaky convinced me to join Antifa." OnlyNDA
Reply #245 posted 06/11/17 2:53pm

214

QueenofCardboard said:

[...]

The farther off base they were about the geography, the more likely they were to favor military intervention.

(Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)


Surveys on many other issues have yielded similarly dismaying results.

“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write.

And here our dependence on other minds reinforces the problem.

If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless.

When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views.

If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.


“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe.

The two have performed their own version of the toilet experiment, substituting public policy for household gadgets.

In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system?

Or merit-based pay for teachers?

Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals.

Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one.

Most people at this point ran into trouble.

Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.


Sloman and Fernbach see in this result a little candle for a dark world.

If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views.

This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”


One way to look at science is as a system that corrects for people’s natural inclinations.

In a well-run laboratory, there’s no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them.

And this, it could be argued, is why the system has proved so successful.

At any given moment, a field may be dominated by squabbles, but, in the end, the methodology prevails.

Science moves forward, even as we remain stuck in place.


In “Denying to the Grave: Why We Ignore the Facts That Will Save Us” (Oxford), Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves.

Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous.

Of course, what’s hazardous is not being vaccinated; that’s why vaccines were created in the first place.

“Immunization is one of the triumphs of modern medicine,” the Gormans note.

But no matter how many scientific studies conclude that vaccines are safe, and that there’s no link between immunizations and autism, anti-vaxxers remain unmoved.

(They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)


The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive.

And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component.

They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs.

“It feels good to ‘stick to our guns’ even if we are wrong,” they observe.


The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them.

There must be some way, they maintain, to convince people that vaccines are good for kids, and handguns are dangerous.

(Another widespread but statistically insupportable belief they’d like to discredit is that owning a gun makes you safer.)

But here they encounter the very problems they have enumerated.

Providing people with accurate information doesn’t seem to help; they simply discount it.

Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science.

“The challenge that remains,” they write toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.”


“The Enigma of Reason,” “The Knowledge Illusion,” and “Denying to the Grave” were all written before the November election.

And yet they anticipate Kellyanne Conway and the rise of “alternative facts.”

These days, it can feel as if the entire country has been given over to a vast psychological experiment being run either by no one or by Steve Bannon.

Rational agents would be able to think their way to a solution.

But, on this matter, the literature is not reassuring.

Elizabeth Kolbert has been a staff writer at The New Yorker since 1999.

She won the 2015 Pulitzer Prize for general nonfiction for “The Sixth Extinction: ...istory.”







[Edited 6/11/17 11:03am]

Thank you, quite interesting; I read it through and through. Unfortunately we play by emotions, not by reason, all the time, in all fields of life.

  - E-mail - orgNote - Report post to moderator
Reply #246 posted 06/11/17 4:00pm

purplepoppy

Thanks Queenie. I like Kolbert's writing. She lives in Yorkville, Manhattan and sometimes writes about it.

I guess those studies show why we are all lying around the cave trying to one-up each other on the org, lol. cool

Brand new boogie without the hero.
Reply #247 posted 06/11/17 6:21pm

214

According to this article we know nothing at all, and Trump and supporters know less.

Reply #248 posted 06/12/17 5:52am

OnlyNDaUsa


QueenofCardboard said:


OnlyNDaUsa said:



I will never change going my own way and coming up with my own opinions and will not start getting my views fed to me by anyone.

I will never change telling the truth

I will never change pointing out the double standards

I will never change being willing to call you and others out for making up or repeating lies about me...


Hi ONLY wave

Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason.

(wall of unread psychobabble deleted )

The problem with all that is twofold: this Elizabeth Kolbert may well be a victim of the same issue... if the argument is that, despite new facts, some people do not change their minds. But at least she did not seem to confuse something being a fact with something being true.

The second issue is that you, QueenofCardboard (as well as Purplepoppy and 214), may also be victims of this. If you assume that what you believe to be facts are true, and that what you assume I believe to be facts are not true, then you would fall for the same thing.

So when you want to use that against me as some big "look what someone else said," as if it MUST apply to me and not just as easily to yourself, then you are the one to whom it really applies.

See, I have no problem admitting that many things I believe, even some of my deepest and most personal beliefs, may well be (and some very likely are) untrue. I can see, in many cases, why someone may believe or think differently than I do, and in most cases I am fine with that; I am aware that in any case of P or not-P, one must be true and the other cannot be.

Anyway, I have no interest in what you think Elizabeth Kolbert meant, or how you believe it applies to me, or how you assume (incorrectly) I think.

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #249 posted 06/12/17 5:58am

OnlyNDaUsa


214 said:

According to this article we know nothing at all, and Trump and supporters know less.

See what you did there: you applied that mess in such a manner as to fall prey to the same kind of mistakes.

And she seemed to have used (or quoted those who have) the term "knowledge" informally.

Whatever, what little I read is a mess.

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #250 posted 06/12/17 6:38am

purplepoppy

OnlyNDaUsa said:

QueenofCardboard said:


Hi ONLY wave

Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason.

(wall of unread psychobabble deleted )

The problem with all that is twofold: this Elizabeth Kolbert may well be a victim of the same issue... if the argument is that, despite new facts, some people do not change their minds. But at least she did not seem to confuse something being a fact with something being true.

The second issue is that you, QueenofCardboard (as well as Purplepoppy and 214), may also be victims of this. If you assume that what you believe to be facts are true, and that what you assume I believe to be facts are not true, then you would fall for the same thing.

So when you want to use that against me as some big "look what someone else said," as if it MUST apply to me and not just as easily to yourself, then you are the one to whom it really applies.

See, I have no problem admitting that many things I believe, even some of my deepest and most personal beliefs, may well be (and some very likely are) untrue. I can see, in many cases, why someone may believe or think differently than I do, and in most cases I am fine with that; I am aware that in any case of P or not-P, one must be true and the other cannot be.

Anyway, I have no interest in what you think Elizabeth Kolbert meant, or how you believe it applies to me, or how you assume (incorrectly) I think.

Haha. The problem with all that is: this. Why wax poetic on something you did not read? The other thing you don't get is that no one is really "against" you. You go to that for dramatic effect and to make sure the topic stays on Only.

Brand new boogie without the hero.
Reply #251 posted 06/12/17 7:44am

OnlyNDaUsa


purplepoppy said:

Haha. The problem with all that is: this. Why wax poetic on something you did not read?

I am not interested in that lady's POV; I want your POV.


The other thing you don't get is that no one is really "against" you.

I am not sure I have ever really thought that you are. I think what I have done is point out times when others have said things to or about me that they meant to prove something or whatever... like I said something about someone being insulting. I did not say I was insulted... just that the person meant to be.


You go to that for dramatic effect

Well so then it works!

and to make sure the topic stays on Only.


Who else is more interesting than me!?



"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #252 posted 06/12/17 7:46am

2freaky4church1

Only, I will give you the banal award.

DJ is da man
"2freaky is very down." 2Elijah.
"2freaky convinced me to join Antifa: OnlyNDA
Reply #253 posted 06/12/17 7:47am

OnlyNDaUsa


2freaky4church1 said:

Only, I will give you the banal award.

that sounds hot!

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #254 posted 06/12/17 8:03am

QueenofCardboard

OnlyNDaUsa said:

QueenofCardboard said:


Hi ONLY wave

Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason.

(wall of unread psychobabble deleted )

The problem with all that is twofold: this Elizabeth Kolbert may well be a victim of the same issue... if the argument is that, despite new facts, some people do not change their minds. But at least she did not seem to confuse something being a fact with something being true.

The second issue is that you, QueenofCardboard (as well as Purplepoppy and 214), may also be victims of this. If you assume that what you believe to be facts are true, and that what you assume I believe to be facts are not true, then you would fall for the same thing.

So when you want to use that against me as some big "look what someone else said," as if it MUST apply to me and not just as easily to yourself, then you are the one to whom it really applies.

See, I have no problem admitting that many things I believe, even some of my deepest and most personal beliefs, may well be (and some very likely are) untrue. I can see, in many cases, why someone may believe or think differently than I do, and in most cases I am fine with that; I am aware that in any case of P or not-P, one must be true and the other cannot be.

Anyway, I have no interest in what you think Elizabeth Kolbert meant, or how you believe it applies to me, or how you assume (incorrectly) I think.



ONLY,

You know I love you. kotc

And that I am not a piranha fish that is joining up with other piranha fish to destroy you.

I am disappointed that you did not read the piece because it contained lots of information about how our beliefs are started and why we hold so tightly to them after they are formed.

It also gives insight into why threads and forums like these never change anyone's mind.

In fact these types of forums serve to make people more intransigent in their beliefs.

I posted the piece because I thought it was relevant to everyone on this thread, me included, even though it looks like I aimed it straight at you.

There are very few things that you and I see eye to eye about, but that doesn't change the fact that we are all in this together.

Let's try to imagine what it is like to walk a mile in the other person's shoes before we dismiss their point of view.

peace hug flag







"I could stand in the middle of 5th Avenue and shoot somebody and I wouldn't lose voters," Donald Trump
Reply #255 posted 06/12/17 8:10am

OnlyNDaUsa


QueenofCardboard said:



ONLY,

You know I love you. kotc

WooT


And that I am not a piranha fish that is joining up with other piranha fish to destroy you.


But I am into that kind of thing

I am disappointed that you did not read the piece because it contained lots of information about how our beliefs are started and why we hold so tightly to them after they are formed.

OK, I will read it later...

It also gives insight into why threads and forums like these never change anyone's mind.

Yeah, I gathered that.

In fact these types of forums serve to make people more intransigent in their beliefs.

It is hard sometimes for me to let go, but I have said about Trump that if there is evidence then I will accept that... and I have never been a big Trump fan.


I posted the piece because I thought it was relevant to everyone on this thread, me included, even though it looks like I aimed it straight at you.

I know, it is okay and I will read it... I am, however, skeptical of the writer's honesty.

There are very few things that you and I see eye to eye about, but that doesn't change the fact that we are all in this together.

Yep, and we agree more than we may think; we may just not agree on how to approach the issues.

Let's try to imagine what it is like to walk a mile in the other person's shoes before we dismiss their point of view.

If you walked a mile in my shoes your feet would stink; if I walked a mile in yours, you would not want them back.

peace hug flag


woot!







"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #256 posted 06/12/17 8:21am

purplepoppy

And another topic dies a mundane death...

Brand new boogie without the hero.
Reply #257 posted 06/12/17 8:36am

OnlyNDaUsa


purplepoppy said:

And another topic dies a mundane death...

Well, it was always just wishful thinking... it was all about: if we make up enough stuff, maybe we can get one that is true!

Here on the org (if memory serves), someone made a topic about it not being too early to talk impeachment! (or "inpeachment," as coined by a funny-as-heck parody Twitter account of Maxine Waters) before the results had even been certified. And several petitions to get him out were made... that is not how it works!

When the DNC was hacked, it was made up that it must be Russia! And from there it was just made up, based on the thinnest of reasons, that Trump must have worked with Putin to steal the election.

Comey's testimony did more to implicate himself and Lynch in wrongdoing and misconduct than it did Trump.

So yeah, it was always going to die a mundane death! The TRUTH has a funny way of being kind of mundane.

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #258 posted 06/12/17 9:33am

djThunderfunk


From the frontpage of Prince.org:

"When posting media reports/new release info, please provide a link to the source where you first saw that information. Please credit the original authors for any stories/photos that you post if obtained elsewhere. Please do not quote entire articles."


Some people think the rules don't apply to them...


We were HERE, where were you?

4 those that knew the number and didn't call... fk all y'all!
Reply #259 posted 06/12/17 9:45am

OnlyNDaUsa


djThunderfunk said:

From the frontpage of Prince.org:

"When posting media reports/new release info, please provide a link to the source where you first saw that information. Please credit the original authors for any stories/photos that you post if obtained elsewhere. Please do not quote entire articles."


Some people think the rules don't apply to them...


Yeah, I have seen people post like 99% of something, edit out a tiny part, and say "(Edited for compliance)" when really something like 10% is all that is legally allowed.


That being said, many sites want it shared; many have code in the text that has the link built in (which can muck up some boards), so I do not really care too much.


My issue is (in many but not THIS case): why would I take or accept someone else's opinion, posted by someone who is trying to push or prove their own points? I am like, why not argue your own points? Oh, you have a cherry-picked link that backs you up and is not on its face consistent with what I said, and in their mind that PROVES me to be wrong? (And I am not even talking about sites that are uber (or is it lyft) biased!)

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #260 posted 06/12/17 9:53am

OnlyNDaUsa


The more I read that post, the more my assumptions are confirmed. I am struck by the irony: the more we learn about all this Russia stuff, the more we should agree that there seems to be not much to these allegations. But some seem to think that what Comey said makes it more likely. When I said it sure seems that Comey said he did not believe Trump broke any laws (and that Hillary did last year), I was accused (maybe not here) of making up things he never said.


So again, as I said, I think the posted article is really more about the impeachers than those who keep asking for evidence.

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #261 posted 06/12/17 9:57am

13cjk13

LOCK HIM UP!

Matthew 5:38-39
“You have heard that it was said, ‘An eye for an eye and a tooth for a tooth.’ But I say to you, Do not resist the one who is evil. But if anyone slaps you on the right cheek, turn to him the other also.
Reply #262 posted 06/12/17 9:59am

OnlyNDaUsa


13cjk13 said:

LOCK HIM UP!

Make up a reason if need be!

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #263 posted 06/16/17 3:47pm

214

QueenofCardboard said:

OnlyNDaUsa said:

The problem with all that is twofold: this Elizabeth Kolbert may well be a victim of the same issue... if the argument is that, despite new facts, some people do not change their minds. But at least she did not seem to confuse something being a fact with something being true.

The second issue is that you, QueenofCardboard (as well as Purplepoppy and 214), may also be victims of this. If you assume that what you believe to be facts are true, and that what you assume I believe to be facts are not true, then you would fall for the same thing.

So when you want to use that against me as some big "look what someone else said," as if it MUST apply to me and not just as easily to yourself, then you are the one to whom it really applies.

See, I have no problem admitting that many things I believe, even some of my deepest and most personal beliefs, may well be (and some very likely are) untrue. I can see, in many cases, why someone may believe or think differently than I do, and in most cases I am fine with that; I am aware that in any case of P or not-P, one must be true and the other cannot be.

Anyway, I have no interest in what you think Elizabeth Kolbert meant, or how you believe it applies to me, or how you assume (incorrectly) I think.



ONLY,

You know I love you. kotc

And that I am not a piranha fish that is joining up with other piranha fish to destroy you.

I am disappointed that you did not read the piece because it contained lots of information about how our beliefs are started and why we hold so tightly to them after they are formed.

It also gives insight into why threads and forums like these never change anyone's mind.

In fact these types of forums serve to make people more intransigent in their beliefs.

I posted the piece because I thought it was relevant to everyone on this thread, me included, even though it looks like I aimed it straight at you.

There are very few things that you and I see eye to eye about, but that doesn't change the fact that we are all in this together.

Let's try to imagine what it is like to walk a mile in the other person's shoes before we dismiss their point of view.

peace hug flag







And it was, at least for me. Thank you.

Reply #264 posted 06/16/17 7:54pm

13cjk13

OnlyNDaUsa said:

13cjk13 said:

LOCK HIM UP!

Make up a reason if need be!

Pick one!! He's the most corrupt pig this country has ever seen. Also, the right wing taught me that you don't have to actually have a reason to lock people up, you just have to hate them and scream it real loud while you're waving a flag and holding a gun and a bible!!

Matthew 5:38-39
“You have heard that it was said, ‘An eye for an eye and a tooth for a tooth.’ But I say to you, Do not resist the one who is evil. But if anyone slaps you on the right cheek, turn to him the other also.
Reply #265 posted 06/17/17 11:16am

djThunderfunk


13cjk13 said:

Pick one!! He's the most corrupt pig this country has ever seen. Also, the right wing taught me that you don't have to actually have a reason to lock people up, you just have to hate them and scream it real loud while you're waving a flag and holding a gun and a bible!!


Does that kind of thing still happen where you live? That's crazy, I'd move.

We were HERE, where were you?

4 those that knew the number and didn't call... fk all y'all!
Reply #266 posted 06/19/17 10:55am

OnlyNDaUsa


13cjk13 said:

OnlyNDaUsa said:

Make up a reason if need be!

Pick one!! He's the most corrupt pig this country has ever seen.

And the evidence for that is what?


Also, the right wing taught me that you don't have to actually have a reason to lock people up, you just have to hate them and scream it real loud while you're waving a flag and holding a gun and a bible!!


Recent example? (But it is funny you insult the country, religion, and civil rights in one post.)

"I was raped by the Arkansas AG who then becomes Governor & President..." Juanita Broaddrick
Reply #267 posted 06/20/17 7:39am

TRUECRISTIAN

Gradual development of Trump's character, chronicling his tremendous capacity for evolution and growth, thus illustrating what made it possible for a man so inexperienced and so unprepared for the presidency to become a great moral leader. In the most troubled of times, here was a man who led the country.
Reply #268 posted 06/20/17 7:45am

TRUECRISTIAN

Love this guy!! What a mind and a perfect leadership capability to handle the United States especially at such a time in history when this country was so torn apart and so close to internal combustion. Honestly, I don't think that there was another person who could have guided us through this mess that we had created for ourselves.
[Edited 6/20/17 7:45am]
Reply #269 posted 06/20/17 7:54am

13cjk13

TRUECRISTIAN said:

gradual development of Trump character, chronicling his tremendous capacity for evolution and growth, thus illustrating what made it possible for a man so inexperienced and so unprepared for the presidency to become a great moral leader. In the most troubled of times, here was a man who led the country.

His moral character really does warm a person's heart! Makes you want to grab a woman by her pussy and find a disabled person and make fun of them for days!! Or maybe even go out and buy a mail order hooker from Slovenia!!

Matthew 5:38-39
“You have heard that it was said, ‘An eye for an eye and a tooth for a tooth.’ But I say to you, Do not resist the one who is evil. But if anyone slaps you on the right cheek, turn to him the other also.