
Interacting With Creationists and Other Pseudoscientists, Part 2: Treating the Problem

November 13, 2009

So, in Part 1 of this two-part series, I analyzed the phenomenon of pseudoscience not from an intellectual perspective, but from a pathological one. If you haven’t read it, go back and do that now, because you’ll probably be lost in this article if you don’t.

That article identified pseudoscience not as an intellectual ailment, but as a social and emotional one. As such, I would advocate a socially and emotionally oriented treatment: a psychotherapeutic approach specifically designed to treat pseudoscience.

Please mind that this method is likely to share all the problems with psychotherapy as a whole, including, most notably, inconsistency in treatment effectiveness due to the strong interpersonal aspect of the treatment. I hope to outline the method clearly enough in this article to eliminate inconsistency caused by the application of different methods, but I’m sure that just about anyone who actually bothers to apply the method would add their own little spin on it in no time – not that that’s necessarily a bad thing. Furthermore, this method is unlikely to be able to address the problems with those who originate pseudoscientific communities – just people indoctrinated into them.

Yet, on the other hand, I don’t think there’s much else we can do. Pseudoscience involves the malfunction of some of our most fundamental functions as human beings, and applying, say, drugs to fix the problem is likely to cause way more problems than it’s worth.

The first thing to address is mental preparation – to ensure that you have all the cognitive tools required to try to cure a pseudoscientist.

  • Understanding: A strong layman’s understanding of the topic in question. You don’t need a specialist’s understanding, but you do need to grasp the subject in general terms.
  • Google-fu: Be able to quickly scour and parse the internet’s vast data repository, with a particular eye towards Google Scholar, a database of scientific papers. Awareness of a wide variety of specialized sources (such as an online Bible, or TVTropes) is a plus. This point is why you don’t need a specialist’s understanding of anything – the internet can provide.
  • Empathy: Understand that you are interfacing with a real person, with a real problem, and who needs your help to overcome that problem.
  • Perceptiveness: Be able to gauge the emotional and social subtext behind a conversation, not just the intellectual content of that conversation.
  • Patience: Understand that you may need to explain things which you find to be very, very obvious. Do so in clear terms and without condescension, and reiterate when necessary.
  • Articulation: Be able to describe even fundamental ideas so that they can be understood by someone who doesn’t already know them. Also, be able to rephrase statements in the form of (possibly rhetorical) questions.
  • A grasp of the scientific method on a philosophical level: You need to be able to describe, in plain-language terms, not only how the scientific method functions, but why it functions, and what it accomplishes in general.

All the above skills, in aggregate, will be referred to as your “toolbox”. The importance of each of the tools in said toolbox will be made clear below.

Now, to the method itself.

Recall the core pathology of pseudoscience: extreme emotional investment in a group. This subverts the individual’s intellectual functions towards protecting what they perceive as the group’s doctrine against outsiders, rather than analyzing it objectively, and presents a block insurmountable by logical argumentation alone. The first phase of this method must be to remove or bypass that block.

Fortunately, the pathological analysis of the function provides insight as to key points that we can use to disable this self-reinforcing behavior:

  • Pathological pseudoscience is triggered by social conflict with a member of another group. Thus, therapy should include taking actions that let the patient see you as an individual, rather than as a member of an ‘enemy’ social group. I feel that this point builds the strong interpersonal relationship necessary to treat the condition, and as such I feel it to be a necessary part of the approach.
  • Pathological pseudoscience requires strong identification with a group. Thus, therapy could include taking actions that ease the emotional intensity of group association. Be careful with this point, as I imagine it’s very easy when taking this route to appear as an enemy (trying to ‘trick’ the patient into betraying his group), and thus carries some risk of triggering the condition.
  • Pathological pseudoscience is directed to the protection of what the patient believes is the group’s position. Thus, therapy could include taking actions that allow the patient to dissociate information from group membership, allowing him to analyze that information without triggering the condition.

At this stage of therapy, approach without hostility or aggressiveness – even intellectual aggressiveness (that is to say, don’t bother making any arguments, as they’ll just trigger the condition). Establish a personal rapport with the patient, preferably through discussion of an unrelated subject on which the patient does not suffer pseudoscientific symptoms. Optimally, demonstrate superior comprehension of one or more subjects on which the patient can be corrected, in order to instill in the patient respect for your opinions.

It should be noted that this approach works best in a one-on-one scenario: if any individuals are observing the exchange, you significantly increase the chances of triggering the condition, since the patient may have reason to believe the observers are relevant to the pseudoscientific affliction (either agreeing or actively disagreeing, and thus either part of the in-group or ‘enemies’ of that group). For that reason, I feel the effectiveness of the therapy would rapidly degrade as the number of observers increases.

Broach the afflicted subject or subjects with care: do not aggressively pursue afflicted subjects for discussion, and do not use statements of fact to correct the patient’s errors at this stage. Instead, rephrase your corrections as legitimate questions and pose them as if you’re genuinely curious about the answer, taking care to describe the logic that leads you to ask. While a statement correcting the patient strongly risks triggering the condition, a question the patient feels is asked legitimately encourages a legitimate response, and reduces the chances of triggering the condition. This could allow you to make progress with the patient despite the condition.

For a small subset of patients, I imagine this could be sufficient – in particular, individuals already familiar with the scientific method and knowledgeable about the ‘enemy’ positions could draw their own conclusions once they feel able to openly question the pseudoscience.

However, many more will need assistance in questioning the positions of their in-group. Thus, the second phase of the method is to instill in the patient the tools they need to effectively evaluate ideas.

Transitioning into this phase is unlikely to be an exact science, so I suggest that during this phase the first-phase measures are still kept in mind and adhered to when reasonable. As such, the second phase is more a progression stemming from the first rather than an entirely new approach.

Essentially, the second phase involves trying to teach the patient such that they understand, can follow, and want to follow the scientific method in order to better understand ideas. You never need to say what it is you’re trying to do – you should never even need to say the words ‘scientific method’ (in fact, doing so, particularly with individuals who have been exposed to people trying to correct them through argumentation, might even trigger the condition). Also importantly, don’t impose or push the information on the patient. Give them as much as they want, when they ask, and if you have sufficient control of the conversation you can try to work them into a position where they will ask, but do not teach without establishing that the patient is ready to try to learn.

Mind, when you teach, to focus on the three things we’re trying to get the patient to accomplish:

  • Understanding: The scientific method, at its core, is simply a series of techniques that people use to better evaluate information – so that they can know what is correct and what isn’t. Conveying this to some individuals may be exceedingly difficult, and may require extensive explanation at a very basic level. Be prepared to use simple examples to convey scientific methodological concepts, and try to build upon things you’ve already established whenever possible.
  • Capability: Ensure that the patient has understood the methods you’re teaching by encouraging the patient to apply them. Upon correct application, I recommend praising the patient to whatever degree you feel you can without sounding patronizing (patronization could be interpreted as an attack on a social or emotional level, and trigger the condition).
  • Inclination: This is the most important part – in fact, many patients may be aware of scientific methodology and simply not grasp the value of using it. This is why you need to be able to describe not just the how, but the why of science. Science is a way to understand and know more comprehensively, and knowledge is power. Science offers a potential solution to almost any conceivable problem, and you may even have an opportunity to use your previously established personal rapport with the patient in order to make a personal connection between the patient’s problems, and the ability of the scientific method to solve them.

When the patient openly, and completely without prompting, seems to be applying scientific methodology to the pseudoscientific affliction, this signals that it is time to move into the third and final phase.

The final phase is to hold a calm, logical discussion with the patient that directly and thoroughly addresses the topic. Mind that the patient could be new to this kind of thinking, and be prepared to help walk them through the logical process. Have information resources on hand and, if the patient questions them, or brings up anything that you cannot readily answer, work with the patient to find the answers or evaluate any questionable sources.

Man, this took forever to write! Anyway, that’s my proposed treatment for any and all forms of pseudoscience. I hope people who read it find it insightful and helpful, and if it leads to anyone being cured, all the better.

As for myself, having written this article has gotten me thinking about normal human thought – pseudoscience is a dysfunction, but it’s a dysfunction stemming from a series of perfectly normal features of human cognition. What insights could such an understanding provide about the thoughts of people in general? Also, having proposed a rather extensive hypothesis regarding interaction between human socialization and human intelligence, I should probably think up experimental scenarios that could be used to verify aspects of that hypothesis.


Interacting With Creationists and Other Pseudoscientists, Part 1: Identifying the Problem

November 13, 2009

I’ve spent a significant amount of my time on the internet arguing with people. Often, with people so hopelessly lost in incredibly bad ideas that my interaction brings with it painfully little chance to drag them out.

But I’m a stubborn individual, and I’ve accumulated a lot of experience in the dusty corners of the internet, so I think I’d like to share my insights regarding interactions with pseudoscientists, such as young-earth Creationists. I’d like to think that insight would be interesting, and would help readers better correct people with incorrect ideas.

My advice only applies to well-prepared individuals with a degree of background understanding (a layman’s understanding of the topic will do, but you’re likely to need to compensate in other areas), who are interacting with individuals holding ideas which are known by the scientific community to be discredited (young-earth Creationism, global warming denial, homeopathy, etc.). This approach will not be particularly applicable if you’re groping in the dark on the topic, or if the discussion is a legitimate debate with someone holding a potentially correct position.

For starters, I think most people misdiagnose the problem with pseudoscience and those who have been indoctrinated into it. The common, intuitive, and direct approach to dealing with false information is to enter into a logical argument to demonstrate that the information is false. However, as you can see from the stubborn persistence of pseudoscience in the modern world, this approach often does not work.

The next step is to analyze why the debate-oriented approach to correcting pseudoscience has such a low rate of success. Again, the common and intuitive conclusion is to assume the pseudoscientist lacks the ability to comprehend your argument, lacks the integrity to accept it, or is otherwise personally deficient – and thus that the failure is no fault of the scientific debater. Once the scientific debaters present their argument and it is not properly addressed, they are done, and can wash their hands of the business, having exhausted all options the approach offers.

I think we need something more oriented towards results than that approach. Something that can allow us to fix pseudoscientists when we encounter them, with a high success rate – a cure for pseudoscience, as it were.

Which leads me to the primary and most critical theme of this article: Pseudoscience is not a mistake made by an individual that must be corrected, but a disease, afflicting a patient, that must be cured. The very large distinctions in approach I will propose compared with a debate-oriented approach stem directly from this paradigm.

The first major difference is that, if we’re to identify pseudoscience as a sickness, we can attempt to describe its pathology – what precisely is malfunctioning in a pseudoscientist – as a first step towards a stronger understanding of the condition.

Under normal function, a human being will:

  • Evaluate (however briefly) information that they perceive in order to gauge its accuracy and truthfulness, and then take appropriate measures.
  • Check information for self-consistency if given a reason to do so.

Instead, a pseudoscientist will manifest the following impairments, impairments which constitute a large portion of the set of pseudoscientific behaviors:

  • Exhibit strong confirmation bias towards information relevant to the affliction.
  • Demonstrate extreme cognitive dissonance rather than analyze information relevant to the affliction for self-consistency.

The first major insight we can gain is the observation that pseudoscientists are not necessarily universally impaired in these functions – outside of the pseudoscience-afflicted area, a pseudoscientist can function intellectually without impairment. This implies that these malfunctions are not the primary cause of pseudoscience but either secondary causes or symptoms of the primary cause. In essence, a pseudoscientist is not necessarily of low intelligence, or intellectually impaired in any way other than the affliction of the pseudoscience itself. So we should stop calling them stupid, because they aren’t.

Now, the above describes a large portion of pseudoscientific behavior, but does not by any means catch all behaviors associated with pseudoscience. Explaining the others is where my pet hypothesis (and the proposed cure stemming from it) comes into play.

Other behaviors pseudoscientists have a particular propensity for*:

  • A strong “Us vs. Them” mentality: pseudoscientists describe those who disagree with them (frequently actual scientists, or even scientifically literate laymen) in strongly oppositional terms that often aren’t relevant to the specific subject matter under discussion. (As an aside, Google “us vs them”. I found the unusually politically charged results to be fascinating.)
  • Spectacle: Unlike science, ‘no publicity is bad publicity’ for pseudoscience. Pseudoscience promotes aggressive, emotionally charged debate, and actual logical discussion is irrelevant to the approach. Rather, victory or defeat is called (well, victory is called, anyway, more on that momentarily) based on the status of the emotional undercurrent of the discussion, in a manner similar to how children argue with each other. That is to say, when arguing with a pseudoscience advocate, his objective isn’t really to make you think differently – it’s to make you feel differently.
  • Strong group pride: The strong social bond between pseudoscientists of the same flavor extends beyond the attacking of external forces. Pseudoscientists are known for making their own in-groups, including schools (complete with unaccredited degrees), journals (not strictly scientific journals, though, as they tend to lack peer review and other features which make scientific journals scientific), “Think Tanks” for PR purposes, and other socially or professionally flavored clubs. And, indeed, clubs they are, albeit themed by the pseudoscience. Pseudoscientific groups do not promote intellectual interaction, and this implies they do not exist for intellectual reasons. Pseudoscientific groups instead primarily provide social and emotional functions.
    This is further observable when pseudoscientists lose debates, even by the emotionally-charged pseudoscientist standard – the group simply forgets the event has happened, in what seems to be a subconscious act of in-group support. This phenomenon also cripples any attempts pseudoscientists may have to enforce internal intellectual consistency, as people violating that consistency can simply be forgiven without censure or possibly even conscious thought.

Do you see where I’m going with this yet? Pathological pseudoscience seems to be identified by exceptionally strong affiliation with a strongly-defined social group. Mind, here, that when I say ‘strongly-defined’, I’m not referring to any purpose a social group may have in and of itself. On the contrary, I would identify a strongly-defined social group as one that defines the group’s enemies.

Now, we know already that strong group cohesiveness can impair individual thinking. Pathological pseudoscience seems to be a phenomenon related to groupthink, but I feel it to be much more powerful. Pseudoscientific groups aggressively pursue strong group cohesion, which is likely made stronger still by the perception that the pseudoscientists face a powerful, monolithic enemy – that enemy being, collectively, everyone who thinks they’re wrong. And conflict with those individuals, rather than correcting them, only stands to make the affliction worse, as the more absurd (and thus memorable) the actions an individual is forced to take to defend the group, the stronger their later emotional self-reinforcement towards that group could become.

So, to summarize: I feel that pseudoscience arises when an individual’s emotional investment in a social group becomes so strong that it inhibits the individual’s ability to think in the sense that we understand the concept. Instead, their cognitive resources are hijacked to support the group paradigm, regardless of how absurd it may be.

In hindsight, it all may seem fairly obvious (it does to me, now that I’ve written it). Yet, this explanation indicates that the modern approach towards pseudoscience is dangerously flawed. While the debunking, argumentative approach may function to immunize those not afflicted, it does nothing to correct pseudoscience in the afflicted and could conceivably aggravate the condition. Accordingly, recovery rates for pseudoscience tend to be very low. We are, essentially, trying to treat the symptoms of pseudoscience, and in doing so we’re missing the disease.

This post is turning, frankly, very large, so I’m splitting it in two. The second half will go into depth for my proposed treatment method for pseudoscience.

*- For further reading, check out these articles (I used them as a refresher before writing this article):

http://www.stardestroyer.net/Empire/Science/Pseudoscience.html (Yes, it’s a Star Wars website. Yes, the article is good)

http://pandasthumb.org/archives/2009/09/science-non-sci.html

http://initforthegold.blogspot.com/2007/09/pseudoscience-symptoms.html

The Cognitive Power of Belief

May 29, 2009

There are people who believe in very, very many things out there. That medicine can become more powerful the less of it you take, that Atlantis was a super-advanced civilization that interacted with aliens, that the universe is six thousand and change years old, that the leader of a group is in fact a god come to earth, that everything, including inanimate objects like rivers and mountains, has a soul, that our government is secretly working against us all, that our government isn’t working against us all… and so on. The gamut of human belief is absurdly wide, and the depth of human belief is similarly stunning.

I can’t help but think there’s something to that.

It seems clear from these beliefs that human beings are not rational creatures, that we function using some other fundamental thought method that only incidentally supports rationalism. But what method could that be?

I’m of the personal opinion that what we describe as ‘spiritualism’, ‘mysticism’, ‘religiosity’, and so on (henceforth to be called ‘mysticism’ ’cause I think the word sounds cool) functions as a kind of emotional interface to our conscious, cognitive and linguistic abilities, by acting as a system of triggers that, when activated, cause us to learn associated ideas and behaviors quickly and persistently. So persistently, in fact, that we could have difficulty overcoming them consciously even if on a rational level we know they are incorrect.

The existence of cognitive dissonance in regard to mysticism, in particular, fascinates me. What I imagine is that what someone learns with strong facilitation by such a ‘mystical’ drive is something well-learned indeed, and unlikely to be overridden except through the use of a comparably powerful cognitive tool.

All in all, it seems that mysticism, long used by hucksters and cult leaders to mislead and confuse people, could be used as a tool for good – and considering its power, used very potently indeed.

The first question would regard what we could use mysticism for – to that end, I would posit that the collective functions of mysticism serve two purposes for the human mind: to describe the world around us in an intricate and emotionally engaging way (descriptive mysticism), and to prescribe to us correct behaviors on a largely emotional level (prescriptive mysticism).

It seems to me the bigger question, though, is precisely how to go about using mysticism constructively, a task which additionally entails coming to understand mystical functions in the human mind in greater depth.

I’m musing (pun appreciated, but not intended) on that one.

Thinking About Thinking: Relational Databases and You (You being anybody)

May 25, 2009

Okay, first a bit of background information.

A relational database is a method computers use to organize information (based on the relational model). Information is organized into ‘Tables’ based on its content (each Table represents a single subject of information), and generally each piece of unique information is given a corresponding identifier called a ‘Primary Key’.

A quick example, number of days in a month:

Month        #Days
January      31
February     28 (29)
March        31
April        30
May          31
June         30
July         31
August       31
September    30
October      31
November     30
December     31

This table organizes the months and contains how many days are in each month.
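As an aside, here’s a minimal sketch of that same table built as an actual relational database, using Python’s built-in sqlite3 module. The table and column names are just my own choices for illustration:

```python
import sqlite3

# Build an in-memory relational database with one Table: months of the year.
# The month's name serves as the Primary Key -- a unique identifier.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE months (name TEXT PRIMARY KEY, days INTEGER)")
conn.executemany(
    "INSERT INTO months VALUES (?, ?)",
    [("January", 31), ("February", 28), ("March", 31), ("April", 30),
     ("May", 31), ("June", 30), ("July", 31), ("August", 31),
     ("September", 30), ("October", 31), ("November", 30), ("December", 31)],
)

# Quick recall of a detailed fact: how many days does September have?
(days,) = conn.execute(
    "SELECT days FROM months WHERE name = ?", ("September",)
).fetchone()
print(days)  # 30
```

The Primary Key is what lets the database jump straight to “September” without scanning through everything else.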

Relational data is awesome. Its neat organizational structure allows for quick recall of detailed facts and sophisticated examination of massive amounts of data – even when the data has to be calculated from scratch*.

Humans don’t naturally think in relational terms (well, not efficiently, and not usually by default, anyway) – but we can learn to do so, either by ourselves or by picking it up from other people.

It’s all about internally organizing the way you think in a couple different ways.

First, group up things you know into specific subjects in which each bit of information has a unique identifier.

To extend the earlier example, the subject there would be “Months of the (modern Gregorian calendar) year”. Note all the implied information in “Months of the Year” – if you were to learn a calendar other than the modern Gregorian one, you would have to split that into two different subjects. Expect to reevaluate how you divide what you know into categories as you learn more, and thus have to draw more distinctions between subjects in order to keep them well-organized in your head.

Now, for this, we need a unique identifier – in the case of our months, the names of the months should do just fine – each month has a different name from all the others (Another unique identifier for months could be the order in which each appears on the calendar). That unique identifier is the subject’s “Key”.

Note that the identifier doesn’t have to be the same as the subject: If you were to memorize each month based on their order rather than their names, the subject would still be months, even though you would think more like, “The first month in the year is named January” instead of “January is the first month in the year.” You could even use both as unique identifiers at the same time (and if you have to deal with the months of the year frequently enough, you probably do).
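That idea can be sketched in code – here plain Python dicts stand in for Tables, with names of my own invention, one dict per choice of Key:

```python
# Two unique identifiers ("Keys") for the same subject: a month can be
# looked up either by its name or by its position in the year.
MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

by_order = {i + 1: name for i, name in enumerate(MONTHS)}  # Key: order in year
by_name = {name: i + 1 for i, name in enumerate(MONTHS)}   # Key: month name

# "The first month in the year is named January."
print(by_order[1])         # January
# "January is the first month in the year."
print(by_name["January"])  # 1
```

Both dicts hold the same facts; only the direction of the question changes.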

Second, group up as much information as you can into those subjects.

To extend the months example, you don’t have to just keep track of the months’ names and how many days there are in them. You could make it look more like:

Month        Order in Year    #Days    Season
January      1                31       Winter
February     2                28 (29)  Winter
March        3                31       Spring
April        4                30       Spring
May          5                31       Spring
June         6                30       Summer
July         7                31       Summer
August       8                31       Summer
September    9                30       Fall
October      10               31       Fall
November     11               30       Fall
December     12               31       Winter

Third, well, from there it’s all a matter of how you think.

When you learn new information, restate it to yourself in a manner consistent with how you’re storing information (“January is a Month. January is the first month of the year. January has 31 days. January is in Winter.”)

When you recall information, try to access it by asking yourself questions based on however you stored it. So if someone asks, “What’s the month after March?” and you don’t have that fact memorized, you can break it down into: “What position is March in the year? (Third)” followed by “What’s the fourth month? (April)”
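That question-chaining can be sketched as two dictionary lookups – plain Python here, with hypothetical names standing in for the memorized subjects:

```python
# "What's the month after March?" broken into two relational lookups:
#   1. What position is March in the year?  (name -> order)
#   2. What's the month at the next position?  (order -> name)
MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]
by_name = {name: i + 1 for i, name in enumerate(MONTHS)}
by_order = {i + 1: name for i, name in enumerate(MONTHS)}

def month_after(name):
    # Recall by chaining questions, not by memorizing every one-off fact.
    # (A real version would wrap around for December.)
    return by_order[by_name[name] + 1]

print(month_after("March"))  # April
```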

Of course, this might not work for everyone. This is just something I do to organize information I learn, and it doesn’t work out badly. On the other hand, you might already do this sort of thing subconsciously.

Mind also that there are tons of other ways to store and recall information, some of which work better than this sometimes (like a catchy rhyme for learning a one-off fact, for instance). It’s one of the awesome features of the brain that we can employ multiple different setups to be able to store and recall information.

*-Describing in-depth the power of the relational data structure is well beyond the scope of this article – this article’s just about touching upon the ideas involved, not going into full depth on them.

Thinking About Thinking: How to know what’s worth knowing?

May 17, 2009

Knowledge is power. But it’s power that you need knowledge in order to use – being able to evaluate information sources without bias and being able to select sources to obtain further information from are both learned skills.

Those are really hard skills to learn – time-consuming and chancy to acquire, but ultimately really important. Being able to trust what you know, and being able to learn more things effectively, is vital to being able to do things and get what you want out of life, no matter what that may be (after all, we make all our decisions based on what we know, and if you don’t know what’s going on, well, your decisions could easily be wrong).

So I thought to write down a quick starter’s guide about ‘knowing knowledge’, as it were, something that wouldn’t require too much time and energy but can still get fairly good results.

Anyway:

First, acknowledge the problem: There’s a lot of information out there and you can’t be sure what’s true and what isn’t.

Now, you could investigate and analyze each and every bit of information you get, like some sort of scientist (frankly, not even scientists do that). But let’s be honest, here: Nobody has that sort of time to spend just on going over information. So the trick is to save time and energy while still being able to trust what you know.

Ultimately, it’s all about trust. All information comes from somewhere, and you need to be able to get information from places you can trust. That makes your decisions about who and what you place your trust in, who and what you believe, vital to understanding the world around you.

Because you’re trusting information based on where you get it, you need to remember that all information comes from somewhere. If you have a good friend who relates to you something they heard from someone you don’t trust, should you trust the information just because your friend is saying it?

This ties into something else you should remember: no matter how much you trust someone, they may not be right. This is of importance since you will encounter information that contradicts information you already have, and sometimes this will merit closer inspection.

This inspection is important, since it helps to give strong support behind your decisions to trust or not trust someone or something for information. So ultimately, even though you try to avoid needing to do it all the time, sometimes you have to really research something.

How you research something is important. I recommend something like the scientific method (it can’t be the same, ultimately, since you’re probably already starting with a ‘hypothesis’):

  • Start with the information you have.
  • Forget where the information came from – while you’re testing information, your trust of its source doesn’t matter.
  • Ask yourself, “Is this information disagreeing with information from other sources, or with information that I’m generating myself?” All information comes from somewhere, after all, and you’re just about the most trustworthy source of information you have.
  • See if you can set up a test for the information – something that can allow you to see for yourself if something is true or false. Mind that sometimes tests are only conclusive in one way, and that sometimes multiple tests, each testing the information in a different way, are best.
  • And remember these two things: An argument might mean someone is wrong, but it doesn’t always mean anyone is right. There’s nothing wrong with saying “I don’t know”, judging that you lack the information to say if something is true or false.

If you find that something you learned was wrong, don’t just stop trusting that information source outright. For instance, the information you gained might have been second-hand, and your source conveyed it to you out of trust for their own source. When you find out something you heard was wrong, look into why it was wrong, and how that information made it to you.

When you do find something wrong with information you got directly from a source, it might be a good idea to look into other information from that source as well. Remember, it’s all about evaluating how much trust your information sources deserve.

Now, something to keep in mind: Try to have multiple different sources of information. And remember, all information comes from somewhere – having two sources of information that both come from the same place does not count!

The reason you need multiple sources of information is because you can only tell what information needs to be double-checked when it disagrees with something else. So you might have to go looking for information that disagrees with what you know.

And a last bit of actual, practical advice: Don’t make decisions based on information you only got from one source, unless you really trust it.

That’s it. I’d consider this a ‘beta’ version – there’s no doubt plenty of room for improvement, in readability and probably in the method itself. I just wanted to get down what I thought up at the time.