Wednesday, April 29, 2015

An Unusual Proof

I have argued in many essays* that the free market, far from being a force for "warmongering capitalists" or "exploitative, imperialist entrepreneurs", is actually a force for peace, since those engaged in free and open trade have every incentive to maintain peaceful relationships with the rest of the world**. Obviously, this is not universal. Even in a fully laissez-faire economy, there will be individuals who choose a violent course, or who misunderstand the best path to fulfilling their own interests and think theft or violence provides a valid shortcut. But, overall, the incentives offered by trade tend to encourage pacific behavior***.

The reason I bring this up again is that recent events brought this to mind.

While watching the riots in Baltimore, I started thinking about how not only the riots, but the subsequent curfew, must be hurting the restaurant and bar/nightclub trade in the city, which then led me to think of what other businesses must be suffering. Interestingly, the first that came to mind was drug dealing. Then again, perhaps that is not so strange, since Baltimore is notorious for its "open air drug markets", and it is an open secret that the police, though they will periodically sweep a few drug corners to make the news, generally don't do much to curb these low level drug merchants. What struck me in particular was that, while many of their peers might be looting, the drug dealers themselves were likely to view the riots much as the shopkeepers and other honest citizens did, since the riots not only brought unwanted police attention, but also kept out many of their customers, costing them considerable amounts of money.

Thus, it was not much of a surprise when I heard news stories of various gangs banding together to keep order. I know many conservatives view these claims with skepticism, but to me it makes perfect sense. After all, the gangs make their money by drug dealing. Every day the riots continue, they lose a small fortune, as well as the loyalty of those who relied upon drug money for a living. In short, because they rely upon buying and selling, the gangs too want a peaceful city (for the most part), in order to keep their income flowing.

And if there is any better endorsement of the pacifying ability of the free market, I can't think of it. That it can take several hostile gangs and make them come together as supporters of law and order seems to me the best proof of the way the free market encourages peaceful coexistence.


* See "An End to War", "The Road to Violence", "The War of All Against All", "Greed Versus Evil", and "Competition" among others.

** In contrast, even small interventions, such as protectionist measures or currency manipulation, tend to create hostility between nations, while inflation, government patronage and privileges create hostility within a nation. See "Inflation and Uncertainty", "Perverting Self Interest", "Free Trade, Employment, Outsourcing, and Protectionism", "Cheap Lighters, Overseas Dumping and Monopolies", "Computer Games, Immigration and Protectionism", "Government Funding and the Creation of Strife", "Missionary Zeal and Human Discord", "How the Government Corrupts Relationships", "Hard Cases Make Bad Law", "The Consequences of Bad Laws" and "Why Freedom Is Essential".

*** An interesting case study could be made of the medieval and renaissance trading cities of Italy (Venice, Genoa, Amalfi, Pisa, etc). In larger terms, they fit this pattern quite well, generally opposing wars, even initially opposing the crusades, as they would disrupt trade. On the other hand, on a smaller scale, they often warred with one another, but mostly due to the protectionist policies pursued by the cities, and the tendency of medieval rulers to grant monopolies to single trading partners. Stay tuned for a future essay on this topic.



Some will doubtless point to the gangs' normally violent interactions as a counter-argument, but I would disagree. The violence within the drug market exists because it is not entirely a free market. To be a free market, it would have to have legal mechanisms for settling disputes and be recognized as a legal venture. Because drugs are illegal, disputes must be settled violently, and market competition is waged not only through price wars, but also with bullets. The same was once true of alcohol sellers, but, once alcohol was legal again, the market lost its violent character. Thus, the violence of drug dealers is not an argument against my thesis, but actually supports it.

Sunday, April 26, 2015

A Few Thoughts on Charter Schools

I recently read a few articles that mentioned mismanaged charter schools -- the one I recall specifically being a school in Florida based on Scientology "study tech" which did worse on the FCAT than a school for students at risk of dropping out -- and, reading the subsequent comments, it seems the failure of these charter schools is inevitably used as an argument against privatization1. It seems as inevitable as every warm day being used as proof of global warming2 that every failure of a charter school is seen as proof of the foolishness of privatization. Of course, being as strongly in favor of eliminating all public education as I am3 -- along with all other aspects of government beyond protecting life, liberty and property4 -- I feel I must rebut these arguments, although in the process I must also show why (1) charter schools are not the same as true privatization, (2) such "privatization" schemes as vouchers and charter schools, or the nominal privatization of social security5, can create new problems and (3) true privatization would be quite beneficial.

I suppose first, to be completely accurate, I must point out the factual problems with both sides of the privatization argument. The conservative love affair with charter schools, as well as the school board fascination with them, came about because a number of charter schools took funding far inferior to that given to ordinary public schools and produced miraculous results. In many cases, working with the normal cross section of local students and relatively meager funding, they produced results comparable to private schools with more selective admission standards6. There are many ways in which this was accomplished, and even the experts do not agree what allowed the success stories to take place -- flexible content, giving teachers more freedom, parental involvement, reduced union involvement, reduced school board interference, smaller classes, and so on -- nor does it matter for our purposes. What matters is that there were numerous very successful charter schools, and many continue to be successful. And so it is nonsensical to argue "privatization" fails in education, at least on the grounds of some bad charter schools, as a number of others continue to succeed.

But that leads to the other side of the argument: it is equally foolish to argue that charter schools are uniformly successful, or that privatization is a panacea. As we have seen from the articles I read, and the arguments against privatization based on them, privatization/charter schools can lead to disastrous results as easily as wonderful ones. Once the state has thrown open the faucet of government funds, it seems almost inevitable that many who have nothing to offer but empty promises or fraudulent schemes will seek to drink from it. And so, just as I argued in the case of vouchers7, simply funding any and all comers is a recipe for disaster. Even with some sort of loose guidelines, as was the case with most charter school programs, it is simply impossible to create a program in such a way that it reliably excludes all potential frauds, as well as incompetent or ineffective schools.

So, why do charter schools succeed, when they succeed? And why do they fail when they fail? And why do I argue that neither outcome tells us much about privatization? Or, to put it in more practical terms, why do I argue that charter schools cannot show us a true path to public education reform? In short, why can we have success in some charter schools, yet find it impossible to replicate those results system-wide?

Actually, two of those questions can be answered at once, as they relate to one another. The reason that charter schools sometimes -- perhaps in the majority of cases -- succeed is explained by the same mechanism that tells us why they can never become a universal solution. Charter schools enjoy, in a massively bureaucratic and standardized system, the status of being an exception. It is the same reason that "magnet schools" in many school systems enjoy similar success, or why a few traditionally rigorous and prestigious schools in certain districts continue to thrive8. In both cases, charter or magnet, because the schools were considered experiments, exceptions, they were allowed to deviate from the rigid formula forced upon the rest of the public school system.

In some cases, this included the ability to select the best and brightest students, which clearly gave such programs an edge. In others, entry was open to any who sought it, but such a system still self-selects for more ambitious students, or more involved parents with higher aspirations for their children. But even when enrollment was not so biased toward those with various academic advantages -- where students were assigned by lottery, or based upon their residence -- there was still a tendency for many of the charter schools to outperform their more mundane public school peers. And that is attributable -- as are, ironically, the somewhat less numerous charter school failures -- to nothing but their status as an exception.

To begin, let us look at one fact, which is rarely stated, and runs contrary to almost all of the assumptions behind public school curriculum. That is the simple fact that education is not a precise, formulaic process. It cannot be carried out by rote while following a rigid schedule of reading from a fixed script. Were that the case, anyone could become an excellent teacher by simply finding a script written by an expert, and schools could hire anyone to teach once they found that successful script for each subject. Similarly, were education a mechanistic, deterministic process, then, excepting the variations due to the quality of students, every class which followed the same lesson plan would produce the same outcomes. In short, education would be a predictable process producing easily predetermined results9.

Unfortunately, such a belief in deterministic education lies at the heart of public education.

That is not to say that anyone in the system truly believes it. The administrators, school board, principals, curriculum experts, teachers -- no one really thinks that a predetermined script, rigidly followed by each teacher, will produce identical results. (Or, perhaps, to be more accurate, I should say very few truly believe it, as I am sure there are a few true believers out there.) But as I have said before, schools are run on bureaucratic rules10, and for a bureaucrat the primary consideration, in all things, is to avoid anything which might produce negative attention11. The moment one group gets something another does not, there are grounds for complaint, for protests, for lawsuits, and that is the greatest fear of most low and mid level bureaucrats. And thus, above all else, the schools must ensure that education is uniform, that across whatever tier they control12, all students receive as uniform an experience as possible.

We can see this most notably in those areas where the schools actually allow for some nominal variation -- for example, in classes for the gifted, or advanced placement, or whatever other special "accelerated" classes exist. In a few cases, especially when such programs are new, or limited to a very small pool of students, they tend to work the same way charter schools do: they accept the best students, attract the teachers most inclined to innovation, and follow flexible lesson plans, and thus produce generally better results.

Until, that is, they become successful. At which time, they become more widespread, draw in more students, require more teachers, and, being a larger part of the school system, are absorbed by the desire to standardize. Part of this, of course, is the general desire to produce generic, predictable education, the same impulse that produces excessive regularity throughout the school programs, but here there is another influence at work. Once these programs begin to produce exceptional results, the impulse is to use them as a template, to show other teachers how to get the same results. Thus, rather than allowing other programs the flexibility that actually produced those results, school boards tend to take a snapshot of the curriculum at one point in time, enshrine it forever as the formula for success, and pass it along to less innovative, less inspired (and less inspiring) teachers, producing a cookie cutter version of what they think "gifted" classes should be. The inevitable result is performance no better than the standard, standardized curriculum13.

And that is one of the potential ways in which charter schools could be absorbed, and made worthless, were they to be implemented on a wide-scale basis. However, there is another, far more likely fate awaiting a broader use of charter schools, and, sadly, it follows the same pattern I predicted for a general use of vouchers in my essays "Why Vouchers are not the Answer" and "You Don't Drown in a Glass of Water - Vouchers Revisited".

Were the school board truly to decentralize, to allow each school to follow its own course, or allow each district to create its own school program, it might seem, at first, an outcome which I would support. After all, I constantly talk of the benefits of distributed authority, of federalism, of keeping power as local as possible14, so why wouldn't there be considerable benefits to establishing what amounts to charter schools in each subregion, or even each school? Would that not end the many harms I attribute to excessive standardization?

At first, perhaps there would be some small benefit. Admittedly, ending the lock step regularity of lessons many districts impose would allow for teachers to tailor classes to their students, and may allow for some improvements. On the other hand, given that many teachers have accepted a regime which denied them any opportunity to innovate, it is questionable how much novelty many of the existing teachers would bring. (Not to mention the possibility of variations between schools being seen as a basis for suits or at least political complaints, pushing schools toward reestablishing a more standard regime.) But, assuming that the schools manage to avoid the pressures to return to rigid conformity and the teachers rise to the challenge of creating a unique school program, I agree there is a potential for improvement.

However, the problem still remains that schools are, within their region, effective monopolies, supported without reference to outcomes, with no competitors, and so, even worse than with vouchers, there exists the very real potential for terrible outcomes, mostly due to a lack of accountability.

The main problem here is that there are few ways to measure the outcomes of education. Testing works to measure some specific results (whether students know specific information, can perform specific tasks), but tests do have the downside that, when given too much emphasis, they tend to encourage "teaching to the test". That is, teachers tend to drill on those things that will help pass tests, while ignoring not only any facts not on the test, but also many higher reasoning abilities, which tests have a hard time identifying. Thus, many schools which test well still have students who perform poorly in higher education, or in real world situations, as they are skilled at producing specific facts, or applying a set of specific algorithms, but often do not know why to apply them, or when, much less comprehending why those algorithms work and so on.

And, of course, if school districts are serious about implementing charter schools, or allowing some other form of autonomy, then such testing is likely to raise serious objections, not the least being, quite rightly, that extensive standardized tests are, in effect, a means of covertly imposing a specific curriculum, as the list of facts tested will determine at least some part of the subject matter, and thus, the more comprehensive the tests, the less room there is for variation in subject matter taught. And so, any effort to evaluate through testing is likely to run into objections from those who seriously support variable content and method, or, on the other hand, will put a damper on innovation, and thus eliminate some part of the benefits that variation is supposed to bring.

Other methods of evaluation suffer from even worse problems. For example, the use of grade distribution suffers from one serious drawback: the assignment of grades is in the hands of those being evaluated, which presents them with a quandary. Do they engage in "grade inflation" to make their work appear successful? Or grade honestly and appear to be doing worse than those who do choose to inflate? Not to mention that the schools themselves may produce differing results simply due to the rigor with which they evaluate their students. For example, a "B" from MIT is often taken to imply a greater understanding than an "A" from a community college. (Whether or not that impression is fair.) Similarly, schools which choose to challenge students may produce overall lower grades than those which do not. And thus, using grades to evaluate schools is a foolish venture.

An alternative to using grades sometimes proposed is to use statistics from the students' subsequent careers, such as admissions to colleges, or their performance at college. But these too present many issues. First, and most notably, they are terribly lagging indicators, especially if we use multiple years of grades. By the time we have evaluated the students, they have been out of the school being evaluated for at least one year, often as many as four. Even if the school being evaluated covers only three grades, that would mean the performance of the 10th grade is being evaluated three to six years after its students have left, by which time both teacher and curriculum have often changed one or more times. Not to mention that it is difficult to tell how much praise or blame one should assign to any specific grade, class or teacher. After all, each student likely had multiple teachers during his stay in school, yet there is but a single outcome, making it hard to determine which teacher played which part in producing that outcome15.

But even if we allow for this shortcoming, there are problems using, for example, admissions. The first of these is how to judge what counts as a success. For example, if none of the students apply for admission, what does that mean? How does that compare to a class where all applied to ivy league schools but only half were accepted? And what do we do when one school has students who applied to only one or two schools and failed to be admitted, while at another the students regularly applied to many schools and were thus admitted in greater numbers? And how do we judge the relative merit of the admissions? If one school has half as many students admitted to college, but at schools generally considered more competitive, how do those compare? And what of cases where students do not apply upon graduation, but are admitted to colleges later, after a semester or a year outside of school? As you can see, college admission figures may make for good material in a school brochure, and may even give a general impression of a school's performance, but they provide a very poor means for comparing schools, much less determining which are succeeding and which are failing.

The same is true of college performance. (Or, for lower schools, performance in high school.) Not only does it present the same problems as admissions, such as how to count those who do not enroll, or delay enrollment, it also combines the problems mentioned above with using grades as a measure. After all, universities are not uniform, any more than secondary or elementary schools, and thus an A at one is not the same as an A at another. And if even comparing the same grades is problematic, how much more so comparing different grades. How does an A at Rutgers compare to a B at Cornell? And when we mix in the many different majors, and the wide variety of courses, the question becomes almost meaningless. Is a student with a C in Intermediate Geology at one university doing better or worse than another with a B in Romantic Poetry at another?

I mention all of this, and make such a big deal about it, because one of the most significant problems in moving from a centralized, uniform system of education to one with variation from school to school, is that there is a need to determine which of those variations is working. In a free market, this is not a problem. Schools are paid based upon the satisfaction of those buying their services, and those which succeed earn money, those which fail don't and must close.

Public education undermines this model. The success or failure of a program is determined by the school board, whose performance is evaluated by voters during periodic elections. Obviously, this does not allow for as fine-tuned control as the free market, as one school board sets the agenda for an entire district, but since education is uniform across that district, it still provides some sort of evaluation and control.

However, if each school is free to do whatever it wants, then the school board is in a bit of a bad situation16. They are still the public face of the schools, the only means for parents to change the content of their children's education. And yet, without a way to measure the success or failure of each school, the board cannot tell what is or is not working, and may be taken to task for failures they never knew existed. After all, freedom in schools, as with all freedom, means not just the freedom to make good innovations, but also the freedom to make mistakes. The problem being that the school board may be blamed for mistakes which result from the mistaken beliefs, poor implementation, or even incompetence, of local schools.

Actually, let me make this even more plain, as the problem here is simple. The difference between private schools and a district full of charter schools is the difference between wage earners and teenagers with their parents' credit cards. Private schools and charter schools are both free to act as they wish, but private schools are also answerable to parents, while, under a universal charter system, charter schools would be accountable to no one. Thus, where private schools would need to listen to parent input, deciding in each case whether they should accept a given criticism, or whether the point is important enough to other parents to sacrifice the tuition of that one student, charter schools will be paid regardless, and can pursue whatever ideas, novelties, madness, whatever, they choose, as they are doing it on someone else's dime, and that dime will get replenished no matter what they do.

Which is why, in the end, I doubt we will ever see a universal movement to charter schools, or, if we do, it will be a token movement, akin to many in-name-only deregulations, with charter schools being covertly regulated as much as the rest of the school district. And I honestly can't see any alternatives. Just as I worry vouchers will mean little more than turning voucher schools into a new tier of public education, universal charter schools, without some strong regulation, would be little more than chaos, a sink for government money funding whatever idea strikes the fancy of a local administrator. And, in the end, such an outcome could not last long, as it would be the death knell of the politicians responsible. And so, over time, charter schools would either vanish, or be drawn back into the regulated fold. Either way, I doubt we will ever see public education moving far from its present, regulated and rather rigid state.


1. The S&L crisis was similarly misused. See "How to Blame the Free Market", "Government Quackery", "Perverting Self Interest" and "Greed Versus Evil".

2. See "Global Warming Watch, Again", "Global Warming Watch" and "Odds and Ends".

3. See "Never Ascribe To Evil, A Discussion of Education", "Why Vouchers are not the Answer", "You Don't Drown in a Glass of Water - Vouchers Revisited", "Reforming Education", "Non-Governmental Communal Solutions", "The Dishonesty of Transportation Spending" and "The Glory of Eisenhower?".

4. See "My Vision of Government", "My Vision of Government Part II", "The Case for Small Government", "Minimal Reforms", "The Political Spectrum", "Deceptive Spectra", "A True Conservative Platform", "The War of All Against All" and "Collective Ventures Versus Government".

5. See "Social Security is Not Insurance", "Selling Yourself Cheap", "Minimal Reforms" and "Not Quite True".

6. Not quite equal, perhaps, but still far superior to the performance of public schools teaching the same selection of students. Of course, when we consider the money given to many public schools, many private schools do much the same, since their tuition -- though appearing to be higher than public schools' -- actually produces per student funding less than many public schools. (When I attended private school in the 1980s, Montgomery County Maryland paid twice as much per student as my full tuition payment. Granted, that was the highest funding in Maryland, but there was not one county spending less per student than my tuition, and the majority spent a good deal more, most around 150% of my full tuition payment. How much made it to the students, and how much was eaten up in administration, salaries, overhead and so on is a good question, but the point remains that Maryland could have saved money by sending every student to an "expensive" private school.)

7. See "Why Vouchers are not the Answer".

8. In Baltimore City, Poly and City both enjoy, or at least did at one time (I am no longer current in my knowledge of city schools), a reputation for higher academic standards and consequently better student performance. I have heard similar stories in various cities and counties. In some cases they were schools in affluent areas -- where many attributed success (often wrongly) to better funding and more involved families -- but in many other cases, they were in regions every bit as destitute and dysfunctional as those supporting far inferior schools.

9. The government still has a nostalgia for late 19th century positivist thought, the same philosophy which had experts measuring every movement made by assembly line workers, hoping they could create a mechanistic and repeatable formula to optimize production. I saw this myself when working for a company seeking CMM certification to help in obtaining government contracts. The CMM assumes, quite wrongly, that creative programming work can be made formulaic in the same way an assembly line can, and as a result, it so heavily bureaucratizes the process that only top-heavy, overpaid government contractors can truly implement it. But that is a topic for another essay.

10. See "Reforming Education", "Never Ascribe To Evil, A Discussion of Education", "Why Vouchers are not the Answer", "You Don't Drown in a Glass of Water - Vouchers Revisited", "Collective Action and Government", "Some Thoughts on 'Summerhill'", "Asking the Wrong Question", "In Defense of Zero Tolerance, or, An Examination of Law, Common Sense and Consistency", "'...Then Who Would Do it?'", "De Gustibus Non Disputandum Est", "Big Government, Arrogance and Part-Time Psychopathy" and "Best Practices and Resistance to Change, Bureaucracy and the Free Market".

11. See "The Inevitability of Bureaucratic Management in Government Enterprises", "Organizations as Filters", "Bureaucracy and Arbitrary Power", "Grow or Die, The Inevitable Expansion of Everything", "Fear Driven Enterprises", "Adaptability and Government", "Best Practices and Resistance to Change, Bureaucracy and the Free Market", "Bureaucratic Management and Self-Policing", "Killing the Railroads", "Bureaucratic Management", "The Bureaucratic Mind", "Bureaucracy Revisited" and "The Wrong Solution to Bureaucracy".

12. In general this means that a given county or incorporated city will have a uniform curriculum. State schools boards try to enforce state uniformity as well, but counties seem to have enough independence in most states to resist state domination, meaning there is state-wide uniformity only in a few select areas the state considers most important. Likewise, federal education programs have even less power, and thus federal uniformity is mostly expressed in uniform testing, though that does tend to produce some nation-wide uniformity in terms of preparation for those tests.

13. Back during my very brief time in public high school (I also attended a public kindergarten), we joked the difference between AP History and standard History was that you had to type your papers, and AP students had to use exact dates. Sadly, that was not too far off. The AP classes used the same textbooks, followed the same schedules, even seemed to have the same questions on exams, which makes it difficult to tell exactly what AP meant.

14. See "The Benefits of Federalism", "The Case for Small Government", "Of Ants and Men", "Why Freedom is Essential", "Single Point of Failure and the FairTax", "An On Demand World", "A Quick Question", "The Era of the Cocky Know It All", "Redundancy as a Protective Measure", "The Importance of Error", "Adaptability and Government", "Inflexibility and Bureaucracy" and "Skewed Perspective, or, How Big Government Becomes Inevitable".

15. This also points out a problem I think plagues all school evaluation schemes, that students have a role in their outcomes as well, and no amount of aggregating data will completely hide that. I grant, a good teacher can often inspire students, but every student is still an individual with free will, and can choose to apply himself or not, regardless of how well a teacher does. And, at times, a given class may, by pure chance, be more or less ambitious than others. Even over multiple years, it is possible for some schools to simply get a run of bad luck. And when we figure in environmental factors, such as parental apathy, the local attitude toward education or hard work, and so on, it is easy for some specific schools to have a harder time than others. Thus, even with the best teachers, and the best plans, outcomes can still fall short. (And, conversely, some schools can do very little, and still have successful students, mostly because of active parents and outside supplements to education.) Patterns of good or bad outcomes may, in some cases, tell us little or nothing about the actual performance of the school.

16. In some ways, it is analogous to the situation discussed in "Chaotic Government" and "Follow Up on 'Chaotic Government'".

Odds and Ends (April 26, 2015)

Well, as of last week, I finally wrapped up the two largest projects at work that had been taking up all my time and keeping me from writing, so, obviously, I immediately fell rather ill for the remainder of this weekend, only now feeling vaguely well enough to do anything other than stare at the television or sleep. It never fails, as soon as I have no more work to do, I get sick. Well, at least it waited until everything was wrapped up with work, and has been kind enough to only last a few days. (Though whether recovering just in time to work again is a blessing or curse is up to you to decide.)

In any case, now that work is much less pressing, and I am, with luck, going to feel better shortly, I hope to finally get around to finishing the many half-completed essays waiting for my attention. So, please check back frequently this week, as I hope to post new material with some regularity.

Friday, April 17, 2015

The Problem with Common Sense Solutions

I have written extensively on the problems of pragmatism and common sense*, usually predicating the argument upon the simple fact that what one person sees as common sense is often not so evident to others. After all, if an answer were universally self evident, there would be no reason to argue for it. No one has to tell us to breathe or to drink fluids, but if someone has to tell you some political solution is self evident or common sense, that tells me the answer is only so obvious to the person making the argument.

But I have made that case so many times, I often forget there are other issues with supposed common sense solutions. For example, the problem I discussed in "The Problem of the Small Picture", that many supposedly common sense solutions only seem obvious, or appear as "win-win" answers, because the person examining them is unconsciously ignoring a host of costs and side effects, factors which would make the common sense answer look much less appealing were they included.

One of the best examples I have heard is the rebuttal to the argument used to prove supposed negligence for the makers of the Pinto (as well as a number of other small economy cars). The case against the Pinto was established as follows: it was known that under rare circumstances, in this case a full speed rear impact collision, the car might explode. The explosion could be made much less likely by the inclusion of a relatively inexpensive part. However, the manufacturers figured that such collisions are rare events, and so, even if they were sued for every such accident, it would be cheaper to settle than add the part.

Many people dismiss this as heartless, or placing money ahead of lives, but the truth is, to make that argument, you must ignore a number of realities. You see, full speed rear end collisions are pretty rare, so there simply will not be a lot of such accidents. However, there are an almost endless number of such rare potential accidents, and all of them could probably be ameliorated by adding some part, in many cases probably an equally cheap piece. But, once you start thinking in that way, that any part which can save a life should be included -- or even should be included only if it costs less than $X -- you have suddenly started building an economy car that costs as much as a Rolls Royce, and weighs enough to bring mileage down to 10 mpg once more.

And that is what these common sense solutions ignored: the decision not to include the specific part was made because the cost, no matter how minimal, was not justified by the rarity of the problem. Yes, in the end, this left subcompacts less safe than, say, big steel luxury cars, but that is inherent in the design. If you want a cheap car with good gas mileage, you are going to sacrifice some safety. Anyone who does not recognize that is delusional. To then sue the makers for not including parts to compensate for every conceivable potential accident is foolish. If they did so, economy cars would become as costly and heavy as luxury cars, making them no longer desirable for their appointed role.
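The manufacturer's reasoning above is, at bottom, an expected-value comparison. A minimal sketch of that arithmetic, using entirely hypothetical numbers (none of the figures below come from the actual Pinto case), might look like this:

```python
# Hypothetical expected-value comparison: install a safety part in every
# car, or settle the rare lawsuits that arise without it.
# All numbers are invented for illustration only.

cars_sold = 1_000_000          # units produced
part_cost = 11.0               # cost of the part, per car
accident_prob = 1e-5           # chance a given car suffers the rare crash
settlement = 200_000.0         # average payout per such accident

cost_of_adding_part = cars_sold * part_cost          # 11,000,000
expected_settlements = cars_sold * accident_prob * settlement  # ~2,000,000

# Under these assumptions, settling is the cheaper option -- and repeating
# this calculation for hundreds of equally rare failure modes shows how
# "just add the part" quickly prices an economy car out of existence.
print(cost_of_adding_part, expected_settlements)
```

The point of the sketch is not the particular numbers, but that the same comparison must be run for every rare failure mode, and the part wins only when the expected payouts exceed the fleet-wide cost of installing it.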

Perhaps another example would help.

When discussing medical insurance, many times people gripe about the absurdity of covering treatments while not paying for prevention. For example, insurance will pay to amputate a diabetic's feet, but not to buy the shoes that would prevent the need to do so. That is but one example, but it is one I heard often enough that it immediately came to mind. Of course, it is not entirely accurate, nor does it tell the entire tale. For example, in a few cases, where individuals suffer from a disease with well known degenerative properties, insurers will pay for certain, specific preventative measures, such as physical therapy, to prevent the worsening of the condition. But, then again, for the most part, the complaint is accurate. Outside of a handful of specific preventative treatments for specific existing ailments, insurers do not pay for preventative measures.

The argument offered against such a position is one that often comes across as common sense, presenting what seems to many a very sensible argument. According to advocates of covering preventative measures, it often would cost only tens of dollars, maybe hundreds, to prevent an ailment, while, on the other hand, treating that same illness can cost thousands, or tens of thousands, at the same time reducing the quality of life of the insured. In other words, not only does preventative care improve the quality of life of individuals, but it saves money for the insurer. According to these advocates, it makes no sense to oppose preventative measures, and only nonsensical bureaucracy, or an old fashioned attachment to the traditional way of doing things, prevents insurers from following this new paradigm, saving them money.

That last argument should tell listeners that the case is not quite as it seems, as, no matter how traditional and stodgy insurers may seem, they are still businesses, still answer to stockholders and compete to produce better returns than their rivals. And so, if there were truly an answer that saved money, at least one firm would adopt that practice, producing better returns, and likely inspiring its fellows to do the same**. If businesses are avoiding a practice that some claim will save them money, then there must be a very good reason, and usually that reason is, quite simply, that the savings claimed do not exist, or are offset by some other costs.

In this case, the answer is actually pretty obvious, once you stop viewing the problem with the myopia of those enamored with preventative care. The problem is the difference between treatment and prevention, a difference that the proponents of prevention gloss over in making their case.

We often hear, concerning treatment, that new technologies and new medicines are making treatment ever more costly, introducing countless new methods of treating a given ailment, and increasing costs tremendously. And insurers, for their part, do their best to avoid some of this cost escalation, mostly by denying coverage for any experimental treatments, and enacting other policies to favor the best combination of cost and effectiveness in treatments. However, even without such measures, there is still something about treatment that makes it attractive to insurers, and that is that treatment is inherently limited. No matter how many possible therapies there are, no matter how many new ideas come out about treatment, there is still an inherent endpoint to treatment: either you get better, or you die. (Or, in some cases, you reach a point where, while still alive, there are no more treatment options available, though this is becoming less common as more and more treatments become accepted.) Thus, even if they allowed patients to indulge in whatever treatment they wished, there would still be an inherent cap upon treatment, and it would be possible for insurers to predict the likely cost of any given ailment for any individual.

On the other hand, prevention is an open ended situation, and in more than one sense. For example, there is no limit to the number of possible preventative treatments you could pursue for any given condition. Unlike treatment, where the eventual success of one therapy will foreclose spending on others, nothing prevents patients from undertaking multiple, simultaneous prevention therapies for the same illness.

But it does not end there. Prevention is also open ended in terms of time, as, unlike treatment, which ends with cure or death, prevention has no endpoint, no ultimate success. Until the patient eventually dies of some other cause, the patient can continue to undertake preventative measures for a given ailment for as long as he wishes***. So long as he lives and remains insured, he can continue to spend money on prevention, in the form of multiple therapies intended to prevent a given condition.

Which brings us to the third form of open endedness in preventative measures, the number of ailments against which one can act. Treatment is limited in this regard too. As the terminal events of cure or death provide a cap upon the number of different treatments and the duration of treatment, the need to actually be suffering from an ailment provides a cap upon the number of ailments for which one might be treated****. This is not the case with prevention. Since prevention is intended to prevent the onset of a given condition, there is no logical limit to the number of ailments against which one could take preventative measures.

Doubtless, were they inclined to cover preventative measures, insurers would take the obvious step of limiting preventative treatment to ailments one is likely to contract, as well as treatments which show a preventative benefit proportionate to their cost. However, even were such measures imposed, there would still be a far too open ended market. Even among those diseases which are probable for a given individual, the list is still quite long, and the number of potential preventative measures is equally long. So, even with some sensible limits in place, the insurers would still be facing nearly unlimited spending in the name of prevention. Why, if you really tried, almost everything you do could be covered under the rubric of prevention. Diet and exercise are claimed to be beneficial in preventing a host of problems, so why not have insurance pay for all of one's food and exercise? Similarly, stress is blamed for a host of ills, so why not cover anything that relieves it, from games to vacations to new bedding and a nice wine cellar? After all, they too are preventative measures. In fact, it is hard to think of any aspect of life which could not, with just a little creativity, be turned into a means of preventing disease.

Which brings us back to something I mentioned earlier, the handful of exceptions, those cases in which insurers are willing to pay for measures which we would normally call preventative. For example, physical therapy for those with back problems, intended to prevent the condition worsening. In these few cases, we can see why insurance might cover prevention, and also why it would have problems covering prevention in a more general sense. Here, a few conditions make prevention a sensible expenditure. First, the individual is diagnosed with a condition which not only makes the future ailment probable, but also makes the probability knowable. Second, the therapy is well established, specific and has a definite cost. Not to mention that its effectiveness in prevention is fairly well known. Given all of these factors, the insurer can figure how much the therapy will cost, even if carried out for the patient's entire life, versus the expenses of the illness it is preventing. This allows for a well reasoned evaluation of the cost and benefit.
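The calculation an insurer can run in this narrow case can be sketched in a few lines. Every figure below is hypothetical, invented purely to show why the comparison is tractable here and not in the general case: each quantity is bounded and knowable.

```python
# Hypothetical: is covering one specific, well-defined preventative therapy
# (say, physical therapy for a diagnosed back condition) worth it?
# All numbers are invented for illustration only.

years_remaining = 30             # expected years the insured stays covered
therapy_cost_per_year = 1_200.0  # fixed, well-established therapy cost
prob_worsens_without = 0.60      # chance the condition worsens, no therapy
prob_worsens_with = 0.20         # chance it worsens despite the therapy
illness_cost = 150_000.0         # cost of treating the worsened condition

# Both sides of the comparison are finite and computable in advance.
lifetime_therapy_cost = years_remaining * therapy_cost_per_year  # 36,000
expected_savings = (prob_worsens_without - prob_worsens_with) * illness_cost  # ~60,000

# Coverage makes sense only when expected savings exceed the bounded cost.
print(lifetime_therapy_cost, expected_savings)
```

Contrast this with general prevention as described above: with no diagnosis fixing the probability, no single therapy fixing the cost, and no endpoint fixing the duration, none of the variables on the left-hand side of this comparison can even be filled in.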

On the other hand, general prevention, with countless treatments, a never ending time line, an infinite range of potential ailments to prevent, presents no means for reasonably calculating the relative costs and benefits. It is, in the end, simply an endless potential drain of resources, which is why insurers are perfectly sensible in avoiding that sort of endeavor, and why preventative care is such a terrible example of a common sense solution. Or, to put it differently, why it is such a perfect example of the shortcomings of common sense.


* See "The Lunacy of 'Common Sense'", "'Seems About Right', Another Lesson in Common Sense and Its Futility", "A Look at Common Sense", "Res Ipsa Loquitur", "The Shortcomings of Pragmatism", "Pragmatism Revisited", "Pragmatism Revisited, Again", "The Plural of Anecdote is Not Data", "Rules of Grammar and Pragmatism", "The Problem of the Small Picture", "Keyhole Thinking", "Impractical Pragmatists", "In Defense of Zero Tolerance, or, An Examination of Law, Common Sense and Consistency", "No Dividing Line", "The Consequences of Bad Laws", "Questions of Law and Questions of Fact", "The Rarity of 'Common Sense'" and "Common Sense, Philosopher Kings, Arbitrary Law and Dictatorship".

** Of course, there is one case in which this would not happen, and that is a possibility when it comes to health insurance. If the government mandates specific coverage, or prohibits others, then insurers may not be able to act in their own self interest. Likewise, if government involvement causes distortions in the market -- e.g., how DRGs in government insurance have altered how private insurers handle coverage -- private firms may not act sensibly. But, if preventative care saved so much money, it would seem certain that insurers would find a way to implement preventative care and save themselves considerable money.

*** In a way this is a problem with our entire health insurance model. As I discussed in "Redefining Insurance... To Actually BE Insurance", true insurance would pay only for uncommon, unwelcome conditions, such as contracting a disease or suffering an injury. Once insurance begins paying for routine, predictable events, such as routine checkups or preventative measures, it becomes an open ended, endless stream of payments. Since the trigger for payment is not something so unwelcome that the insured would never seek to bring it about (as with, say, fire insurance or life insurance), but rather something routine and innocuous, there is no limit on the number of times the insured may choose to use these services.

**** This is also why insurers are reluctant to accept trendy, ill defined diseases which are so often diagnosed based on impressionistic criteria impossible to confirm. (E.g., Multiple Chemical Sensitivity.) In some cases, there is no evidence these disorders even exist, but even when they do, certain ailments are used by quacks to explain every problem. The reason I mention these is that such "free form" diagnoses basically allow treatment to become more like prevention, with no obvious criteria for either onset or cure, allowing open ended, endless treatment. (This is also why, for a long time, psychological disorders were either excluded or seriously limited, and even today are not viewed the same way as physical ailments. Whatever one's view on mental illnesses, the definitions of many are sufficiently nebulous and impressionistic that it is easy for an individual to accumulate a wealth of diagnoses, and equally difficult to ever determine if one is cured. Which is, in truth, part of my objection to modern views on mental health. See "Mental Illness".)

Wednesday, April 15, 2015

The Film School Cop Out and the Excluded Middle

I have written before about common logical errors. ("Gardasil and Logical Errors") I think it is a useful topic, even on a politico-economic blog, as those types of errors often underlie our most common mistakes. For example, the error I am discussing here, that of the excluded middle, or center, is characterized by pretending that two positions represent the whole range of choices, that is, that one must choose A or B, there is no other option. Sound familiar? How about all those FairTax advocates who responded to criticism by asking "so you want to keep the current system?", as if the choice were the FairTax or inertia, ignoring entirely the existence of any other plan, much less the possibility of reforming the current system without adopting another. Or, to switch to the other side of the aisle, how about those who oppose any changes to the current welfare system by arguing the choice is between welfare and people starving in the streets, completely ignoring, just for a start, the possibility of eliminating federal welfare and leaving it to the states, or the possibility of private charity filling in, or, the big one so often overlooked, the fact that ending welfare may motivate many who are technically disabled but still capable of work to return to the labor force. (See "Peanut Butter and Disability") I could go on, but by now the purpose of my discussions of logic should be clear: those who understand logic to some degree are less likely to fall into the most common errors, or to be swayed by emotional, but irrational, arguments. (See "Government by Emotion", "Selfishness as Reason - "Wants", "Needs", "Fairness" and Other Guises for Arbitrary Decisions".)

All of this came to mind today while reading movie reviews, when I came across a discussion of the recent tendency to praise films with ambiguous endings for supposedly allowing us to use our imaginations. A number of critics argued against such a position, putting forth that, while it may be effective in a few films to leave the ending ill-defined, most modern films using this pattern tend to show signs of laziness more than brilliance, with the ambiguity serving only to disguise their inability to come up with a decent plot resolution*.

This is where the excluded middle rears its ugly head. Rather than agree that ambiguity is a tool which can be used well or poorly, many who want to defend vague films instead respond to any critic by accusing them of "wanting to be spoon fed endings". Which is, of course, an absolutely absurd response, assuming that the only possible way films can be made is either a brilliantly vague ending or an overly simplistic and concrete one. It ignores entirely the possibility of bad and foolish vague endings, as well as well defined but brilliant resolutions, not to mention all the shadings in between, varying degrees of resolution and ambiguity.

In its form, this argument reminded me more than most of the common political uses of this sort of idiocy. For example, when the Ron Paul supporters were more active, arguing against, essentially, any use of the military except (perhaps) the defense of the territorial borders of the US. Whenever one dared to raise a criticism, the immediate response was a twofold use of this specific mistake. First, to accuse the critic of being a socialist, which the Paulists turned into their all-purpose slander, and second, to accuse any critic of empire building. Both, as should be obvious, were absurdly reductionist examples of the excluded middle. First, assuming one must either be a Paul booster or a socialist, with no other options possible. Second, assuming anyone who argued for military action that was not a response to direct invasion was a miniature Napoleon. Not only were they perfect examples of this logical error, but, as with the film example above, they both also show the most common outcome of the use of this logical fallacy, the excessively reductionist tendency to paint one's critics as the epitome of evil, that is, to make debate, much less persuasion, impossible.

And that is, in the end, why I am bothering to write again. I have written many times that I believe one of our worst tendencies is to see those who hold different political views as rivals, enemies, even the embodiment of evil. ("The Futility of Blame", "The Inverse of Empathy", "Inconsistent Understanding", "Missionary Zeal and Human Discord", "Should I Laugh or Cry?", "Intellect and Politics", "Ritual Abuse, Backwards Logic and Conspiracy Theories", "The Path of Least Resistance", "Prelude to a Future Essay on Heroic Ethics and Romanticism") Not only does it make for an overly hostile political environment, but it dissuades us from engaging in the most productive political activities, persuasion and debate. And, because the excluded middle so often feeds into this mistake, I feel especially obligated to point out its shortcomings. Not that I expect this one essay to convince anyone, but still, perhaps if I keep pointing it out, and perhaps a few others pick it up as well, maybe our political environment will change for the better one day.


* For the record, I am largely sympathetic to such an argument. Then again, I am not convinced laziness is the best explanation. Some may be lazy, but many may have bought into the argument that ambiguity indicates brilliance and thus intentionally made bad films, films effectively indistinguishable from the products of laziness or incompetence, all because they think a failure to resolve a plot shows they are clever.

Sunday, April 12, 2015

Why "Hope for the Best, Plan for the Worst" is Bad Policy

I made the mistake today of reading some Washington Post articles. After reading an economically illiterate, and terribly misnamed, article about Nestle supposedly being paid to borrow -- which turned out to simply mean bond yields had briefly gone negative, long after Nestle issued them, and thus had nothing to do with their cost of borrowing -- I ran into an idiotic Dana Milbank essay on "climate change deniers in retreat", which actually turned out to be about lobbyists and others who, to keep "green" clients, dropped anti-AGW positions, proving nothing more than that money makes some people parrot AGW lines, as I have argued many times. (See "The Failure of Peer Review", "Publish Or Perish", "Funding and the Corruption of Science", "Debunking 'Debunking Global Cooling'", "Certainty and Pop Science", "The Nonsensical Nature of Some Statistical Analysis") (By the way, since when did the WaPo go in for click bait article titles like some internet startup? Are they really that pathetic? Hard to take them seriously. Next they will put up pictures hinting at nudity in their articles, or cures for cancer. I half expected to see multiple ads telling me lonely Asian women want to meet me.)

Worse still, at least for me, I then went on to read the comments to that essay, and, besides raising my blood pressure, basically learned a lot of Post readers can cut and paste from Skeptical Science, and that apparently a few people still think Mann is a valid resource (see "More About the Hockey Stick Graph", "Sampling Changes and Fictional Trends", "Again Improving Science Misleads", "Some Global Warming Links", "Very Quick and Simple Logic").

But one good thing did come of this, when I was reading the comments by the AGW boosters, I came across a supposedly "moderate" and "sensible" argument, which I have heard before ("Catastrophic Thinking, The Political, Economic and Social Impact of Seeing History in the Superlative", "All Life in a Day, or, How Our Mistaken View of History Distorts Our Understanding of Events", "GMO? So What?", "A Misleading 'Right to Know'", "GMO Revisited - As Well as Hormones, Soy, Phytoestrogens, and a Host of Other Food Scares", "Transfats?", "'Better Safe Than Sorry' Usually Leaves Us Even More Sorry, And Much Less Safe", "Salt, Transfats, DDT, Bad Science and Even Worse Law", "The Problems With 'Safe and Effective'", "A Question For Those Worried About Climate Change", "The Problem With Solar Energy", "A Thought On Solar Energy", "God Save Us From Simple Solutions", "Running an Economy on Compost, Saw Grass and Solar Cells", "The Sky's Not Falling Part 1", "Government by Emotion"), used to argue for everything from banning food irradiation and transfats, to passing the FairTax, to regulating various businesses, to outlawing DDT, or other pesticides, and most often in this AGW context, that being that we should take some action just in case the worst is true. In this case, the specific argument was as follows:

What happened to "Plan for the worst, hope for the best"?  
I have read thousands of papers over the last decade and accept what all the evidence is saying.  
As if that evidence isn't scary enough, the struggle with those that can't see it and obstruct attempts to deal with it is even scarier.  
When going all-in and gambling with the planet and the futures of everything on it, uncertainty is not your friend.

As you can see, not exactly a terribly stirring version of the argument, nor one which is especially well worded, but, since I was desperate to find anything worthwhile from my dip into the cesspool of WaPo on line -- and to find an excuse to stop reading -- I took what I could find, and so this uninspiring sample will be used to support my analysis of a popular, but deeply flawed, argument you hear all too often.

I suppose I should explain, in a very general sense, in everyday life, there is nothing wrong with planning for possible disasters. Nor is there anything particularly wrong with setting aside some resources in case things go wrong. A reserve in case of misfortune is a good thing. However, that is quite different from what this argument is inevitably used to support. It is never about setting aside resources in case of an unspecified mishap, it is always used to bolster a weak argument for a specific solution to a specific hypothetical disaster, and that is a very bad idea, as I shall explain.

The best place to start is to ask what "the worst" is. After all, since this argument is never used to support making some allowances for an unspecified mishap (as the quote was originally used, long ago, when people were somewhat less prone to panic), but rather asking us to plan as if a specific "worst case scenario" were true, that being the case, what is that scenario? And that is where you find the first problem with this theory. No matter how bad you think a given outcome might be, there is always someone who can argue a worse one is possible. And so, taken seriously -- in the modern usage of the phrase -- we would end up constantly shifting our focus as worse and worse hypotheticals are created, especially since the argument never requires those hypothetical situations be even plausible, much less likely. So, with probability, even possibility, out the window, the only limit to what is our "worst case" is the imagination of those developing hypothetical situations, meaning we will end up planning for incredibly remote, and terribly overblown, disasters, all so, just in case this terribly unlikely event happens, we aren't caught with our pants down.

Even those who argue AGW is true fall into this trap, inevitably arguing that we should pass laws based on the assumption the most pessimistic of computer models are entirely accurate. In other words, assuming that we should simply accept a hypothesis as the basis for action because it has the greatest potential for harm.

Perhaps it would be easiest for people to understand if I couched it in the most basic terms possible, and since the AGW crowd who push this position often behave like Chicken Little, let us view this in terms of that familiar tale.

Chicken Little, as the story goes, was struck by a falling object (I forget what, my son is 10 now, so it has been a while), and assumes the sky is falling. Going into hysterics, Chicken Little agitates the other animals into sharing this belief.

So far, so good. This is exactly the model this argument would have us follow. In other words, according to this idea, Chicken Little and the other animals (who suffer a dire fate, if I recall correctly) are perfectly correct to panic, as the sky MIGHT BE falling, and so should "plan for the worst".

But our theory would actually go even farther. Suppose Chicken Little had a sibling, Big Chicken, who, upon hearing this, argued, no, the sky is not falling, it is not enough to cover one's head or hide, the entire world is falling apart! We need to immediately find a way to bind it together so we can survive! Now, according to this approach, the animals would be sensible to immediately start applying mucilage to everything in sight, as Big Chicken might be right, and we would be foolish to ignore this possibility.

As you can see, the people pushing this position apparently missed the whole point of the Chicken Little story. At least they must have, as their theory amounts to nothing more than maximizing panic. If we respond to the worst case, as we should "plan for the worst", without reference to probability, then we will simply end up all becoming Chicken Little, assuming the slightest sign of danger is a reason to panic, and listening favorably to the worst possible conclusions. That is not a recipe for sound reasoning, or sensible preparations, but for unbridled panic.
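The logical flaw here can be made concrete with a toy example. Ranking threats by worst-case harm alone, with probability thrown out, will always select the most extravagant hypothetical, while weighting harm by likelihood will not. A minimal sketch, with invented figures standing in for "the roof leaks" and Big Chicken's "the world is falling apart":

```python
# Two hypothetical threats, each given as (probability, harm).
# All figures are invented for illustration only.
threats = {
    "roof leak":         (0.10, 10_000),         # plausible, modest harm
    "sky falling apart": (1e-9, 1_000_000_000),  # wildly remote, huge harm
}

# "Plan for the worst" with probability discarded: pick the biggest harm.
worst_case_pick = max(threats, key=lambda t: threats[t][1])

# Weighting by likelihood: pick the biggest expected (probability * harm).
expected_pick = max(threats, key=lambda t: threats[t][0] * threats[t][1])

print(worst_case_pick)  # the Big Chicken scenario wins
print(expected_pick)    # the mundane threat wins
```

The first rule hands our planning over to whoever can imagine the biggest catastrophe; the second keeps the mundane, probable mishap in view, which is the older, sensible reading of the proverb.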

That the AGW boosters have been reduced to this sort of argument does not surprise me, a lot of environmentalism follows the Chicken Little model. That they can make these arguments openly, and not realize how foolish they are does.

Friday, April 10, 2015

A Real Headscratcher

I was reading more movie reviews following my last post "Malapropism of the Day", and I came across another malapropism, but this time one I simply can't figure out:
The dusky faded grey filter is a pleasure for the eye to watch and it dowses the actors. 
Now, dowsing is usually a process whereby one uses a forked stick, or metal rods, to find water or valuables. So, does the filter in some way detect water hidden in actors?

Ok, this one isn't particularly funny, and I honestly can't think what the individual was trying to say. On the other hand, it is so cryptic, I could not resist sharing it.

And, as promised, I will do my best later today to get a real, relevant post up.

Follow-Up: Because I was worried there was maybe some obscure meaning to dowse of which I was unaware, I looked it up in every dictionary I could find and discovered, besides the meaning I know, it is also an alternate, and most likely archaic, spelling of douse. So, it is possible that the lens somehow drenches the actors with a liquid, rather than finding hidden water within them. In any case, it is clear something very moist is taking place.


The same review also indulges in one of the most amusing mistakes I see fairly regularly, confusing stigma and stigmata, in this case suggesting that a series of negative reviews may cause bloody holes in the hands and feet of actors.

Malapropism of the Day

I promise I will write something serious and on-topic very soon, I actually have one half finished, and hope to get it done by tonight, but, until then, I have to share a truly silly malapropism I discovered.

I was reading reviews of the film Cabin Fever (don't ask why, it is a long story), and I came upon some misused words that amused me terribly:
Pigs use lots of water and spread disease so the semantic religions shun pigs. 
So, apparently Jews and Moslems worship verbal signifiers? I know there are prohibitions against graven images, and Jews especially, but also Moslems, have serious objections to anthropomorphizing God, but I don't think anyone has suggested they have decided to simply worship the word itself. Or maybe this means that Jews and Moslems are enamored of religious word play? I don't know, it is kind of hard to imagine what a semantic religion would be. Actually, maybe we could call popular kaballah a semantic religion, since the Sepher Yetzirah (among other works) is so obsessed with the combinations of letters and the words/phonemes they form, especially since most pop-kaballists forget that there is a deeper, spiritual purpose to all the verbal exercises. Maybe that is a semantic religion, the superficial obsession with kaballistic wordplay.

Perhaps it is just me, that I find this so amusing, but something about creating "the semantic religions" struck me as terribly funny. So much so I did not even bother commenting upon the popular, but quite wrong, theory that the prohibition on pork has some sanitary origin. (The truth is, it is forbidden because the Torah says so, just like mixing wool and linen, or milk and meat, or planting mixed seeds together. They are prohibited simply because they are. Some may postulate they developed because of some sanitary practices, but that has no more evidence than those who postulate the rules were established by alien visitors. All we can establish is that Jews and Moslems avoid pork because holy texts tell them to do so.)

Monday, April 6, 2015

Deluded Neo-Pagans

I know it is off topic for my blog, and I should be responding to comments or writing on more relevant topics, but I have to write a bit on the absurd claim often made by neo-pagans that there was somehow a survival of paganism from the classical era right up to the modern age -- that is, that "Wicca" (an invention of late 19th century antiquarians, named after a minor Germanic tribe settled in western England, and not gaining anything like popularity until the late 20th century) is some sort of legacy going back centuries to some real historical antecedent.

I admit, Wiccans are not the only ones. Freemasons love to imagine they are continuing in a lineage going back directly to Pythagoras, Solomon and -- in a few extreme claims -- Adam himself. (Though, since Masonry was male only for quite some time, it is hard to imagine who Adam met in his lodge... But that goes back to the question of "who the heck did Adam's children marry?"; Genesis does seem to gloss over a few logical problems.) Modern Masons have become a bit more modest, with the rationalists making the -- still pretty absurd -- claim of descent from guilds of medieval stonemasons, guilds no one happened to notice in any historical records, despite a pretty good record surviving for most other guilds. And the romantics, inevitably, find the Masons descending from fugitive Templars, because anything Templar is cool, even more so since The Da Vinci Code popularized the Holy Blood, Holy Grail thesis and made the Templars into some sort of neo-pagan hermetic order of their own, though one oddly committed to protecting the bloodline of Jesus, despite a lack of belief in his divinity. (It is an odd theory; it would be akin to dedicating one's whole life to ensuring the bloodline of Danny Kaye survived for all eternity. If Jesus is not divine, then he is but one of many thousands, perhaps millions, of descendants of the Davidic line*, and thus one of many potential rightful kings. Why would his bloodline be worthy of such attention?)

But to return to the Wiccans, they are more annoying than the Masons in one very noteworthy way: they simply cannot take a consistent position. Sometimes, when feeling feminist and anti-Church, they will tell us, for example, that the Salem witch trials were simple hysteria, that there was nothing to them but revenge against uppity women or some such. Sometimes they will get a little more delusional and blame it on people fearing "wise women" who worked as midwives or gave herbal remedies, completely ignoring the fact that midwives were an accepted reality throughout Europe and the Americas, and that herbal remedies were common enough that persecuting their purveyors would make as much sense as arranging a persecution of CVS employees today. (Though if they continue pontificating about how they won't sell cigarettes for my own good, I may do just that...)
Other times, however, they will suddenly do a volte-face, and say "oh, those foolish Christians, pretending witches worship the Devil, there is no Devil in earth religions, and so Salem was just absurd!" In other words, they imagine the witch trials were some sort of real persecution of supposed neo-pagan traditions, based upon a mistaken belief that witches worshipped the Devil.

Now, one problem here is quite simply that "witch" is not a very useful term. In modern times it is often used by those believing in "Wicca", which, as I said already, is an invention of the 19th century. Other times, even in modern eras, it is used for those who do espouse some sort of diabolical belief, and it was used as such in the past. And in other cases, it is used for those who adhere to a variety of superstitions, especially those related to divination or harming enemies, which, though very old, were in no way a "survival of pagan practices". Thus, witch can mean a lot of things, and in some cases does mean exactly what the persecutors said. (I know, some Wiccans deny there were ever Devil worshipping witches, but the fact is, there are devil worshippers today -- Anton LaVey grew famous from it -- and there were those who confessed to it in the past, even without torture or other inducements, and so it seems likely that, whatever the frequency of Devil worship, and it likely was rare, there were those who worshipped infernal beings.)

What did not exist was any sort of paganism in hiding lasting from the fourth or fifth century until the present day. Granted, by the fifth century there were still pagans** living in the northern and eastern parts of Europe, as well as in parts of inland north Africa, and, of course, all over the Americas, much of Asia, Australia and sub-Saharan Africa. But, let us concentrate on Europe. In the Mediterranean rim and western Europe, including much of England, Christianity had won out by the mid to late fifth century. Even the invading tribes, such as the Goths, Vandals and Langobards, tended to adopt some variant of the faith, though many embraced the Arian heresy, thanks to heavy missionary work outside the borders of the Empire. Still, Christianity was the rule throughout the Roman Empire sometime before Rome itself fell, and even afterward, former Roman possessions tended to stay Christian, with the exception of England, which was occupied by Germanic pagans for about 3-4 centuries, and France, where parts were occupied by non-Christian Germanic tribes for about a century, after which most did proceed to convert. Outside the Empire, Christianity experienced more difficulty. Germany was largely Christian by the time of Charlemagne's grandchildren, but Scandinavia took longer, as did England (though Wales and Ireland were earlier successes), and eastern Europe had lingering pagan strongholds until the 12th century or a bit later.

Still, if we ignore eastern Europe, as most of these claims of pagan survival do, it is safe to say that, by 1000 AD, paganism was no longer a force west of the Oder (we can probably move that line even farther east, but let us be generous here). And in many places, such as the Mediterranean coasts, much of France, parts of the Balkans, Asia Minor and elsewhere, Christianity had been victorious by at least the early 6th century, if not sooner. So, if paganism was to have survived until the 17th century to inspire the women of Salem, for example, that would mean some sort of underground continuation of the old pagan religion, unnoticed by all but a few quirky antiquarians and controversial historians, for up to 11 centuries. That seems quite a stretch.

Let us look, for a contrast, at the native religions of western Africa. The slaves taken from those lands, being dropped in a Christian environment, and one -- at least in the US -- less stringent in enforcing the faith than many states of the middle ages, were largely converted within a generation or two, converted so completely that the remnants of the old religion persisted only in disguised form, in syncretic faiths such as Vodoun, Santeria and the like. And before someone claims this is the same way paganism survived, I would point out that, first, these syncretic faiths survived largely because they remained, in some ways, public, able to gain adherents and spread the word. And even then, much of the original faith was lost in the process, with the lingering traces being maintained either in much simplified superstitions, or else in a highly changed form in the new faith. So, yes, pagan spring festivals left their traces on Easter, Saturnalia (and other winter solstice festivals) influenced Christmas, and so on, but that is about all that remained of paganism; there is little sign that pagans even created a syncretic merging as complex as Santeria. Instead, it seems all that was kept of pagan beliefs were a few holiday traditions, and maybe some common superstitions. That is hardly enough to claim that somehow witchcraft is a survival of pagan faith.

On the other hand, what almost certainly did exist is the thing neo-pagans love to deny: explicit devil worship. And the reason is obvious. If you are unhappy with the status quo, if the church, and the state heavily influenced by that church, are upsetting, the easiest way to rebel is to embrace that which they reject. And the medieval church, while not spending much time denouncing the earth mother or druidic deities, spent a lot of time denouncing the devil. Which is why, in later eras, libertines who wished to create a scandal, and others who wished to signify a total break with their society, embraced devil worship. Some may have done it in a symbolic or joking (or, to use the modern sense of the term, "ironic") way, but still it was the devil, not Lug or Apollo or Artemis or the Magna Mater, who made a strong statement against one's cultural traditions. Which is why it is easy to believe not all those accused of devil worship were falsely accused***, while at the same time I find it difficult to imagine they were some sort of proto-hippies carrying on a thousand year old tradition no one noticed until the mid 19th century. (The same way I have trouble believing Freemasons could have existed publicly for about half a century without a hint of a Templar tradition, until some specific lodges "discovered" it, and suddenly all lodges "remembered" they were descended from fugitive monastic knights.)

As I said, I do not deny that the oldest pagan traditions had some influence on early Christianity, especially in terms of specific holiday practices, and maybe some rituals. And, yes, some superstitions may have had an origin in pagan practice, though more likely both Christian and Pagan superstition were influenced by the same basic form of "magical" reasoning****. But that is as far as I can see the survival stretching. And there is one more argument I would offer in favor of this position. Paganism survived the longest in eastern Europe, specifically in the Baltic states (if we limit ourselves to Europe proper). There paganism died out less than 1000 years ago, as opposed to say 1200 to 1300 years ago in England, maybe 1100-1200 in Scandinavia, 1400-1500 or so in much of France (though in some parts even earlier), and 1600-1700 in most of the Mediterranean areas of the former Roman Empire. So, if we were expecting to find witches, meaning witches as survival of a pagan tradition, would it not seem most likely to be in the Baltic states? Yet, when "Wiccans" point to the survival of pagan traditions, it almost always centers on western Europe, perhaps including Italy and Germany. In other words, the places with the longest tradition of Christianity. Does that make even the slightest bit of sense?

Unfortunately, I do not have my usual political or cultural tie in for this one, nor do I have my attempt at a pithy summary, or some brilliant lesson to draw from it. I simply found myself terribly annoyed at those who really believe pagans somehow remained in hiding for centuries and now have suddenly sprung back into public view chanting "an it harm none" and other new age pablum (which has as much to do with real pagan traditions as Jesus Christ Superstar has to do with the historical figure of Jesus). Sorry, but Wicca is as much a survival of an ancient tradition as Disney's Tinkerbell is an expression of traditions about the Sidhe or the inhabitants of Tir na nog.


* Even if we accept the thesis of Baigent, Leigh, Lincoln and company, and assume Jesus was unique in being descended from both David and Aaron, and in having married into the tribe of Benjamin, given the relatively small population of classical era Israel, the tendency for elites to intermarry, and the fact that fully one twelfth of Israel was of the tribe of Benjamin, even that specific combination could not have been unique, or even exceptionally rare. And if intermarrying with the tribe of Benjamin is so essential, then it would not be hard to take a descendant of Aaron and David and arrange for him to fulfill this third condition.

** I am using "pagan" to indicate any sort of organized polytheistic belief system, though thinking mostly of those prevalent in Europe prior to Christianity. I suppose, by my definition, even Hinduism is a sort of "paganism". I do not intend it as any sort of value judgment when I use the term, it is simply convenient shorthand since I am discussing the "neo-pagans", to describe their supposed antecedents as "paganism".

*** Of course there is also one other reason to believe in devil worship, the existence of very sincere Christians. If you really believe the teachings of the Church, and despite that wish to do someone harm, to whom else would you appeal but the devil? Thus, the existence of devout Christians argues more strongly for devil worship than some earth mother/green man tradition hiding out unnoticed for centuries.

**** For example, the idea that if one injures the item that harmed him, it will speed healing, or the belief that spirits are offended by boasts about one's good luck. These sorts of traditions seem to be near universal, and so probably represent a sort of simplistic reasoning common to the human experience, rather than the survival of any ancient tradition. (See Couliano's The Tree of Gnosis and his arguments that Bogomils and Cathars could have reached the same theological beliefs independently, mostly because there are simply a limited number of possible Christian theological positions on specific questions. This seems much the same. Or, for that matter, look at the chapters on thaumaturgy in The Golden Bough. For all its questionable conclusions, some of the arguments for a common sort of primitive reasoning seem to fit historical experience fairly well.)



For those who have been following the blog, I do apologize for doing so little writing. However, work has been crazy. For example, after working late into the evening Friday preparing for a major project going live this morning, then making some more changes Saturday morning, I spent another part of Saturday, and much of Sunday, catching up on other work I had to ignore because of this project. And then, today, I ended up working from 5:30 AM until 4, eating quickly, and then dealing with a call to technical support to try to resolve an issue that caused us to delay our project deployment. So I am afraid I just haven't had the time. Had I not been so inexplicably annoyed at the many absurd neo-pagan comments criticizing the (dreadful looking) TV series "Salem", I probably would not have written at all, but having read those reviews during a few minutes of downtime to amuse myself, I found the Wiccan types so irritating I could not resist saying something. Still, tomorrow marks the launch of a second project, and then we have to reschedule the one from today, not to mention the other big rewrite I worked on this weekend which is still waiting, but I am hoping to find some time to write soon, perhaps later this week, or over the coming weekend. So please check back. And I will get to replying to comments as well, I promise.