God, purposes, and the misuse of probabilities 
Massimo Pigliucci and Mohan Matthen have blogged recently about probabilistic arguments against naturalism and evolution. Recent arguments by Alvin Plantinga and Thomas Nagel begin by considering how likely some development is given only natural causes and evolutionary processes: How likely are we to know anything? How likely was it that there would come to be conscious life? The answer is supposed to be that these things are unlikely, but that they would be utterly expected if there were a God (Plantinga) or if there were purposive, teleological laws (Nagel). From this, it is concluded that there is a God or that there are teleological laws.*

Reasoning like this seems to misuse probabilities in at least two respects. I'll focus on the theological version.

First: It is unclear to me why the existence of God makes sentient knowers more likely than the absence of God does. Of course, the omnipotence of God entails that there will be sentient knowers if She wants there to be, but why suppose that She does? There does not seem to me to be any obvious probability metric over the space of possible gods, and I doubt that the space is well-defined. Theologians have often argued that there is only one possible god, namely God, but their arguments also typically entail that She exists. The probabilistic argument is superfluous at best if it relies on such an apodictic rationalist argument to establish one of its premises!

Second: Even accepting that sentient knowers are more likely given God than not, this only shows that the existence of sentient knowers should increase our credence in God. Whether we should think that She is likely to exist at the end of the exercise depends on the prior probability. There are two ways this could go. (A) Suppose the prior is objective. If the principle of indifference is doing the work, so that the prior for "God exists" is .5, then we need to have winnowed down the space of alternatives to include only atheism and this specific flavour of monotheism. Again, the argument seems to be falling back on rationalist arguments about the space of possible gods. (B) So suppose instead that the prior is subjective. Then the conclusion is just that someone who believes that God is tolerably likely should, after looking around, be rather more confident. This will not and rationally should not convince those who start with strong atheist inclinations. Those who believe in God place a high prior probability on Her existence, and they might just as well have reported that to begin with. The fideism of subjective probability makes the argument irrelevant.
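The dependence on the prior can be made vivid with a toy calculation. Here is a minimal sketch using the odds form of Bayes' theorem; the likelihood ratio of 10 and the candidate priors are purely illustrative assumptions, not figures from anyone's actual argument:

```python
def posterior(prior, likelihood_ratio):
    """Posterior probability of a hypothesis H given evidence E,
    computed via the odds form of Bayes' theorem, where
    likelihood_ratio = P(E|H) / P(E|not-H)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Suppose, purely for illustration, that sentient knowers are ten
# times more likely given God than given no God.
for prior in (0.5, 0.1, 0.001):
    print(f"prior {prior}: posterior {posterior(prior, 10):.4f}")
```

With an indifference prior of .5 the posterior climbs above .9, but a committed atheist starting at .001 ends up around .01, still all but certain there is no God. The same evidence, the same likelihoods, and yet wildly different conclusions, which is just the point about fideism above.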

These concerns apply to teleological laws, too. There is even more uncertainty about what would or wouldn't be likely given teleological laws and what the prior probabilities should be, because Nagel's proposal is dismally obscure. At least with God, there is a long philosophical history of worrying over what She might be like.


* Now that I've written it out in this form, it looks rather like the No-Miracles Argument for scientific realism. Naturally I think there are problems with base rates. A quick search also reveals that I've responded to Mohan blogging about Plantinga and Nagel before.

Digital pictures paper 
Are digital images allographic?, a paper I cowrote with my colleague Jason D'Cruz, has been accepted at the Journal of Aesthetics and Art Criticism. Whatever else might be said of it, it has a first sentence that I am inordinately fond of: "The short answer to our title question is yes, but of course there are complications along the way."

Open fire 
At the Creative Commons blog, there's discussion of a recent report by the U.S. PIRG Education Fund about the impact of expensive textbooks. It documents something I had observed anecdotally in my own classes: lots of students decide not to buy textbooks because of the cost, and their performance in classes suffers for it.

I agree with the core of the findings and with the sentiment that open textbooks are a good thing, but there is one aspect that's worrisome.

The study includes this factoid: "82% of students felt they would do significantly better in a course if the textbook was available free online and buying a hard copy was optional. This is exactly how open textbooks are designed."

It does not surprise me to learn that this is what many students feel. However, we know that students do not relate to text on screen in the same way they relate to text on paper. It is harder to read actively and mark up a text on screen; in some contexts, it is simply impossible. Although a free online version is an improvement on an expensive hard copy that they refuse to buy, an affordable hard copy which they buy or print would be even better.

As tablets and e-readers proliferate, this may change, but it would be premature to pretend we are already in that brave digital future.

My own book, forall x, is not designed for online consumption. It is intended to be used as a hard copy, and online distribution is a way for people to freely get the print-ready files. I use electronic resources similarly in other courses. In history of philosophy, for example, it's a way of cutting out the margin that book publishers and the bookstore would add to public domain material. So the report conflates open access versus commercial books (whether there are license fees or not) with online versus hard copy (how the student interacts with the content).

The old Mill run 
The standard account, framed by Ian Hacking and promulgated by almost everyone, is that "natural kind" as a philosophical category goes back to Whewell and Mill in the 19th century. I debunk that account in a paper which has just been published in the Journal of the History of Analytic Philosophy.

Link: No Grist for Mill on Natural Kinds

There's a reason they call that guy "Hacker" 
Following a link from Brian Leiter's blog, I happened upon an article in which Peter Hacker defends an old-school conception of philosophy.

As Hacker sees it, there are two things that philosophers might be doing:

The first is metaphysics, enquiry into "the essential, necessary features of all possible worlds."

The second is a priori conceptual investigation, "investigations into what makes sense and what does not."

On the former conception, metaphysics is supposed to be like the sciences in producing facts and findings. The difference is just in whether the findings are necessary (metaphysics) or contingent (empirical science). Yet, Hacker asks, where are the established results of metaphysics? All philosophers have to show for millennia of work is controversy and paradox.

So Hacker advocates the latter conception, on which there are no substantive facts to be gleaned from philosophy at all. Rather, what one learns is that some would-be facts turn out to be nonsense. Yet, I ask, where are the pseudoproblems condemned forever to the dustbin? All philosophers of Hacker's stripe have to show for centuries of work is disagreement and dismissive hand-waving.

Hacker's disjunction is plausibly associated with analytic philosophy, so called. Claiming that would-be problems are dissolved by criteria of meaning was the method shared by logical positivists and Wittgensteinians, and conceptual analysis is perhaps what gives us the term 'analytic'. And the conception of metaphysics as fundamental ontology and the science of necessity is typically billed as 'analytic metaphysics'.

My rejection of the disjunction is one reason I do not self-identify as an analytic philosopher.
