Genetics and Ethnic Identity: Some concerns

After introducing the idea that we have no genetic relationship with many of our direct ancestors, we will consider what this means for the promise made by genetic-testing companies to uncover your “ethnic heritage.”

By Michael

Imagine that you are on a theater stage, like the one pictured.

This stage is outdoors: you are standing in a theater modeled after the Greek and Roman amphitheaters that survive as ruins across the Mediterranean. There is stadium seating for your audience, so that each rank away from the stage sits a foot or two above the one in front of it. Because the amphitheater stretches up and away from you in a rounded bowl, each row back from the stage also holds more seats than the one before it.

The first rank of the audience is unique: it seats only your mom and dad, or, to be more specific, your biological mother and your biological father. They have the entire front row to themselves, but stretching back behind them the rows become ever more crowded.

Behind your parents are their parents, in the second row back from the stage. In the third row back, as you might have guessed, sit their parents: your eight great-grandparents, gathered to see you perform. In the fourth row back sit their parents: your sixteen great-great-grandparents, possibly somewhat confused as to what all the fuss is about.

If we stop there, we can make some observations about your relationships, genetic and otherwise. For starters, there are no siblings present, and no step-parents or half-relations; here we see only the most strictly defined idea of parentage. What’s more, we can be fairly certain (to a degree approaching, but never quite reaching, 100%) that you share some human-specific genetic material with everyone in the first four rows.

My family tree.

Moving back in the amphitheater, let your gaze move beyond the crowd closest to the stage. In the fifth row sit the parents of your sixteen great-great-grandparents: your thirty-two great-great-great-grandparents. Here we see a new class of audience members; from this row on, you will find fewer and fewer relatives who have left a trace in your personal DNA. And back and back the amphitheater stretches, each row holding double the number of ancestors, reaching a maximum of around two or three million unique ancestors before the population shrinks down to our common ancestors, the first modern humans.

So your ancestors are not related to you? They are, and they are not. They are related to you genealogically, but not necessarily genetically. You exist because they existed (and had children who did the same). They are your direct ancestors, directly above you on your family tree. They are a part of your family. But, still, their human-specific DNA can easily be absent from your own.

To speak precisely, we need precise terminology. The terms “gene” and “genetics” are misleading. For example, it is incorrect to say that “we get fifty percent of our genes from each parent.” We receive one hundred percent from each parent because each human being has the same genetic code. This is why mapping “the human genome” is useful: each of us has the same genes. What is individual to each person is the set of alleles, the particular versions of those genes.

Assumed relationship by generation

Exactly fifty percent of your alleles come from your mother, and the other fifty from your father. Put another way, fifty percent of your alleles come from your mother’s family and fifty percent from your father’s family. Your mother and father in turn each received fifty percent from each of their parents… This leads us to assume, incorrectly, that we receive twenty-five percent from each grandparent, twelve and a half from each great-grandparent, and so on. In reality, however, when you have a child, you are not passing on exactly half of your mother’s contribution and half of your father’s! Your DNA is not an even four-way split between your grandparents.

Why? Because the fifty percent of your alleles that came from your mother was not evenly divided between her mom and dad. In fact, there is a low, but non-zero, chance that your mother gave you only alleles from her mother and not a single one from her father.

Possible percentages of relationship by generation
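To make the spread of those possible percentages concrete, here is a minimal Monte Carlo sketch. It is only an illustration under a deliberately simplified model: autosomes only, rounded chromosome lengths, and crossovers drawn from a Poisson process with no interference. The numbers it prints are not measurements of any real family; they simply show that one grandparent contributes about a quarter of your DNA on average, while individual results scatter well above and below that.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rough genetic lengths of the 22 human autosomes, in Morgans.
# Rounded, illustrative values, not reference data.
CHROM_LENGTHS = np.array([2.8, 2.6, 2.2, 2.1, 2.0, 1.9, 1.8, 1.7, 1.7, 1.8,
                          1.6, 1.7, 1.3, 1.2, 1.3, 1.3, 1.3, 1.2, 1.1, 1.1,
                          0.7, 0.7])

def share_from_maternal_grandmother():
    """Simulate one maternal meiosis and return the fraction of YOUR autosomal
    DNA that traces back to your mother's mother."""
    inherited, total = 0.0, CHROM_LENGTHS.sum()
    for length in CHROM_LENGTHS:
        n_cross = rng.poisson(length)                    # crossovers on this chromosome
        cuts = np.sort(rng.uniform(0.0, length, n_cross))
        edges = np.concatenate(([0.0], cuts, [length]))
        from_grandmother = rng.random() < 0.5            # which grandparent the chromosome starts on
        for start, end in zip(edges[:-1], edges[1:]):
            if from_grandmother:
                inherited += end - start
            from_grandmother = not from_grandmother      # switch grandparents at every crossover
    # Half of your DNA comes from your mother, so her mother's share of the
    # whole genome is half of her share of that half.
    return 0.5 * inherited / total

shares = np.array([share_from_maternal_grandmother() for _ in range(20_000)])
print(f"average share: {shares.mean():.3f}")                        # about 0.25
print(f"observed range: {shares.min():.3f} to {shares.max():.3f}")  # individual draws scatter well around 0.25
```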

This is much more common than most realize! Geneticists estimate that roughly once in every four million births, a child has no genetic relationship with one of their grandparents. In the United States alone we might expect to find some 80 people in that situation. The likelihood increases dramatically when we consider only the traits we can see with the naked eye: the shape of your nose, the pattern of your freckles, the color of your eyes. Science, then, agrees with everyday observation. How often have cousins noticed that they resemble each other more strongly than they resemble their own siblings?
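Where might a “one in a few million” figure come from? A back-of-the-envelope version, offered purely as an illustration, ignores recombination entirely and imagines each of the 22 autosomes being passed on whole, as a coin flip between the two grandparents on one side. The population figure below is likewise a rough assumption, not a census number.

```python
# Toy calculation only: ignore recombination and the sex chromosomes, and treat
# each of the 22 autosomes as a fair coin flip between one grandparent and the other.
p_nothing_from_one_grandparent = 0.5 ** 22
print(f"probability: 1 in {1 / p_nothing_from_one_grandparent:,.0f}")   # 1 in ~4.2 million

us_population = 320_000_000   # rough figure, assumed for illustration
expected_cases = us_population * p_nothing_from_one_grandparent
print(f"expected cases in the US: about {expected_cases:.0f}")          # on the order of the "some 80 people" above
```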


The past is gone. If ethnic identity was formed in the past, how does it still hang around? Answer: because it is constantly reinvented in the present by those alive in the present.

That the past is no longer present is difficult for many people to accept. We recapture the past using many resources: memory, the written word, ruins, and legends. These sources offer only glimpses of the past, with limited accuracy. In other words, the past is not a very safe place to store one’s identity.

When companies like 23andme, Ancestry.com, National Geographic, Helix.com, and MyHeritage.com offer to explain our identities to us, we should understand the insurmountable limitations they face in doing so. This is especially important to remember considering that these companies charge a lot of money to analyze cheek-moistened Q-tips (from around $80 to upward of $500, depending on the services purchased).

Not a real ad — but one found in the wild

There are three genetic tests on offer: testing Y-chromosomal history (available only to holders of Y-chromosomes, i.e., “biologically-defined men”), testing mitochondrial DNA (mtDNA), and testing your alleles, your “autosomal DNA.” The first two are very specific: Y-chromosome tests reveal only your patrilineal descent, while mtDNA tests reveal only your matrilineal descent. Both the Y-chromosome and mtDNA are passed between generations with very few mutations or changes, making them useful for archaeologists and less useful for genealogists. For example, thanks to mtDNA and Y-chromosome testing, geneticists have estimated that the most recent common ancestor of every human being on earth may have lived only 2,000 years ago.

After only a few generations, these tests are telling you about a vanishingly small minority of your ancestors.
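The arithmetic behind that caption is simple. Ignoring pedigree collapse (distant cousins marrying, which only shrinks the totals), you have up to 2^n genealogical ancestors n generations back, but the Y line and the mtDNA line each pass through exactly one person in every generation. A few lines of Python make the shrinking coverage obvious:

```python
# Each test follows a single, unbroken line: father's father's father... for the
# Y-chromosome, mother's mother's mother... for mtDNA. Everyone else in a given
# generation is invisible to it.
for n in range(1, 11):
    ancestors = 2 ** n                 # genealogical ancestors n generations back (no pedigree collapse)
    coverage = 1 / ancestors           # fraction of that generation the test can speak to
    print(f"{n:2d} generations back: {ancestors:5d} ancestors, "
          f"each test follows 1 of them ({coverage:.2%})")
```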

These two tests are the ones that report “haplogroups.” Haplogroups derive from the slow mutation of the Y-chromosome and mtDNA, changes that happen very rarely. In the case of mtDNA, scientists in one study observed an average of one mutation (“substitution”) per thirty-three generations, though skepticism remains because the rate inferred from broader genetic studies is much, much slower [quoted here]. Y-chromosomes mutate considerably faster. Both, however, change much more slowly than the genome at large, which is scrambled and re-assembled every generation; this is why geneticists use them to define “haplogroups,” which show general relationships across large populations. Haplogroups are not synonymous with ethnicities, races, nationalities, religions, or other identities. Indeed, Y-chromosome haplogroups and mtDNA haplogroups are quite different (examples HERE and HERE) and do not add new information for the consumer of genetic testing: they largely confirm (or at least do not contradict) the Out-of-Africa hypothesis of human prehistory.
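To give those rates a feel for scale, here is a small, purely illustrative calculation. It assumes the pedigree-observed figure quoted above (about one substitution per thirty-three transmissions) and treats two matrilines as accumulating changes independently; the much slower phylogenetic rate mentioned in the same sentence would scale every number down.

```python
# Expected mtDNA differences between two people whose maternal lines split
# G generations ago, assuming ~1 substitution per 33 transmissions on each line.
RATE_PER_GENERATION = 1 / 33

for generations_since_split in (10, 33, 100, 500):
    expected_differences = 2 * generations_since_split * RATE_PER_GENERATION
    print(f"{generations_since_split:4d} generations since the split: "
          f"~{expected_differences:.1f} expected substitutions between them")
```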

It is the third test that allows people to find more recent relationships and connections with “ethnic identity.” Unfortunately, such tests are limited by the percentages discussed above: what do you do with the fact that so many of your recent ancestors (some of whom lived only 100 or 200 years ago) are not even present in your DNA? Ancestry.com divided the world into 26 ethnic/genetic provinces, each defined by small samples, sometimes as few as a dozen people and rarely more than a hundred. [Quick read on this here…]

And who is representing these ethnic identities? 23andme and similar companies state that if a subject has been genetically tested and affirms that all four of their grandparents were from a specific location, that person is counted today as a “native” of that location. In other words, when your genetic results claim that you are 57% Northern European, they mean that your markers have that much in common with the people currently living there… but again, what about their genetic heritage? They, too, have many ancestors whose DNA is absent from the present.
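To see how much hangs on the reference database, consider a deliberately over-simplified sketch of what matching a customer against reference panels can look like. Nothing below reflects any company’s actual pipeline: the regions, marker counts, and frequencies are all invented, and real products use far more sophisticated admixture models. The point is only that the answer is always relative to whichever reference panels happen to be in the database.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented reference panels: allele frequencies at 1,000 made-up markers for
# three hypothetical regions. Real databases hold many more markers and regions.
n_markers = 1_000
panels = {
    "Region A": rng.uniform(0.1, 0.9, n_markers),
    "Region B": rng.uniform(0.1, 0.9, n_markers),
    "Region C": rng.uniform(0.1, 0.9, n_markers),
}

# A made-up customer: at each marker they carry 0, 1, or 2 copies of the
# "alternate" allele, drawn here from Region A's frequencies.
genotype = rng.binomial(2, panels["Region A"])

def log_likelihood(genotype, freqs):
    """How well one panel's frequencies explain the genotype (higher is better)."""
    heterozygote_term = np.where(genotype == 1, np.log(2.0), 0.0)
    return np.sum(heterozygote_term
                  + genotype * np.log(freqs)
                  + (2 - genotype) * np.log(1 - freqs))

scores = {name: log_likelihood(genotype, freqs) for name, freqs in panels.items()}
print("best-matching panel:", max(scores, key=scores.get))
print({name: round(score, 1) for name, score in scores.items()})
```

Swap in a different set of panels and the same genotype gets a different “best match,” which is one way to understand why the same cheek swab can come back with different ethnic breakdowns from different companies.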

The fact is that the computer analyzing your DNA does not spit out your “identity,” but rather a long string of numbers and letters. This string is in turn analyzed by an algorithm searching for matches in a growing (but finite) database. That matched information is then sent on to a human being to interpret — and this is key. Should you send the same DNA to multiple testing sites, you will find different ethnic backgrounds! For more information on the process of interpretation, see this website. How does this happen?

Simple answer: because ethnicity is not based on science. Science can measure and explain the chemical makeup of your DNA, but it cannot (should not?) measure and explain your genealogical, “ethnic” identity. And much of this science-and-identity is unfortunately filtered through profit-driven companies, to boot…

So, can a DNA test tell you your ethnic identity? No, not in the way you think it can…

If you would like to read some typical “Genetics as Proving Ethnic Identity” literature, there are many, many examples online: Here Here Here Here and Here for starters.

The Problem with Memes

By Michael

Do not make or share political memes.

Doing so is not mentally hygienic—indeed, our mental health depends on our ability to recognize when and how we are manipulated, by ideas we oppose and especially by ideas we support. There may be some exceptions, but I will not veil the thrust of this post: I think we should change our collective behavior and I have some compelling reasons to back up this argument. I’ll briefly introduce the terms—what I mean by a meme, what I don’t mean by a meme. From there I’ll bring as many rational reasons to the table as I can to convince my friends and readers that memes, in this meaning, are toxic and best avoided.

When I say meme, I mean an image with blocky, sans-serif text on the top and bottom, often white text over some representative photograph. Here is a gallery of relatively innocent, funny, entertaining memes. These are not the problem.


A technical term for this subset of memes is ‘image macro.’ The process of making these image macros became automated and free with the launch of websites like MemeGenerator [.net] and its clones (e.g., makeameme.org). Many memes began life in far corners of the internet removed from social media, on websites like Reddit and 4Chan. But over the past year or so, political memes have become more and more popular. People are creating memes specifically to spread over Facebook, Instagram, and other photo-sharing social media platforms. The basic idea of a meme is not that old: Richard Dawkins, the evolutionary biologist and atheist activist, coined the word as a way to refer to a unit of cultural production analogous to genetic material. Successful memes go through a process of variation, mutation, competition, and reproduction akin to viruses, bacteria, and complex life. Unsuccessful memes go extinct.

Thought Germs

In this vein, I think anyone who set aside a moment to read this blog post would benefit even more from setting aside seven more minutes to watch the video below: “This Video Will Make You Angry.” It will not actually anger you. Instead, it explains how memes and videos can be better understood as “thought germs.” The most important sentence in the video reads: “Just as germs exploit weak points in your immune system, thought-germs exploit weak points in your brain, aka emotions.” The video’s creator, CGP Grey, delivers an excellent matter-of-fact narration that clarifies this theory and its many branching conclusions. The other take-away from his video is that “Being aware of your brain’s weak spots is necessary for good mental health, like knowing how to wash your hands [to fight germs].”

Turning to political memes, there are several issues I want to bring up. We will consider their creators, their target audience, their apparent goals, and the consequences of their continued existence. I want to move beyond simple arguments like those in this article from the Duke Chronicle listing their pros and cons. If I were to edit that article, I would point out to the author that most of the points listed as pros could easily be recast as cons. One can find many articles defending memes, arguing instead that “people use these forms for political expression.” I would counter that political expression is not equivalent to conversation: burning a book is an expression, and not the equivalent of reading it or arguing that it should not be read. Sharing a meme is like burning a book in that it expresses a political position without opening the possibility of constructive discussion.

Communication

Conversation and communication are the most important tools for a good life among other human beings. If we fail to explain what we want, we are not likely to get it. We are never free of the need to articulate ourselves. No one can read your mind—and even if they could, you cannot read theirs. In computer programming terms, there is no language that will free you from having to articulate what, exactly, you want to do. The same is true in life – people who fail to communicate their ideas are trapped either doing everything themselves, or doing without.

Here I have gathered a gallery of memes which we can separate into “leftist” or “rightist,” except perhaps for those that suggest both parties are full of idiots.


We need to discuss, to talk, to argue, and to give each other the respect owed to any human being. There are those who are violent, who will threaten us, who are not interested in conversation — there is no denying that. But they do not outnumber us, nor are their numbers specific to any one side of the debate. This does not mean opening ourselves up to abuse, but it does mean making ourselves vulnerable to challenge, to having our ideas and biases laid bare.

These memes are much more likely to anger than to convince. So many of them use pictures engineered to lay border-fences between groups: the liberal ones feature liberal celebrities distrusted by the right, and the conservative ones post pictures engineered to incite fear or distrust on the left. If my aunt or cousin or friend holds an idea also voiced by someone I already distrust, that is one thing: they are still themselves. But why not put the idea in your own words, instead of borrowing a celebrity whose fame stems precisely from their divisive nature? For example, I am uncomfortable with some of the things Bill Maher has said — he is unapologetically anti-Islam and distrustful of religion. I know he, like me, holds some left-of-center ideas, so I will listen to him some of the time. But for people on the right, he is a rage-inducing talking-head.

Similarly, the pictures we see are chosen for specific poses. Trump is a godsend because of his incredibly mobile face — he can look calm and compassionate in right-leaning memes and like six kinds of clown-faces in left-leaning memes.

Bumper Stickers

Bumper stickers were memes before there were memes. They crystallized political, religious, and sociological tenets into a space measuring less than four inches by fifteen. The nicest, most charitable description of bumper stickers is that they allowed the owner of the car to communicate some basic message. More than that, however, they set boundaries. Imagine two connected, controversial viewpoints, which we will call X and Y. They could be abortion rights (X) and the sticky issue of when a human (soul) becomes worthy of consideration and receives its own recognition from the state (Y). They could be federal supreme court appointments (X) and the question of whether supreme court justices should serve for life (Y). If I saw an X bumper sticker on your car, I might assume it would be unwise to bring up Y in conversation with you… which is counter-productive, since in reality you would most likely enjoy a conversation about Y. You are probably well versed in its arguments and reasons, considering how much you know about X and how the two issues relate. You care about X, which means you probably care just as deeply about Y.

On a car, I think, you can argue a bumper sticker is okay because we cannot talk between cars, between strangers. The only chance I have of expressing myself in the anonymous crowd of bystanders and other drivers is to write some big text on my car’s backside… or put up my own do-it-yourself billboard, as many in rural areas adjoining interstates do.

Facebook and other social media platforms have ZERO need of bumper stickers. The whole point of these platforms is to communicate with each other. Rather than using memes generated by strangers to communicate with our actual friends (and Facebook friends), we should use our own words. Instead of letting someone else put our ideas and feelings into words, we should own our beliefs ourselves. I do not care if you got your ideas from that book or that actor or even that meme — but tell me yourself.

Money and Click-bait

If the importance of communication and argument is not convincing, consider the fact that by sharing memes we allow others to make money off our inability to articulate our own beliefs. The road to fortune is littered with the burnt-out shells of countless people struggling to turn meme-creativity into money… but the model is there. One can find “how-tos” and write-ups explaining how current millionaires made their money by crafting Lolcats and UMadBros. At its simplest, content-producing sites make money by selling as much advertising as they can fit on the screen. Many of these sites have Facebook pages and legions of subsidiary Facebook pages (both business and personal) — all of them producing, sharing, and cultivating streams of memes. Clicking on one will take you to its last sharer, and within one to three clicks you are on someone’s business page, reading drivel while your page-click has made someone $0.03. It might not sound like much, but this is the model that drives Click Bait…
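The arithmetic is worth spelling out, if only to show why three cents is not as trivial as it sounds. The per-click value below is simply the figure used above; real advertising rates vary widely.

```python
# Illustrative only: scale the ~$0.03-per-click figure up to click-bait volumes.
REVENUE_PER_CLICK = 0.03   # the rough value quoted above, not a measured ad rate

for clicks in (1_000, 100_000, 1_000_000, 10_000_000):
    print(f"{clicks:>12,} clicks  ->  ${clicks * REVENUE_PER_CLICK:>12,.2f}")
```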

And Click Bait is probably a more serious problem than memes… but it has been so well covered over the last few years, there is little I can add to the conversation. I wrote this post because I have seen political memes increase on both sides of the proverbial aisle on my Facebook feed and thought I might convince some people that they are doing themselves and our society a real disservice by sharing such memes.

Stickers & Shoves — Generosity and Hypocrisy in the news

By Michael

Hypocrisy today is considered by most to be a serious sin, “the sin of pretending to virtue or goodness.” The word has an interesting etymology. It appeared as ὑποκριτά (hypokrita) when the Evangelists Matthew and Luke recorded Jesus’ sermons. Being a hypocrite in Christ’s time usually meant being an actor, taking on a different persona. Hypocrites have always ‘played the part,’ but thanks to Jesus admonishing those who pretended to be just, the word received a darker meaning. A hypocrite today is someone who lies about their morals, who pretends to be better than they are.

A New Study

In very recent news (in early November of 2015), there has been some schadenfreude in the secular press regarding a University of Chicago study. Loosely speaking, a small team of scientists found evidence that religious people are hypocrites. In the words of Dr. Jean Decety, the primary author,

A common-sense notion is that religiosity has a positive association with self-control and moral behaviors. This view is unfortunately so deeply embedded that individuals who are not religious can be considered morally suspect. In the United States, for instance, non-religious individuals have little chance to be elected to a high political office, and those who identify as agnostic and atheist are considered to be less trustworthy and more likely to be amoral or even immoral. Thus, it is generally admitted that religion shapes people’s moral judgments and prosocial behavior, but the relation between religiosity and morality is actually a contentious one, and not always positive.

I have several concerns about the truth of Dr. Decety’s statement and the conclusions of his study. Dr. Decety, who identifies as both French and American, was previously best known for his studies into the nature of empathy in his field of neuroscience. While I have great respect for his work and current position at the University of Chicago, I would like to post some of my reservations publicly.

In the first case, I feel that Dr. Decety is not challenging his own received premises adequately. For example, in the above quotation, Decety points out that non-religious citizens of the United States are unlikely to win public office. He suggests that the non-religious are considered to lack the positive associations with morality and self-control enjoyed by the religious. Unfortunately, he offers no proof for his feelings in this case. It seems to me that one could actually imagine many reasons why the United States has yet to put a non-religious person in power. First, the vast majority of Americans are religious, whether or not they belong to so-called “organized” religions. A non-religious person might hesitate to run for office when they do not represent that aspect of the American majority. The religious majority crosses race, gender, class, and other category boundaries. Second, many non-religious people of varying levels of fame have made public dismissals of religiosity, generally in rude statements of unveiled superiority. I would argue that the perception of poor morals cuts both ways. Some religious people may believe the non-religious to be amoral (a statement I do not accept in such blanket terms). However, one could find as many examples of the non-religious considering the religious to be worse: amoral and hypocritical.

Hypocrisy, Generosity, and Morals

My experience and life thus far inform a different point of view. Namely, all of humanity has the same capacity for hypocrisy. The non-religious have a similarly high call to morality from the universal principles of the Enlightenment, best enshrined, perhaps, in the ideals of secular humanism. If one only pretends to live up to the expectations of the community and the social contract, then one is a hypocrite. The religious are, of course, also susceptible to hypocrisy. Jesus Himself saw fit to call His disciples hypocrites.

39 And He spake a parable unto them, Can the blind lead the blind? shall they not both fall into the ditch? 40 The disciple is not above his master: but every one that is perfect shall be as his master. 41 And why beholdest thou the mote that is in thy brother’s eye, but perceivest not the beam that is in thine own eye? 42 Either how canst thou say to thy brother, Brother, let me pull out the mote that is in thine eye, when thou thyself beholdest not the beam that is in thine own eye? Thou hypocrite, cast out first the beam out of thine own eye, and then shalt thou see clearly to pull out the mote that is in thy brother’s eye.

I love the King James Bible and its rich imagery, though I make sure to use multiple translations when I read the Bible. The above passage from chapter 6 of Luke’s Gospel seems sound to me. Whether the image is of a mote versus a beam, sawdust versus a plank, or a speck versus a log, the point is quite plain: do not pretend to judge your fellows and, if you do so, know that you are a hypocrite. It is not a terrible state of affairs. Rather, it is the human condition — we are trying and inevitably failing to be like Christ. Just because it is nearly impossible to live up to the ideals of the sermon on the mount does not mean we should not try. I think one could argue that being labeled a hypocrite is very nearly a prerequisite for becoming a good Christian.

When the old law prescribed the practice of an Eye for an Eye, it was Jesus who suggested that when someone slaps you on one cheek, turn to them the other cheek also. When someone asks for your shirt, give them your coat, too. If you are forced to walk one mile, walk two with that person. Love your enemies — how stupid is all of this? So much of the modern era is the struggle between the logic of the world (which changes and shifts) and the logic of the next world (whose interpretation also shifts with the times).

When we get down to the details of the study, the results themselves are somewhat more complicated than many journalists suggested in their coverage, which, as usual, has been misleading and awful, serving only to inspire me to find out for myself. I do not suggest clicking any of the following links, but some of the headlines included:

Before we get to the study, I think that every single one of the above stories is wrong. The study found that religious children are slightly less generous than non-religious children. It found that religious children were more likely to view interpersonal violence as mean. It found that while some religious children were more likely to prescribe stronger punishments for such violence, others — notably Christian children — were less likely to ask for strict discipline. If that is the case, how do we get the above headlines?

Details of the study

The study involved 1,170 children, ages 5–12, from seven major urban centers around the world, including Chicago, Istanbul, and Guangzhou. More specifically, roughly 200 children came from each of six countries — only six because Turkey’s sample was divided between Istanbul and Izmir. The median age was approximately 8 years, and about 47% of the participants were female. Of these children, 280 were labeled Christian, 510 Muslim, and 323 non-religious.

The study consisted of two games for the children, alongside various demographic and categorizing surveys for the parents and children. In the first game, a variant of the “dictator game,” children were given a sheet of stickers and told to select ten to keep for themselves. Having done that, a researcher informed each child that they would need to share some of those ten with other children; there weren’t enough stickers to go around! This is considered a test of altruism, or the principle of giving at a cost to the giver. The largest conclusion of the study hinged on the fact that the non-religious children gave away an average of four stickers to their classmates, while the others gave away an average closer to three. Quoting the press release, “in the study, children growing up in households that weren’t religious were significantly more likely to share than were children growing up in religious homes.” I believe this is a confusing use of the word ‘significant.’ In statistics and related fields, “significance” measures how confident we can be that an observed difference is real rather than an accident of sampling, and that is the meaning in the quoted statement. The religious children in the study did, in fact, give fewer stickers. Did they give significantly fewer stickers? No: the difference was less than a single sticker. It is a “significant” difference in the statistical sense, because it is unlikely to be due to chance, but it is not a “significant” difference in the everyday sense, because it is a small one.
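To illustrate the two senses of “significant” with concrete (and entirely made-up) numbers, the sketch below simulates two groups of 500 children sharing stickers, with averages a bit above three and a bit above four. It is not the study’s data; it only shows that with samples this large a sub-one-sticker difference easily clears the statistical bar while remaining small in absolute terms.

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up sticker counts (0-10) for two groups of 500 children each.
religious     = np.clip(np.round(rng.normal(3.3, 2.3, 500)), 0, 10)
non_religious = np.clip(np.round(rng.normal(4.1, 2.3, 500)), 0, 10)

diff = non_religious.mean() - religious.mean()

# Welch t statistic: how many standard errors apart the two means sit.
standard_error = np.sqrt(religious.var(ddof=1) / len(religious)
                         + non_religious.var(ddof=1) / len(non_religious))
t_statistic = diff / standard_error

# Cohen's d: the same difference measured against the spread within the groups.
pooled_sd = np.sqrt((religious.var(ddof=1) + non_religious.var(ddof=1)) / 2)
cohens_d = diff / pooled_sd

print(f"difference in means : {diff:.2f} stickers")
print(f"t statistic         : {t_statistic:.1f}  (comfortably 'significant' with 500 per group)")
print(f"Cohen's d           : {cohens_d:.2f}  (a modest effect in everyday terms)")
```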

The second game of the study measured “moral sensitivity” by showing the children videos that depicted one individual pushing, bumping into, or aggressively shoving another individual. The researchers then asked the children two questions. First, the children were to judge the meanness of the actions on a set scale. Second, the children were to assign an appropriate punishment from a list of possibilities. The study found that religious children saw the various actions as meaner than the non-religious children did. With regard to punishment, Christian and non-religious children were too close in their answers to establish a difference, while Muslim children tended to hand out harsher punishments.

…paired comparisons showed that children in Muslim households judged interpersonal harm as more mean than children from Christian and non-religious households, and children from Christian households judged interpersonal harm as more mean than children from non-religious households. Moreover, children from religious households also differ in their ratings of deserved punishment for interpersonal harm; this was qualified by significantly harsher ratings of punishment by children from Muslim households than children from non-religious households. There were no significant differences between children from Christian households and non-religious households.

Children from Religious Households Judge Interpersonal Harm More Severely Than Children from Non-religious Households
Some problems with the study

There are many conflicting points within the study, from my point of view. In no particular order, I would first point out the small sample sizes in the study. For me to accept that religion teaches a lack of generosity, I would expect a study with tens or hundreds of thousands of students from similar backgrounds but different religious affiliations. If, in that study, the religious children shared only 3 of the 10 stickers and the non-religious children shared 7… that would be significant in both meanings of the word. However, if such a difference truly existed, we would not need a study of sticker-sharing to illustrate it.

Second, let us challenge the idea that these games adequately prove the conclusions of the scientists. Is the sharing of stickers a universally accepted and equivalent practice? Do stickers signify the same kind of toy, the same kind of gift, across the major urban centers of the world? I am doubtful that that is the case. Should we expect such disparate cultures to view bumps, pushes, and shoves in interpersonal behavior in the same way? We should not equate culture with religion. Moreover, there is the problem of those conclusions which were uninteresting to the scientists. Their study found that Christians were more likely to mete out less-strict punishments for actions they felt were more harmful. In the surveys given to the parents of the children, Christian parents were more likely to describe their children as sensitive to injustices in the world.

Third, the children were asked by adults to judge the actions of people they did not know in a series of videos. The study found that

religious children judged interpersonal harm as being meaner and deserving of harsher punishment than did children from non-religious households.

We do not have at our disposal the videos which the children watched. We do not know what kinds of punishments the children were offered to hand out to the transgressors. This is problematic because I believe the average person would want a child to recognize wrongdoing, quite apart from any desire to right wrongs through punishment. The researchers did not collect qualitative data, meaning they did not ask the children to explain themselves and their choices. The children’s numerical responses are the only measure of how “mean” they found the actions and how strict their punishments were. How can we account for the empathy of the children? To what degree did they feel for the person being pushed, as opposed to feeling vindictive toward the aggressor? Indeed, the researchers explicitly gave the children permission, even instruction, to judge total strangers.

Fourth, I have concerns with how adequately the study accounted for the backgrounds of the various children. The study claimed to “account” for the “country of origin,” but no specifics were given for that accounting.

To be more specific on this point, let us consider the numbers. I consider it safe to assume that the majority of the non-religious children (about 300 in total) came from China (which provided about 200 of the participants). Similarly, it seems likely that the majority of Muslim children came from Turkey and Jordan, while the Christian children likely came from Chicago, with sizable shares from Toronto and Cape Town. There were also minorities of Jewish and Hindu students included in the religious category, and I assume that Cape Town and Toronto provided the most diverse populations. In other words, there were some pretty huge cultural divergences apart from religious identity. Would it not be possible that Chinese children might have a greater likelihood of sharing their stickers with the community? And that this was not because they were non-religious but rather because the social fabric in China is quite different from that found in other countries, unrelated to the existence or nonexistence of Christian, Muslim, or Jewish communities?

Fifth, there is the issue of forgiveness. The study found, for example, that Christian children were more likely to view actions as “mean,” and to a greater degree. Yet, simultaneously, those same children were less strict in meting out punishments to the offenders. To reiterate: Christian children gave lighter punishments to those whom they identified as being deserving of punishment. However, the authors of the study did not highlight that finding — I would argue it is because it did not fit the larger anti-religion narrative they supported.

Having cast some doubts on the process and conclusions, I would close by criticizing later comments by Dr. Decety. Namely, he suggested a possible reason for the perceived lack of generosity among religious children: moral licensing. This theory suggests that when people perceive themselves as doing something good, they become less concerned about the consequences of immoral behavior. In this case, however, Dr. Decety implies that the children view their own religious identity as a kind of self-righteousness that allows them to be less generous. This goes against the study itself, which only identified the religious identity of the parents, so that the children were no more aware of their religious identity than they were of their ethnic, gender, class, or other identities. Here, I feel that overly broad conclusions were drawn from too little data. I also hesitate to expect convincing arguments about the altruism or kindness of various sets of people from a journal like Current Biology. Explaining that is as simple as pointing to my own research in the humanities: I think there are many questions which the hard sciences do not, should not, and cannot ask or attempt to answer.

The good news from this study is clear enough to me: all children share! The amount by which they show their generosity increases with age and does not differ significantly between various religious affiliations. In my opinion, this study offers examples of the variations in generosity and forgiveness between groups of people, but offers no significant evidence for differences between religious and non-religious people.

Tense Decisions

By Michael

Writing is very difficult. Writing will be very difficult. Writing has been very difficult.

The English language allows for a very complex set of time-stamped situations. For the purposes of illustrating that point, I will place my verbs, modal verbs, and phrasal verbs in bold throughout this post. In combination with hypothetical situations, the speaker or writer uses something called the Conditional, in its myriad forms, to create intricate descriptions of what-ifs, might-have-beens, and counter-to-facts.

Had I only been there a moment sooner, everything would have turned out for the better.

Writing about history, then, would seem to doom the writer to an eternity of wading through the swamp of tenses, filled with eddies and shifting currents of whens and who-said-whens. So, when should a writer use the past tenses versus the present tenses? While common sense instructs one to use the present to write about the present and the past to write about the past, there are many conventions that demand otherwise. This post contains some observations gleaned from writing my dissertation.

Dissertation Writing

Maintaining concentration is a problem on more than one level when writing my dissertation. I cannot know how widespread this problem might be among other PhD students. But maybe, if I describe it here, other students could benefit. This is not a post about dissertation-writing secrets, plans, or advice. There are plenty of other places on the internet, and through university counseling centers, to get others’ opinions in that regard. This post is more about a brain-challenge, a thought-hurdle that stands in my way.