I know, I know; the actress in the first photo is a bit older than the actress in the second.
Here’s an interesting poll on the subject:
Over the past decade, Americans have clustered into three broad groups on global warming. The largest, currently describing 39% of U.S. adults, are what can be termed “Concerned Believers” — those who attribute global warming to human actions and are worried about it. This is followed by the “Mixed Middle,” at 36%. And one in four Americans — the “Cool Skeptics” — are not worried about global warming much or at all.
The “interesting” part is that, except for that “mixed middle” group (which has less education than the others), the groups don’t differ in their amount of education. Also, men are much more likely to be in that “cool skeptic” group (as are Republicans, which is no surprise at all—and men are more likely to be Republicans anyway).
Also interesting is the following chart, which tracks changes over time. You can see how the “cool skeptics” group has grown and the “mixed middle” shrunk. The “concerned believers” segment has had a few ups and downs, but has ended up pretty much where it began:
According to Peter Burrows at Business Week, Stanford law professor Joseph Grundfest remarked, when Brendan Eich was forced to resign as CEO of Mozilla, “This is a particularly fascinating situation, because it involves an illiberal reaction from a very liberal community.”
The “very liberal community” of which Grundfest was speaking was either the Silicon Valley computer industry or the gay activist movement, or possibly both. But his use of the word “liberal” in this context betrays a misunderstanding of the double meaning of the term. Yes, these communities call themselves “liberal” or even “progressive.” But their liberalism breaks down into two opposing groups: one that espouses PC thought and considers conformity to it necessary for right liberal thinking, and one that values individual liberty above those concerns.
The first group was in the forefront of the anti-Eich forces. The second includes those who embraced and promoted the legalization of gay marriage and yet were made uneasy by the Eich witch hunt and its success. They do exist; I saw them on discussion boards back when the Eich controversy was at its height. However, modern liberalism, sadly, seems to contain far more of the former group than the latter, and there is hardly anything so illiberal as a “liberal” bent on stamping out opposing thought.
How could Grundfest have made such an error (and perhaps he didn’t; I’ve not been able to locate his quote in fuller context)? After all, he’s a member of the American Enterprise Institute and the Hoover Institution, and ought to understand full well what so many “liberals” are about.
Curiously, this short article from 1989 identifies Grundfest as a “conservative Democrat.” That phrase alone should tell us how profoundly times have changed since then; the appellation “conservative Democrat” would be oxymoronic today.
SCOTUS has ruled 6-2 (Kagan recused) that Michigan’s law against race-based affirmative action is constitutional.
In other words, it’s not racial discrimination to ban the sort of racial discrimination that is supposedly designed for the purpose of redressing racial discrimination.
Mind-boggling that the question even comes up. As Justice Scalia, joined by Thomas, wrote:
It has come to this. Called upon to explore the jurisprudential twilight zone between two errant lines of precedent, we confront a frighteningly bizarre question: Does the Equal Protection Clause of the Fourteenth Amendment forbid what its text plainly requires? Needless to say (except that this case obliges us to say it), the question answers itself. “The Constitution proscribes government discrimination on the basis of race, and state-provided education is no exception.” Grutter v. Bollinger, 539 U. S. 306, 349 (2003) (SCALIA, J., concurring in part and dissenting in part). It is precisely this understanding—the correct understanding—of the federal Equal Protection Clause that the people of the State of Michigan have adopted for their own fundamental law.
Justices Ginsburg and Sotomayor strongly disagree. Professor Jacobson writes at Legal Insurrection:
Here’s how Justice Sotomayor framed the issue: Taking away racially sensitive admissions uniquely harms those who benefit from that sensitivity…This is, as Kurt Schlichter calls it, essentially a ratchet theory, that no preference ever can be rolled back otherwise the rollback is discrimination.
I would add that Justices Sotomayor, Thomas, and Ginsburg all have personal experience with affirmative action, or the lack thereof in Ginsburg’s case. Sotomayor and Thomas have acknowledged benefiting from affirmative action, although Thomas has indicated he felt it meant that people doubted his credentials for getting into law school. As for Ginsburg, who is older and attended law school before affirmative action existed, she was discriminated against when she tried to get a job in law despite a stellar academic record.
Thomas’ statements about his experience with affirmative action have been especially powerful:
When Thomas applied to Yale Law School, his race was taken into consideration. He wrote in his book, “I asked Yale to take that fact into account when I applied, not thinking that there might be anything wrong with doing so.”
But Thomas says that after he graduated from Yale, he went on several job interviews with “one high-priced lawyer” after another and the attorneys treated him dismissively. “Many asked pointed questions, unsubtly suggesting that they doubted I was as smart as my grades indicated.”
The fact that he couldn’t get a job would shape his thoughts on affirmative action programs for years to come. Thomas wrote, “Now I knew what a law degree from Yale was worth when it bore the taint of racial preference. I was humiliated—and desperate.”
In his interview with ABC News, Thomas said he was unable, even when he was nominated to the Supreme Court, to erase the stigmatizing effects of racial preference. “Once it is assumed that everything you do achieve is because of your race, there is no way out,” he said. “…it is irrebuttable and it is proved to be true. In everything now that someone like me does, there’s a backwash into your whole life is because of race.”
I am certainly not suggesting a one-on-one relationship between any of these justices’ positions on affirmative action and their own experiences with it; their viewpoints are in line with their general liberal/conservative orientation. But I do find their experiences interesting. My own personal reaction to affirmative action, back when I was a liberal Democrat and it first came into play, was antipathy on the order of “two wrongs cannot make a right.”
[NOTE: In reading the article about Ruth Bader Ginsburg's life, this caught my eye:
She credits another professor at Cornell, Vladimir Nabokov, with influencing her reading habits and writing style. “He loved words … the sound of words. … Even when I write an opinion, I will often read a sentence aloud and [ask,] ‘Can I say this in fewer words—can I write it so the meaning will come across with greater clarity?’”
I can’t say I ever saw a connection between Ginsburg’s prose and Nabokov’s. Nabokov was a wonderful stylist, but he was certainly not known for saying things in “fewer words.”
Having read Nabokov’s beautifully controlled and atmospheric memoir Speak, Memory, I recall that his father, whom he highly respected and loved, was a well-known law expert in Russia before the revolution (and I see looking here that his grandfather was involved with law as well, as Justice Minister during the reign of Alexander II).
I don’t have Vladimir Nabokov’s memoir in front of me right now so I can’t quote it. But I remember that, in the wonderful chapter devoted to his father, he praised his father’s ability to write clearly and succinctly in first draft and compared it favorably to his own meanders and convoluted crossings-out while in the act of composition.]
From the Nadezhda Mandelstam chapter of Clive James’ excellent Cultural Amnesia [in the following excerpts I have Americanized the British spelling]:
The main difference [between the Gulag and Hitler's Reich] was that in Nazi Europe the victims knew…who they were, and eventually came to know they were doomed. In the Soviet Union, the bourgeois elements could not even be certain that they were marked down for death. Like Kafka’s victims in the Strafkolonie, they were in a perpetual state of trying to imagine what their crime might be. Was it to have read books? Was it to have red hair? Was it (the cruelest form of fear) to have submitted too eagerly? Other versions of the same story came out of China, North Korea, Romania, Albania, Cambodia. The same story came out of the Rome of Tiberius, but the twentieth century gave something new to history when societies nominally dedicated to human betterment created a climate of universal fear. In that respect, the Communist despotisms left even Hitler’s Germany looking like a throwback. Hitler was hell on earth, but at least he never promised heaven: not to his victims, at any rate. It’s the disappointment of what happened in the new Russia that Nadezhda [Mandelstam] captures and distills into an elixir.
A little later in the essay he writes:
Quite early in the regime’s career of permanent house cleaning—certainly no later than Lunacharsky’s crackdown on the avant garde in 1929—anyone stemming from the pre-revolutionary intelligentsia was automatically enrolled along with remnants of the bourgeoisie in the classification of “class enemy.”…Civilized articulacy was as deadly a giveaway as soft hands…Eventually any kind of knowledge that had been acquired under the old order was enough to mark down its possessor. Just as Pol Pot’s teenage myrmidons assailed anyone who wore spectacles, so the Soviet “organs” discovered that even a knowledge of engineering was a threat to state security…Any field of study with its own objective criteria was thought to be inherently subversive. Given time, Stalin probably would have applied the Lysenko principle to every scientific field. To this day, scholars puzzle over the reasons for Stalin’s purging the Red Army of its best generals in the crucial years leading up to June 1941, but the answer might lie close at hand. The fact that military knowledge—strategy, tactics, and logistics—was a field of data and principles verifiable independently of ideology might have been more than enough to invite his hatred. In attacking his own army, of course, Stalin came close to demolishing the whole Soviet enterprise. But at the center of the totalitarian mentality is the fear that the internal enemy might be unapprehended…
[Nadezhda Mandelstam] does believe that there is such a thing as independent moral judgement, a quality in perfect polarity with the regime, which can’t tolerate the existence of independent moral judgement, and indeed has come into being specifically so as to eliminate all such values.
You can see for yourself the relevance to our own times. Moral relativism, the destruction of traditional values, the hegemony of PC thought over facts and knowledge—it’s all there. All but the camps. But are camps even needed, when control of so many of the institutions is already this thorough?
Which doesn’t mean that camps—the natural progression of leftist thought—won’t come some day. But I have long thought we’re headed the Chavez rather than the Soviet way. Or in other words, somewhat more Brave New World than Nineteen Eighty-Four.
[NOTE: For an example of the sort of thinking that's become rife in academia and that dovetails quite nicely with this post, see this:
Harvard student Sandra Y.L. Korn recently proposed in The Harvard Crimson that academics should be stopped if their research is deemed oppressive. Arguing that “academic justice” should replace “academic freedom,” she writes: “If our university community opposes racism, sexism, and heterosexism, why should we put up with research that counters our goals simply in the name of ‘academic freedom’?”
In other words, Korn would have the university cease to be a forum for open debate and free inquiry in the name of justice, as defined by mainstream liberal academia.
Unfortunately, this is already a reality in most universities across America, where academics and university administrators alike are trying, often successfully, to discredit and prohibit certain ideas and ways of thinking. Particularly in the humanities, many ideas are no longer considered legitimate, and debate over them is de facto non-existent. In order to delegitimize researchers who are out of line, academics brand them with one of several terms that have emerged from social science theory.
I wonder whether the self-righteous Ms. Korn is even aware of whose footsteps she's following in.]
…do something about this?
It seems a lot more imminent than global warming. Plus, it’s supposedly a relatively simple and not-too-expensive fix.
Bob Shrum is going for his ninth presidential campaign loss. The first eight were in his capacity as a Democratic political consultant. The ninth will be as unofficial advisor to the GOP in 2016.
This longtime partisan and dedicated Democrat is sharing some of his wisdom with Republicans in an article entitled “Why the GOP Needs a Return to the Bush Leagues,” the gist of which is that the Republicans have no good alternatives for 2016 except Jeb Bush. Not because Bush is so great, but because everyone else is so exceedingly dreadful.
Shrum’s piece falls into that particular genre of political writing I think of as “helpful hints from your enemy.” Does he really imagine that anyone in the GOP is listening to him and would take his advice? If so, they’re even dumber than I think they are.
What’s the point of Shrum’s writing such an article? To raise Jeb Bush’s profile in the public mind. To put down all the other possibilities. To bring a smile to Democrat faces.
Nothing much happens to me when I get hungry except that I get hungry.
But some people are different: they get angry and pick fights.
Indeed. I know that quite well because I’ve been closely and even at times intimately involved with people who do just that. It took me many years to realize what was happening. What was all this moodiness about, and these sudden flashes of rage? Hunger didn’t account for all of it, of course, but it certainly was correlated—so much so that I finally learned, after years of puzzlement, to ask a simple question whenever the unprovoked peevishness occurred, “Are you hungry?”
And now I’m happy to learn there’s a word for it: “hangry.” Solution? Eat more often—duh!
[NOTE: This is a repost from Easters past. But it still works for me.]
Happy Easter to all my celebratory Christian readers, and to all those who just enjoy the holiday as well!
One year when my son was little, I spent the week prior to Easter blowing out eggs and dyeing them. Now that he’s grown and away, the eggs are packed away in boxes and stored in parts unknown. If I could get my hands on them I’d photograph them for you, because even all these years later they are beautiful, with dyes both subtle and unsubtle, interesting etched patterns and rainbow effects—definitely one of my finest crafts hours (to tell the truth, I didn’t have so many fine crafts hours, although there was also a gingerbread house we made that was stored in the attic and, alas, eaten by small creatures—and not human ones, at that).
Blown-out eggs are well worth the trouble, and why? Because they last. And nothing eats them. You only have to make them once, and you’re all set. They are a bit fragile, but not so very.
So here’s my Easter present to you (not that you couldn’t find the information yourself)—some instructions for blowing eggs, from a link that has disappeared since I first wrote this post:
First, you’ll need to make a tiny pin hole on each end of the egg. A pin works well, or a wooden kitchen skewer or even the tip of a sharp knife. Gently work the tip of the pin/skewer/knife in a circular motion until a tiny hole appears. Repeat on the other side. Then insert the pin or skewer (the knife will be too big here) far enough into the egg to break the yolk. Use your mouth [blow] to expel the contents of the egg.
And here is a more complex–but perhaps better–way, for those obsessive-compulsives among us.
These aren’t mine, but they’ll have to do as substitute:
There is only one jelly bean worth eating at Easter or any other time of year.
No, not those weirdly flavored “gourmet” Jelly Bellys (I consider the term “gourmet jelly bean” to be an oxymoron). The traditionalist in me abhors them, despite Reagan’s reported fondness. As for those jelly beans placed on the endless supermarket aisles of Easter treats that tempt us from Valentine’s Day until tomorrow—when the remnants go on sale and those get scarfed up as well—the vast majority should not be consumed by anyone above the age of four. Maybe not even by anyone below the age of four.
What should? I submit these, which are a tad more expensive but probably will not break the bank:
Traditionally fruit-flavored, made with smooth and succulent pectin, with a lovely and slightly translucent sheen, they go down easy. Maybe too easy; it is possible to eat quite a few before realizing what’s happening. Take it from one who knows.
How did jelly beans come to be associated with Easter? It seems a no-brainer because of their egglike shape, but apparently the tradition didn’t really get going until the 1930s. Jelly beans are far older than that, however, making their debut as the confection promoted by Schrafft of Boston for sending to Union soldiers during the Civil War (a crafty man, that Schrafft).
A little-known jelly bean fact (at least to me) is that, “in United States slang in the 1910s and early 1920s a ‘Jelly bean’ or ‘Jellybean’ was a young man who made great efforts to dress very stylishly, presumably to attract women, but had little else to recommend him…The word was also used as a synonym for pimp.”
Returning to the actual candy, I offer a caveat: there is hardly anything worse than the shock of thinking you’re biting into a normal fruit-flavored jelly bean and getting a spicy one. They should be identified by special markings, like those insects that are bad to eat, as a warning to others. I suggest racing stripes.
But if you buy the Russell Stovers, there’s no need to be on the spice alert. And remember: Monday the sales begin! Although, come to think of it, it’s a sign of this particular jelly bean’s superiority that not only are they generally available year-round, but at most stores they are exempted from the post-Easter markdowns. They’re that good.
[NOTE: This is a repost.]
No one should wear this dress. It’s just not possible to compete with it:
Perhaps the only proper use of the dress would be as camouflage in a scene like this:
Here are some cautionary tales about the strength and possible danger of legal marijuana edibles, whether sold as medical marijuana or, in states like Colorado that have legalized general marijuana sales, as recreational products:
Last year, the poison center run by Bronstein received 126 calls concerning adverse reactions to marijuana. So far this year — after pot sales became legal on Jan. 1 — the center has gotten 65 calls. Bronstein attributed the spike to the higher concentrations of THC in marijuana that has become available.
Although millions of Americans have used pot without becoming violent, Bronstein said such behavior is possible depending on the type of hallucinations a user experiences. Toxicologists say genetic makeup, health issues and other factors also can make a difference.
“With these products, everybody is inexperienced,” Bronstein said. “It’s the first time people have been able to buy it in a store. People need to be respectful of these products.”
I’ve witnessed this firsthand, because I know people who use medical marijuana and I’ve observed its incredibly strong effect when ingested. Back in the ’60s, marijuana was almost always far weaker than it is today. But apparently, officially sanctioned and state-controlled marijuana can be a lot stronger even than the current crop of illegal marijuana, which was already stronger than in the past:
The two recent deaths [in Colorado] have stoked concerns about Colorado’s recreational marijuana industry and the effects of the drug, especially since cookies, candy and other pot edibles can be exponentially more potent than a joint.
There’s no question that other drugs—including prescription drugs—can and are abused in dangerous ways. The same is true of alcohol, of course. But from what I’ve seen of legal marijuana edibles, they are uniquely positioned to have maximum appeal to children, and also to be unwittingly over-consumed:
Twenty-six people have reported poisonings from marijuana edibles this year, when the center started tracking such exposures. Six were children who swallowed innocent-looking edibles, most of which were in plain sight.
Five of those kids were sent to emergency rooms, and two to hospitals for intensive care, Bronstein said. Children were nauseous and sleepy, and doctors worried about their respiratory systems shutting down…
“One of the problems is people become very impatient,” Bronstein said. “They eat a brownie or a chocolate chip cookie and they get no effect, so then they stack the doses and all the sudden they get an extreme effect that they weren’t expecting.”
Plus, chocolate candy just plain tastes good. The marijuana-laced ones look so innocuous, just like a Lindt truffle. So if one is good, why not a few more? But these things pack a mighty, mighty wallop.
Previously a lifelong Democrat, born in New York and living in New England, surrounded by liberals on all sides, I've found myself slowly but surely leaving the fold and becoming that dread thing: a neocon.