TidBITS: iMac G5: Up In Smoke
Apple has two problems with its iMacs - unreliable power supplies and capacitor failures. Both are described in detail by an experienced Mac journalist who had to replace his iMac. The worse news is that the iMac's automated diagnostics came up with the wrong answer -- and Apple's customer service is now pretty darned awful.
I suspect the new iMacs will be safe to buy, but I wouldn't touch any of the existing models. Shame on Apple for following the stonewall strategy; shame that it usually works.
Monday, April 25, 2005
Building a police state, one day at a time
Salon.com | A "volunteer" police state
A group of non-Bush supporters are prevented from joining a Bush social security event by "secret service agents" -- apparently because of a "no blood for oil" bumper sticker. The event turns out to have been publicly funded. The agents may, or may not, have been officially 'secret service'.
This was standard operating procedure during Bush campaign events. There must never be any critics near Bush.
It's not surprising, but for the sake of historians it's good to document these things.
Identity theft: a summary of recent cases
ID theft is inescapable | Channel Register
A fairly good summary of recent identity theft related crimes. The deluge is coming. Ahh well, it's our fault for electing dolts to govern us.
Since government isn't going to help us, we need some kind of a Libertarian solution. I wish I could think of one.
Sunday, April 24, 2005
Great awakenings: Number Four in 2015?
Second Great Awakening - Wikipedia, the free encyclopedia
One popular theory on great awakenings is that they occur during times of profound transformation. They're a kind of counter-revolution, though the 2nd GA was thought to have mutated so as to eventually support rather than oppose the transformations of the 1830s.
The 1830s were shockingly turbulent times. Across the US the Euro invasion was in full force, with new lands being overrun daily. Communities were very transient, growing and vanishing with a speed seen only in China today. Clerics went from 30-year tenures to a mere 4-year visit. The industrial revolution was gearing up: steam engines, agricultural transformation, the cotton gin, new weapons. Darwin was doing his basic research (though he'd not yet published). The American economy was going through boom and bust.
So it's easy, in retrospect, to see why religious ecstasy swept through various regions, leaving behind burned-out zones of exhausted spirituality. It took about 30 years for the 2nd GA to run its course -- smack into the Civil War.
So are we in a 4th GA? That's the great question. Perhaps our times are not yet turbulent enough; but it is likely they will get there. Given our current burst of fundamentalism, and given the rough waters ahead, I think a 4th GA is quite likely in the next two decades. Perhaps it's time to resign oneself to President Frist ...
A thoughtful discussion of millennial American fundamentalism
NewDonkey.com: The Tribulations of "Revelations"
There's too much dense thinking here to excerpt. This is simply a very thoughtful and nuanced discussion of early 21st-century American fundamentalist millennialism. His thesis is that the movement has been co-opted by a peculiar secular agenda. I'm not sure the two are so distinct; the concept of God's blessing being fundamentally material is older than the Book of Job and not at all areligious.
You too have the Alzheimer's Process
Entrez PubMed: Search related to stress and mild cognitive impairment
Middle-aged? Had trouble remembering the name of that troublesome Senator? Oh yes ... Frist.
Sound familiar? Of course it's worse when you're low on sleep, stressed, etc. Really, it's quite normal. Your friends and colleagues, with a few remarkable exceptions, are similarly afflicted. Exceptionally productive individuals, with great contributions ahead of them, have trouble with distraction and focus in middle age. They'll use date books, calendars, task lists and more -- things they didn't use to need. They'll use Google to find missing words and to remind themselves of things, avoid learning new confusing and complex software, lose track of documents (full-text search is handy), put the umbrella on the door handle, put the car keys on the item that must not be left behind.
Adaptations.
It doesn't help, of course, that middle-aged life can be exceedingly full. Complex family needs, complex parental obligations, complex finances, lots of stress, increasing responsibility at work as one nears the Peter Principle (what's that word, ahh, this one will do) asymptote, a brain overloaded with memory, experience, associations.
Understandable. Normal. Typical. So you can stop reading now. (The following is slightly speculative. I find it oddly reassuring, but most won't.)
You should stop reading because, when you pin an Alzheimer's specialist or researcher in a dark corner, and beat them with a rubber hose (actually, a beer works too), they'll confess that it's probably Alzheimers (omit the apostrophe, it's a pain). Not the disease really, because a disease suggests something exceptional or atypical. This process is universal and lifelong, starting probably in the teenage years. What we call Alzheimer's "Disease" is probably "normal" aging of the brain. Just as we know aging rates vary with stress (telomeres shorten with stress) and genetics, so too does brain aging likely vary with stress, injury (concussion especially) and various unidentified environmental impacts.
The timing of symptomatic manifestations (i.e. disease), from forgetfulness to dementia, is determined by a combination of fundamental reserve (how much does one start with at age 19 [1]), biological brain age, and adaptive techniques. Some people become work-disabled in their fifties; a lucky few will be able to carry out their routine daily life readily until they're over 100. Most of us will be substantially impaired in our late 70s, and will be work-impaired in our 60s.
So, now that you understand this (Phew ... I won't get Alzheimer's! I already have it ...), how should you use this knowledge? (I told you not to read this far.)
Well, we probably ought to talk about it. Unless we can slow the fundamental aging of the brain (there's hope, actually) we shouldn't expect to raise the retirement age much. As the boomers age, we may also want to consider environmental adaptations in the workplace, such as simpler and more reliable software, better search tools, no instant messaging (sorry young-uns), less email, fewer interruptions, and brainless meetings scheduled for the afternoon only (when middle-aged brains are sluggish anyway) ...
For the truly tough (life is not for wimps), it might be useful to (I suggest doing this privately) track one's decline and plot out the point when one will have to switch careers or make other accommodations. There's a business opportunity here for a web site that would provide anonymous yearly testing and maintain a projection curve. It will be a while before most of us are ready for this step. Denial is not a bad thing.
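For the curious, here is a minimal sketch of what such a site's projection curve might compute, assuming nothing fancier than a linear fit. The function name, scores, and threshold below are all invented for illustration; a real service would have to deal with much messier data.

    # Hypothetical sketch: fit a linear trend to yearly cognitive test
    # scores and estimate when the trend crosses an impairment threshold.
    # All numbers are made up.
    def year_crossing_threshold(years, scores, threshold):
        # Ordinary least-squares line through (year, score) pairs.
        n = len(years)
        mean_x = sum(years) / n
        mean_y = sum(scores) / n
        slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, scores))
                 / sum((x - mean_x) ** 2 for x in years))
        intercept = mean_y - slope * mean_x
        if slope >= 0:
            return None  # scores flat or improving; nothing to project
        return (threshold - intercept) / slope  # calendar year of crossing

    # Five invented annual scores, drifting downward.
    print(year_crossing_threshold([2001, 2002, 2003, 2004, 2005],
                                  [100, 99, 97, 96, 94], threshold=85))
    # -> about 2011

The regression is trivial; the hard parts would be test-retest noise, practice effects, and whether anyone actually wants to see the answer.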
Oh, and maybe we'll finally get a usable and reliable PDA -- one made to fulfill Engelbart's dream of the cognitive extender (aka crutch).
Meanwhile, be supportive of Alzheimer's Disease research. It's not just for your parents ...
[1] If the Alzheimer's process is global, but initial reserves are asymmetric, then one would expect symptomatic disease to be likewise asymmetric. So if one starts with strong visual-spatial reasoning but weak categorical memory, the memory will be the first to go. One would also expect persistent strengths to compensate for focal deficits, just as they do in developing brains.
Orrin Hatch - the arational defender of steroid abuse
The New York Times > Opinion > Editorial: Muscle Flexing in Congress
By all accounts, Senator Orrin Hatch, the Utah Republican who is a proven power player in defending his home state's diet supplement industry, managed to keep DHEA on the shelves of nutrition centers. DHEA has been widely banned for Olympic and professional athletics. But an investigative report by The Times detailed the supplement's survival in the marketplace - even though it metabolizes into testosterone products banned under the law.
Senator Hatch defends DHEA as a special case, as an 'anti-aging pill' that deserves to be legal. Capitol Hill negotiators who saw no such virtue complain that Senator Hatch baldly threatened to block the entire steroid control proposal unless DHEA was exempted.
DHEA's Washington lobbyists happen to be the senator's son and a former longtime staff aide to Mr. Hatch. Asked whether he'd been lobbied by his son, Senator Hatch replied, 'Not that I know of.' Actually, the senator is such an aggressive defender of the supplement industry that lobbying him is a redundancy. Lawmakers should look to their bench in embarrassment and reconsider the exemption.
Hatch has been very consistent about this sort of thing for over 20 years. I don't think he's particularly corrupt; he's a simple man with some persistent delusions. Oh, and he has a lot of power. Blame it on Utah.
Update 4/24: After writing this, another angle occurred to me. Orrin Hatch is not a young man. He believes DHEA is an "anti-aging" drug and he believes it is harmless. It is thus rather likely that Hatch uses DHEA himself. Since most quality physicians would consider this a harmful act, Hatch may qualify as a steroid abuser. In particular, he may have become dependent on the mildly euphoric qualities of oral steroids. He's unlikely to approve legislation that may lead to limited access to his drug of choice.
Enlightened hedonism: tsunami donation by tourism
The New York Times > Travel > After the Tsunami, Rebuilding Paradise
Want to help countries recover from a natural disaster? Be a noble hedonist.
...The hardships of the Thai people seemed to be on the minds of visitors who sat in the lounge chairs along the [Phuket] beach.
'That's the reason we came now,' said Gordon Brind, 51, who was there in late March on vacation with his family from Britain. 'We were here last year and we decided to come again after the tsunami. Everyone was donating in the U.K. to tsunami funds, and in other countries, too, I'm sure. But the main part of it, really, is that they must have work to live.'
Enlightened hedonism is the best donation. On the other hand, earthquake risk will remain elevated until there's silence for a time (Bayesian analysis). Hmmm.
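That "(Bayesian analysis)" aside is compressed, so here is a minimal sketch of the reasoning, assuming a toy Gamma-Poisson model with invented numbers (none of this comes from the article):

    # Toy model: yearly great-quake counts are Poisson(rate), with a
    # conjugate Gamma(alpha, beta) prior on the rate. Invented numbers.
    def posterior_rate(alpha, beta, yearly_counts):
        # Gamma-Poisson update: posterior mean of the quake rate.
        alpha_post = alpha + sum(yearly_counts)  # add observed events
        beta_post = beta + len(yearly_counts)    # add years observed
        return alpha_post / beta_post

    print(posterior_rate(0.1, 1.0, [1]))           # quake last year: 0.55/yr
    print(posterior_rate(0.1, 1.0, [1, 0, 0, 0]))  # then 3 quiet years: 0.22/yr

Toy numbers aside, the shape of the argument holds: a recent event pushes the estimated rate up, and only a stretch of silence pulls it back down toward the prior (0.1/yr here).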
The Bush budget's "sunset commission"
RollingStone.com: Bush's Most Radical Plan Yet
If you've got something to hide in Washington, the best place to bury it is in the federal budget. The spending plan that President Bush submitted to Congress this year contains 2,000 pages that outline funding to safeguard the environment, protect workers from injury and death, crack down on securities fraud and ensure the safety of prescription drugs. But almost unnoticed in the budget, tucked away in a single paragraph, is a provision that could make every one of those protections a thing of the past.
The proposal, spelled out in three short sentences, would give the president the power to appoint an eight-member panel called the "Sunset Commission," which would systematically review federal programs every ten years and decide whether they should be eliminated. Any programs that are not "producing results," in the eyes of the commission, would "automatically terminate unless the Congress took action to continue them."
Bush did something similar in Texas. The commissions were made up of people opposed to the agencies that regulated them; astonishingly, they eliminated their enemies. The annoying thing is not so much that Bush wants to return America to the pre (Teddy) Roosevelt era, but rather that he's so sneaky and underhanded about how he operates.
The Bush method relies upon a supine or dysfunctional media, a media more focused on crowd-pleasing sideshows than on radical transformations of government. By hook or by crook (or both), Bush has the media he needs. This administration has mastered the fundamental art of magic -- distracting with the right hand while the left hand does the real work.
Well, the ten people who read this blog, and the thirty that read Rolling Stone, now know. I doubt any voted for Bush.
Saturday, April 23, 2005
Uplifting the dog
Print: The Chronicle: 4/15/2005: Clever Canines
With the remarkable exception of a Vernor Vinge novel, dog-like critters don't get much play in science fiction. Dogs don't get no respect. So when David Brin wrote the "uplift" series about cognitively enhanced cetaceans, he didn't mention dogs. Meanwhile, in the real world, canines are being "uplifted"...
I've written about this before. The canine is a very interesting genus, remarkably adaptive to its host (us). We may best understand ourselves by better understanding them. Sometime I must write about my not-entirely-in-fun theory that dogs created civilization by allowing women and geeks to defend themselves against the alpha male.
Clever Canines
Did domestication make dogs smarter?
By COLIN WOODARD
Budapest, Hungary
Vilmos Csányi's department has literally gone to the dogs. Canines have the run of the place, greeting visitors in the hall, checking up on faculty members in their offices, or cavorting with one another in classrooms overlooking the Danube River, six floors below.
And, not infrequently, they go to work in the laboratories, where Mr. Csányi and his colleagues are trying to determine just how much canine brains are capable of...
... Mr. Csányi's team has been studying canine cognition for the past decade and, in the process, has built a body of experimental evidence that suggests dogs have far greater mental capabilities than scientists have previously given them credit for. "Our experiments indicate a high level of social understanding in dogs," he says.
In their relationship with humans, dogs have developed remarkable interspecies-communications skills, says Mr. Csányi. "They easily accept a membership in the family, they can predict social events, they provide and request information, obey rules of conduct, and are able to cooperate and imitate human actions," he says. His research even suggests that dogs can speculate on what we are thinking.
The latest findings to come out of the department suggest that dogs' barks have evolved into a relatively sophisticated way of communicating with humans. Adam Miklósi, an ethology professor, set out in a recent experiment to see if humans can interpret what dogs mean when they bark. He recruited 90 human volunteers and played them 21 recordings of barking Hungarian mudis, a herding breed.
The recordings captured dogs in seven situations, such as playing with other dogs, anticipating food, and encountering an intruder. The people showed strong agreement about the emotional meaning of the various barks, regardless of whether they owned a mudi or another breed of dog, or had never owned a dog. Owners and nonowners were also equally successful at deducing the situation that had elicited the barks, guessing correctly in a third of the situations, or about double the rate of chance.
... dogs' interest in communicating with humans to solve problems appeared to be innate, probably an evolutionary byproduct of their domestication, says Mr. Csányi...
I've only known one dog very well. I never got the feeling that she was sentient in the same sort of way I think I am, but she was certainly a person.
An interesting overview of Cyc and an update on the AI agenda
New Scientist Whatever happened to machines that think? - Features
If you believe humans think (debatable, interestingly), and if you believe humans don't contain supernatural elements (souls [1]), then humans are biological thinking machines. Hence thinking machines. Since humans routinely create humans, we can create thinking machines.
So the interesting question becomes, can humans create non-human thinking machines, perhaps a mixture of the biologic and abiologic?
I bet yes. But, just as with Peak Oil, I can't say when. Probably within 100 years.
I tend to think it will be a very bad thing for my grandchildren, but that's just a hunch. I hope it won't be a very bad thing for my children. If I thought a 2nd Christian/Muslim Fundamentalist Dark Ages would delay this development, I might be a Bush supporter.
Alas, the competitive advantages of thinking machines are so great I can't imagine anything short of the annihilation of all human civilizations everywhere significantly delaying their appearance. That is 'destroying the village in order to save it' -- so I don't support the Bush/bin Laden agenda.
[1] Philosophical arguments against "strong AI", such as Searle's "Chinese Room", are essentially arguments for the existence of the soul, and thus for the existence of a deity. So "strong AI" debates, like the Fermi Paradox, are "big question" topics.
MetaFilter "peak oil" update
April 22: Earth Day or Peak Oil Day? | MetaFilter
A good update on where we are with respect to Peak Oil. The Simmons links, including this one, are quite good (btw, he incidentally notes what the ANWR debate is really about -- not the publicly stated expectations, but rather the dream/nightmare that very large reserves will be found, the extraction of which would likely have devastating local consequences).
I've been a "Peak Oil" guy since I did my environmental engineering studies at Caltech in 1980. The question is, of course, not "if", but rather "when", and "how" the transition will be made. Back then we thought the 1990s or so, but it looks like it will be somewhere between 2005 and 2020. I'm cautiously optimistic that rich countries would cope; I fear for the less rich nations. Of course in the post-9/11 world it ought to be obvious that if Peak Oil causes social disruption in poor nations, the rich will not escape unscathed.
Now if that odd research on fusion-in-a-bottle works out ...
Update 5/4/05: The April 30, 2005 issue of The Economist did a review of the oil industry, and specifically addressed the Peak Oil question. They are oil optimists. They don't actually make a prediction about when Peak Oil will occur, but one is left with the impression that The Economist thinks oil will go out of fashion before a maximal production level is reached. Certainly nothing before 2030! So those looking for anti-Peak Oil ammunition have a readable resource for their arguments.
How will American "Christians" respond?
. . . Smearing Christian Judges (washingtonpost.com)
...The present war within the Christian fold is perhaps more threatening to the republic than any of the previous intramural disputes. Right-wing religious zealots, working in partnership with the secularists who have advised President Bush, are a threat to the most fundamental of American principles. The founders of our nation welcomed and planned for spirited debate over public policies, including the role of the judiciary. But as sons of the Enlightenment, they looked to found a republic in which the outcome of those debates would turn on reason and evidence, not on disputed religious dogma. They planned wisely for principles that are now under wide assault.
All Americans, of whatever religious or non-religious persuasion, need to be on the alert to preserve those principles. The burden falls especially heavily on the mainstream Christians who are slowly awakening to the gravity of the challenge facing them. Too long tolerant of their brethren, too much given to forgiveness rather than to confrontation, they need to mount a spirited, nationwide response to what constitutes a dangerous distortion of Christian truths and a frightening threat to the republic they love.
I emphasized "enlightenment". (BTW, by "intramural" I'm sure he speaks in terms of religion; I hope he doesn't think we're facing the challenges of the 1830s. Or maybe he does ...)
To some extent Gaston oversimplifies -- as do all editorialists. The Catholic church in America is not particularly right-wing, but a significant segment is effectively aligning with the descendants of their historic enemies (not the first time such odd alliances have formed!). There's no mention of the Mormons, but they have an uneasy theological relationship with Catholics and Protestants alike.
Reading the full article, by the way, it seems Gaston divides Americans into secular and Christian. I'm sure he knows better, but it's a common pattern. Sigh.
Americans have, unfortunately, already made up their minds. The 2004 elections were hard fought; only someone in deep denial could have imagined that a vote for Bush was not a vote for right-wing religious fundamentalism. Bush won a majority; our nation voted for the fundamentalists. It's a bit late for Bush voters to be having regrets. We who resist are a minority; we fight a rear-guard battle of delaying tactics, hoping that the majority will change their mind in 2006 and 2008. If Frist is president in 2008 we will truly become a Christian theocratic state, after which the Catholic-fundamentalist Protestant alliance will disintegrate in a very ugly way.
Has Bill Gates become a fundamentalist?
The New York Times > National > Microsoft Comes Under Fire for Reversal on Gay Rights Bill
Hmm. First Gates supports Bush over Gore in the 2000 elections -- though then it seemed to be over the Feds' antitrust win, which Bush pulled after he won. More recently the Gates Foundation supports the creationist agenda. Now Microsoft retreats from a Gay Rights Bill.
Yes, it could all be commercial calculation -- except the creationist funding. That changes the equation a bit. I wonder whether Gates and/or Ballmer have undergone a mid-life conversion of sorts.
The slow advancement of ambulatory medical practice
Medical Notes: TOC
Eons ago, when I created the personal notes I'm linking to above, I was a real physician (GP basically, usually called an FP nowadays). Now I work in healthcare IT, but I continue to do the CME (continuing medical education) required to retain my licensure -- though I wouldn't see real patients without some supervised retraining. It's been more than 10 years since I was a country doc.
I've just completed two days of the Minnesota Academy of Family Practice's CME program. As I reflect on my notes, it's easiest for me to compare what I read to what I practiced in 1994 (I've done lots of CME since then, but it's in day-to-day practice that book learning becomes knowledge). I'm struck by one remarkable observation -- things haven't changed all that much.
I'm not talking about out-of-date docs practicing the medicine of their residency days. The faculty are state-of-the-art academics, including many subspecialists, all introducing the latest best practices. It is, however, true that a course like this only samples a few subjects. We didn't cover HIV management, for example -- an area in which there's been great change. Like oncology, most HIV management is really a specialist's domain. Where FPs care for cancer and HIV patients, we are usually working someone else's plan.
Some of the more remarkable changes in the past 10 years are, in fact, retreats. We used to know how to manage the menopause (ERT); now we really don't. (Testosterone is popular now for female sexual dysfunction -- tell me that won't come to grief.) We used to try hard to identify reversible dementia -- no one talks much about that any more. Alzheimer's is part of aging -- we are all touched by it in some measure; there are no effective preventive interventions, no good treatments (yet), just good management approaches. PSA used to be wonderful; now it seems a bit gauche.
The major breakthrough, compared to 1994, was in the management of erectile dysfunction. That's a pleasure to hear about (I'm not being ironic, it's great to be able to do something about this age-old problem), but there wasn't much else in the same league. Type II diabetes management is finally catching up to what many of us suspected 10 years ago (insulin is a double-edged sword), but the changes are not revolutionary. The preventive cardiologists and endocrinologists want everyone on statins, but there's still some nervousness in the audience about effects on neuronal membranes. Sure -- Lipitor for the diabetic or the patient with known heart disease -- but do we want every American male with a waist over 40" on high-dose Lipitor?
Given my lack of practice and my aging brain, I suspect that 1994 JF with 1994 knowledge would do better on today's exams than I could. Yes, stroke management is somewhat changed and the old antibiotics don't work so well (be afraid -- they don't have easy replacements), but much of day-to-day healthcare seems to be changing more slowly than most people imagine. One big change was the source of much complaining -- no one likes their medical record computer system very much.