Monday, July 15, 2013

Arguments for the Existence of God: Preliminary Issues

 

The following series is intended to lay out some helpful arguments to demonstrate the existence of God. Philosophically constructed arguments as such are not the basis of my belief in Yahweh, the God of the Old and New Testaments, for he’s revealed clearly both in creation and in human nature (cf. Romans 1). The whole world is evidence for his existence. Nevertheless, I’ve attempted to take that evidence and reformulate it into helpful arguments that can be used to bolster the faith of Christians and refute those who contradict.

But before we delve straight into the arguments themselves we should note two things: the centrality of worldviews and the impossibility of neutrality in holding and formulating worldviews.

The importance of worldviews. The notion of worldview is key here. Technically speaking, the term worldview is looser than the term philosophy, but the overlap between the two is great. Here’s my working definition:

A worldview is a spoken (or unspoken), more (or less) consistent, often assumed, though rarely articulated, comprehensive vision of life. Here’s a more philosophical definition: a worldview is a network of guiding assumptions regarding the nature of reality (i.e., metaphysics), knowledge and truth (i.e., epistemology), what we should value (i.e., value theory), and how we should live our lives (i.e., ethics).

Given the definitions above, we all have a worldview. And, more importantly, we should each develop our worldview. Since everyone thinks “worldviewishly,” the least we can do is do it well. Likewise, we should strive to be more self-conscious about our worldview development. Too often (and I’m the first to admit this about myself) we passively soak up bits and pieces of the worldview of the surrounding culture.

The problem of neutrality. We all have views regarding the most important issues of life (What’s real? How and what do I know? How should I live? What is valuable?); to deny this is naïve. Now, let me clarify for a second. I’m not saying that we have views on every single thing. Personally, I have no views on string theory or the status of quarks, so if someone tries to persuade me of their views on those matters, it’s fairly easy. But a worldview, the lens through which we integrate our entire lives, is something very different. No one is either neutral or objective. I also reject the modernist and Enlightenment notion of objectivity. None of us has a “God’s-eye view” of reality. We’re always firmly planted in our historical contexts, with their biases (whether helpful or harmful) and various ways of seeing things.

Now, one might be tempted to think that I’ve opened the door wide for relativism, but I don’t think that’s the case. When I reject the notion of objectivity, I’m not saying we can’t know truths that exist independently of our opinions. I do believe we can have such knowledge. What I reject as philosophically naïve is the notion that we can come to weighty matters without preconceptions, without prejudice, and with the ‘cool detachment’ of Reason (notice the capital R).

Monday, April 08, 2013


The Centers for Disease Control and Prevention has sent out a warning to hospitals about an antibiotic-resistant bacteria, carbapenem-resistant Enterobacteriaceae, or CRE. While this strain of bacteria is not new, it has become more common over the last decade and is now prevalent enough to warrant a higher level of concern.

It's worth backing up for a second to discuss what all of this means. We use antibiotics to treat bacterial infections. When we first started developing antibiotics, such infections were easier to cure. But over time, the bacteria evolved. They developed the ability to fight the antibiotics that we use. They pass on this ability to resist treatment to bacteria that follow. Over time, we are often forced to develop new antibiotics to beat infections that were previously treated easily.
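The selection dynamic described above can be sketched in a toy simulation. This is illustrative only: the survival probabilities and population sizes below are invented for the sketch, not epidemiological data.

```python
import random

def treat(population, survival_susceptible=0.05, survival_resistant=0.9):
    """Apply one round of antibiotic treatment.

    Each bacterium is 'R' (resistant) or 'S' (susceptible); survivors
    are drawn with the given per-cell survival probabilities.
    """
    survivors = []
    for cell in population:
        p = survival_resistant if cell == "R" else survival_susceptible
        if random.random() < p:
            survivors.append(cell)
    return survivors

def regrow(population, target_size):
    """Survivors multiply back up to the original size, children copying parents."""
    return [random.choice(population) for _ in range(target_size)]

random.seed(42)
pop = ["R"] * 10 + ["S"] * 990  # start: only 1% of the population is resistant

for round_number in range(1, 4):
    pop = regrow(treat(pop), 1000)
    fraction = pop.count("R") / len(pop)
    print(f"after treatment round {round_number}: {fraction:.0%} resistant")
```

Even though resistant cells start as a tiny minority, each round of treatment kills off mostly susceptible cells, so within a few rounds the population is dominated by resistant bacteria — the antibiotic itself does the selecting.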
 
Hospitals need to take action against the spread of a deadly, antibiotic-resistant strain of bacteria, says the Centers for Disease Control and Prevention. The bacteria kill up to half of patients who are infected.

The bacteria, called carbapenem-resistant Enterobacteriaceae or CRE, have increased over the past decade and grown resistant to even the most powerful antibiotics, according to the CDC. In the first half of 2012, 200 health care facilities treated patients infected with CRE.

"CRE are nightmare bacteria," CDC director Dr. Tom Frieden said in a statement. "Our strongest antibiotics don't work and patients are left with potentially untreatable infections. Doctors, hospital leaders and public health must work together now to implement CDC's 'detect and protect' strategy and stop these infections from spreading."

That strategy includes making sure proper hand hygiene policies in health care facilities are actually followed.

Patients should also be screened for CREs, according to the CDC. Infected patients should be isolated, or grouped together to limit exposures.

The good news is that CRE is still seen relatively infrequently in most U.S. facilities, and current surveillance systems haven't found it to be common in otherwise healthy people in the community, says Dr. Alex Kallen, a CDC medical officer.

"Of course, if this were to (spread to the community), it would make it much more difficult to control," he said.

Each year, hospital-acquired infections sicken about 1.7 million people and kill 99,000 in the United States. While up to 50% of patients with CRE bloodstream infections die, similar antibiotic-susceptible bacteria kill about 20% of patients with bloodstream infections.

This is what has happened here with CRE. Over time, these bacteria have become harder and harder to treat. The old antibiotics don't work as well. In this case, CRE infections kill about half of patients who have bloodstream infections, more than twice the death rate of similar infections with antibiotic-susceptible strains.
 
Right now, CRE are of concern only to certain susceptible patients in the hospital. They're not common in the community, and most of the warnings are directed at hospitals, imploring them to take precautions to isolate patients and prevent spread in the inpatient setting.

The nightmare scenario, though, is that these bacteria will get out into the community.

This isn't fear-mongering. Years ago, Staphylococcus aureus infections were also relatively easy to treat. Over time, though, a strain of bacteria, known as Methicillin-resistant Staphylococcus aureus, or MRSA, became a problem in hospitals. The CDC issued warnings to hospitals to take precautions to prevent its spread. Over time, though, it got out into the community.

A 2008 study of children who came into an emergency department with skin abscesses, or infections, found that about 75% of them were caused by MRSA. Luckily, we still have medications, such as trimethoprim/sulfamethoxazole, to treat these infections. When that fails, though, things will become even more concerning.

Put another way, when I was training, we would have almost never considered MRSA as the cause of a skin infection. These days, though, we pretty much assume it's the cause, and treat with stronger drugs.

Most people believe that the injudicious use of antibiotics is to blame for these developments. Every time we use antibiotics, we give bacteria a chance to evolve. We kill off those susceptible to the drugs and leave those that have developed resistance. Each time we use antibiotics unnecessarily, say to treat a virus, we make the problem worse. Each time we use them improperly, or for too short a period of time, we do the same. These days, we're putting them in everything, from soap, to lotion, to the food that animals eat.

This is a real public health issue. Creating more resistant strains is a serious long-term problem. The new warning is panicking a lot of people, but for the wrong reasons. You're very, very unlikely to get a CRE infection anytime in the near future. It's important that hospitals work to prevent that problem from getting worse, but almost everyone reading about it this week will be unaffected by it.

It's much, much more likely, though, that these same people will ask for antibiotics when they get a cold. That's the kind of thing that will lead to future problems. That's the kind of thing we need to stop now.

Editor's note: Dr. Aaron E. Carroll is an associate professor of pediatrics at the Indiana University School of Medicine and the director of the university's Center for Health Policy and Professionalism Research. He blogs about health policy at The Incidental Economist and tweets at @aaronecarroll.


Saturday, March 23, 2013

Earth Hour Is a Colossal Waste of Time—and Energy


Plus it ignores how electricity has been a boon for humanity.

On the evening of March 23, 1.3 billion people will go without light at 8:30—and at 9:30, and at 10:30, and for the rest of the night—just like every other night of the year. With no access to electricity, darkness after sunset is a constant reality for these people.

At the same time, another 1 billion people will participate in “Earth Hour” by turning off their lights from 8:30-9:30.

The organizers say that they are providing a way to demonstrate one’s desire to “do something” about global warming. But the reality is that Earth Hour teaches all the wrong lessons, and it actually increases CO2 emissions. Its vain symbolism reveals exactly what is wrong with today’s feel-good environmentalism.

Earth Hour teaches us that tackling global warming is easy. Yet, by switching off the lights, all we are doing is making it harder to see.

Notice that you have not been asked to switch off anything really inconvenient, like your heating or air-conditioning, television, computer, mobile phone, or any of the myriad technologies that depend on affordable, plentiful electricity and make modern life possible. If switching off the lights for one hour per year really were beneficial, why would we not do it for the other 8,759?

Hypothetically, switching off the lights for an hour would cut CO2 emissions from power plants around the world. But, even if everyone in the entire world cut all residential lighting, and this translated entirely into CO2 reduction, it would be the equivalent of China pausing its CO2 emissions for less than four minutes. In fact, Earth Hour will cause emissions to increase.

As the United Kingdom’s National Grid operators have found, a small decline in electricity consumption does not translate into less energy being pumped into the grid, and therefore will not reduce emissions. Moreover, during Earth Hour, any significant drop in electricity demand will entail a reduction in CO2 emissions during the hour, but it will be offset by the surge from firing up coal or gas stations to restore electricity supplies afterward.

And the cozy candles that many participants will light, which seem so natural and environmentally friendly, are still fossil fuels—and almost 100 times less efficient than incandescent light bulbs. Using one candle for each switched-off bulb cancels out even the theoretical CO2 reduction; using two candles means that you emit more CO2.
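The candle claim is easy to check with back-of-envelope arithmetic. All of the figures below are rough assumptions chosen for illustration (a 40 W incandescent bulb, a typical grid carbon intensity, a paraffin candle burning about 7 g per hour), not numbers from the article:

```python
# Back-of-envelope comparison: one switched-off bulb vs. one lit candle, for one hour.
BULB_WATTS = 40                   # assumed incandescent bulb
GRID_KG_CO2_PER_KWH = 0.5         # assumed grid carbon intensity
CANDLE_PARAFFIN_G_PER_HOUR = 7    # assumed paraffin burned per candle-hour
KG_CO2_PER_KG_PARAFFIN = 3.1      # rough combustion stoichiometry for paraffin

# CO2 avoided by switching the bulb off for one hour
bulb_kg_co2 = (BULB_WATTS / 1000) * GRID_KG_CO2_PER_KWH

# CO2 emitted by burning one candle for that hour
candle_kg_co2 = (CANDLE_PARAFFIN_G_PER_HOUR / 1000) * KG_CO2_PER_KG_PARAFFIN

print(f"bulb saved:  {bulb_kg_co2 * 1000:.0f} g CO2")      # prints 20
print(f"one candle:  {candle_kg_co2 * 1000:.0f} g CO2")    # prints 22
print(f"two candles: {2 * candle_kg_co2 * 1000:.0f} g CO2")
```

Under these assumptions a single candle already emits about as much CO2 as the switched-off bulb would have caused, and two candles emit more — which is the article’s point, even before accounting for the candle’s far dimmer light.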

Electricity has given humanity huge benefits. Almost 3 billion people still burn dung, twigs, and other traditional fuels indoors to cook and keep warm, generating noxious fumes that kill an estimated 2 million people each year, mostly women and children. Likewise, just 100 years ago, the average American family spent six hours each week during cold months shoveling six tons of coal into the furnace (not to mention cleaning the coal dust from carpets, furniture, curtains, and bedclothes). In the developed world today, electric stoves and heaters have banished indoor air pollution.

Similarly, electricity has allowed us to mechanize much of our world, ending most backbreaking work. The washing machine liberated women from spending endless hours carrying water and beating clothing on scrub boards. The refrigerator made it possible for almost everyone to eat more fruits and vegetables, and to stop eating rotten food, which is the main reason why the most prevalent cancer for men in the United States in 1930, stomach cancer, is the least prevalent now.

Electricity has allowed us to irrigate fields and synthesize fertilizer from air. The light that it powers has enabled us to have active, productive lives past sunset. The electricity that people in rich countries consume is, on average, equivalent to the energy of 56 servants helping them. Even people in Sub-Saharan Africa have electricity equivalent to about three servants. They need more of it, not less.

This is relevant not only for the world’s poor. Because of rising energy prices from green subsidies, 800,000 German households can no longer pay their electricity bills. In the United Kingdom, there are now more than 5 million fuel-poor people, and the country’s electricity regulator now publicly worries that environmental targets could lead to blackouts in less than nine months.

Today, we produce only a small fraction of the energy that we need from solar and wind—0.7 percent from wind and just 0.1 percent from solar. These technologies currently are too expensive. They are also unreliable (we still have no idea what to do when the wind is not blowing). Even with optimistic assumptions, the International Energy Agency estimates that, by 2035, we will produce just 2.4 percent of our energy from wind and 0.8 percent from solar.

To green the world’s energy, we should abandon the old-fashioned policy of subsidizing unreliable solar and wind—a policy that has failed for 20 years, and that will fail for the next 22. Instead, we should focus on inventing new, more efficient green technologies to outcompete fossil fuels.

If we really want a sustainable future for all of humanity and our planet, we shouldn’t plunge ourselves back into darkness. Tackling climate change by turning off the lights and eating dinner by candlelight smacks of the “let them eat cake” approach to the world’s problems that appeals only to well-electrified, comfortable elites.

Focusing on green R&D might not feel as good as participating in a global gabfest with flashlights and good intentions, but it is a much brighter idea.

Monday, March 04, 2013

Smartphone Bacteria


If you're a squeamish clean freak, then you may feel like not using your smartphone after reading this. It may look sleek and shiny, but your smartphone is actually crawling with bacteria and germs. While some of these bacteria may be harmless, some can actually be harmful to your body. 

A professor at the University of Surrey, Dr. Simon Park, had his students press their smartphones onto petri dishes filled with a medium that supports bacterial growth, then let the bacteria grow for a few days. Afterwards, the dishes were blooming with various kinds of bacteria and fungi, with most of the growth clustered where the top and bottom of each phone had touched the dish; you could trace the shape of the smartphone in the outline of the bacteria.

According to Dr. Park, these bacteria are also a record of your personal habits and hygiene. "It harbors a history of our personal and physical contacts such as other people, soil," he wrote on his blog. Some people even bring their smartphones into the washroom and set them down on its surfaces, which encourages the growth of coliform bacteria.

To clean your smartphone properly, first switch off the phone completely or remove the battery. Then use special screen cleaners and a dry cloth to wipe down the phone. If you don't want to spend so much on cleaners, you can use a diluted mix of water and alcohol. 

Whatever you do, do not dunk the entire phone into water. You can also use sticky tape to remove grit and dust stuck between keys or in tiny nooks and crannies. Wait for the phone to dry completely before switching it on or putting the battery back in.