Friday, December 27, 2013

Closing Pascal’s Box

          Nearly four centuries ago Blaise Pascal laid out his famous wager in an attempt to prove that it is rational to believe in God. The argument is deceptively simple: either God exists or he does not; we have to “place a bet,” so to speak; and the only way we “win” the wager is if we bet for God, and God exists. Hidden in this attempt to justify theism is a rather complex use of probability and decision theory, voluntarism, pragmatism, and an often overlooked use of infinity.1 The argument had implications far beyond the Philosophy of Religion, and set the stage for the continuing debate on epistemic justification: how it is rational for us to form and hold beliefs. In effect, he opened Pandora's box on this topic.
          In this paper I will show that some of the last openings for Pascal's mode of thinking have been closed off in recent years. While there are clearly a great number of things that can be said about his argument, I will focus on voluntarism (forming beliefs at will) and pragmatic justification (basing beliefs on non-epistemic concerns). With these two features alone, we can remove any doubt that rational people can be epistemically justified in being convinced by the wager, and we can close Pascal's box.
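The decision-theoretic core of the wager can be sketched in a few lines of code. The payoff values below are illustrative assumptions, not Pascal's own figures; the only structural feature that matters is the one the paper discusses, the use of infinity: any nonzero probability multiplied by an infinite reward dominates every finite alternative.

```python
# A toy decision matrix for Pascal's wager. The payoffs are assumed
# for illustration; the infinite reward for a winning bet on God is
# the feature that drives the argument.
from math import inf

def expected_utility(p_god_exists, payoff_if_exists, payoff_if_not):
    """Standard expected utility over the two possible states."""
    return (p_god_exists * payoff_if_exists
            + (1 - p_god_exists) * payoff_if_not)

# Betting for God: infinite reward if God exists, small finite cost if not.
# Betting against God: finite payoffs either way.
for p in (0.5, 0.01, 0.000001):
    bet_for = expected_utility(p, inf, -1)
    bet_against = expected_utility(p, -1, 1)
    print(p, bet_for > bet_against)  # True for every nonzero p
```

However small the probability assigned to God's existence, the infinite payoff makes the expected utility of betting for God infinite, which is why the wager's critics focus on the legitimacy of that infinity rather than on the arithmetic.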

Thursday, December 19, 2013

Proposed Research Project

The following is a research proposal done as part of a class.  I may never do this project, and while my claims here are supported by the evidence that I found, this is far from a completed project, and so can't be taken as a solid argument.

Discussion of the Topic:
          The topic of this research project is the complex and seemingly contradictory histories of the Constitutionality of religious practices inside the United States Government. While the primary study will be on the federal level, some important state cases, like the 1927 Scopes trial, may also be examined. It may seem that until the latter half of the 20th Century there was a tendency for government to endorse religious beliefs and practices (Lewis 2002, 78-80). While that is compounded by the ceremonial and patriotic governmental mentions of “God” (Newdow v. Rio Linda 2010, 3877), it is a mistake to believe that the Constitution allowed any level of government to endorse religion, or to prefer religion over non-religious beliefs (Schauer 1996, 444).

Wednesday, December 18, 2013

Moral Pluralism and George's Job Search

          In this paper I will briefly lay out W. D. Ross' ethical theory, Moral Pluralism (MP), and apply it to Bernard Williams' “George the chemist” case in order to demonstrate how such a theory can resolve moral conflicts. Moral Pluralism differs from what could be called “moral monism” in that monism claims that there is a single principle that serves as an explanation of morality,1 but pluralism claims there is a collection of principles where none of them are more basic or fundamental than the rest.2 The major departure between pluralism and monism is that there is no single justification of morality, the parts of morality, or the plurality of the moral rules; the basic moral rules exist together, they cannot be derived from one another, and they are not grounded on some external principle.3
          This may seem less plausible than other theories because determinations in specific cases are usually deduced from a combination of the basic principle and the derived duties. Here, we have only a collection of duties with no principle to make use of in deciding cases. Ross did not find this problematic because he posited “prima facie” duties (PFD), a collection of basic moral duties, and what could be called “all-things-considered” duties (ATCD), the duties that are left after careful reflection on “one's duty proper … [or] one's actual duty.”4 Imagine a driver of a car on a snowy freeway. The driver has a large number of legal duties that apply to her at any given moment: the duty to keep the car in working order, to use turn signals before changing lanes, to drive at safe speeds, and to keep her vehicle under her control in all weather conditions. Failing at any one of these might count as being legally blameworthy. If an emergency vehicle should appear behind her with full lights and sirens, all the PFDs of driving are still there, but her ATCD becomes to move out of the way.

Wednesday, December 4, 2013

A Response to "Gay New York: Gender, Urban Culture, and the Gay Male World 1890-1940"

          George Chauncey's goal in Gay New York: Gender, Urban Culture, and the Gay Male World 1890-1940 was to bring into the public consciousness a forgotten segment of American History, the history of a culture that was “not supposed to have existed”.1 He confronts three widely held misconceptions about the homosexual culture of pre-World War II New York City, misconceptions that shaped the modern conception until the latter half of the 20th Century: that homosexuals lived in isolation from one another, that they were invisible to the public sphere, and that they internalized a self-loathing and other negative attitudes from the mainstream culture.2 His task is to show evidence that none of these was universally true.
          The means by which Chauncey sets about this task is to give the reader a glimpse into the “gay world” where modern readers might expect to find “the gay closet”.3 The gay world was a loosely connected web of social networks, each separable and distinct from one another,4 creating their own common folklore, unique linguistic style, and establishing their own (long-running) cultural festivals.5 In drawing the map of the “sexual topography” of NYC, he leaves behind the familiar lines that now separate heterosexual from homosexual, by illustrating the extent, uses, and intermixed character of both the physical and social spaces at the time.6

Tuesday, November 19, 2013

Jim's Predicament

          This paper compares Rule Consequentialism (RC) and Kantian ethics by examining their application to Bernard Williams' “Jim the botanist” thought experiment, a difficult moral case. The aim is to explore the relative strengths and weaknesses of these theories. By briefly showing the key differences in their approaches, it should become evident that, although both are flawed, the two rule-based systems are not equally capable of producing moral determinations.

          Jim is a foreigner captured by a government facing protests from the natives. In an attempt to quell the protests, the government has rounded up 20 random natives it plans to execute. Since it is apparently rare for a foreigner to be there, Jim is given the opportunity to save 19 lives, but he must personally kill one person. He has no reason to think that any of the natives is guilty of a capital offense, but if he does not kill one of them, all 20 of them will die. This forced choice is meant to demonstrate that there may be times at which we think it is acceptable to violate an absolute prohibition in order to prevent additional violations.1

Saturday, October 19, 2013

Folk Moral Objectivism

The following paper is a response to "Folk Moral Relativism".


     The article “Folk Moral Relativism” uses empirical means to argue that previous studies, which concluded that most people were moral objectivists, arrived at that conclusion by considering only judges from the same culture as the respondents; by expanding the study to include other cultures, the authors hoped to show that people hold a relativistic view of morality. However, while their studies may show that there is a common tendency to view cultures as holding different moral standards, the inequality of justifications across those standards suggests objectivism is the folk norm.

     Before examining the article, it would be useful to understand the meaning of Moral Relativism (MR), and Moral Objectivism (MO) in this context. MR claims that the correctness of any given moral action needs to be evaluated in the context of a culture, sometimes resulting in contradictory conclusions being correct.1 MO claims that there is only one truth about morality in a similar manner as there is only one truth about empirical claims.2 If there are two competing notions of the rightness of a given action, one “is surely mistaken”.3

     Previous studies show that a majority of people hold MO to be correct, but this study set out to demonstrate that those findings were skewed by methodology, and that the truth is far more complicated.4 The authors point out that there are external facts which have a bearing on the truth of any claim, as the seasons are relative to the hemisphere; likewise, considering moral claims with reference to other cultures leads to relativistic conclusions: the more extreme the difference in cultures, the greater the likelihood that neither stance is viewed as wrong.5

     The first study demonstrated this by surveying students from Baruch College, New York City, on the moral correctness of two actions: killing a child based on appearance, and testing the sharpness of a knife by random stabbings. The students evaluated conflicting opinions attributed to judges from wildly different cultures: two from their own culture (one specifically from their own immediate collective), one from an isolated warrior tribe, and one from an alien culture.6 The results showed that the closer the judges were to the students' own culture, the more likely the students were to say at least one of the conflicting opinions was wrong, but this became less true as they considered more distant cultures.7

Friday, September 6, 2013

Morality, Paperwork, and Exposing the Corruption of Power

     There is a fundamental flaw in the way most people talk about morality. The common assumption is that there are only two things to think about when talking about morality, a binary of right and wrong, good and evil. In fact, there are at least three main categories of actions (even in religious moral thinking): obligatory, restricted, and optional, with two sub-categories under optional: preferred and (for this post I'll say) discouraged. There is one additional category, the supererogatory, which covers all actions that are optional but go beyond obligation (“above and beyond the call of duty”). And there are three main things to consider about morality: the right (right, wrong, and optional actions), the good (what makes an action good or bad), and moral value (the value of a person, thing, or action).

     A good action is good because it is an obligation. A wrong action is wrong because it is restricted. Anything that is neither an obligation nor restricted is optional, but given the weight placed upon its moral value, an optional action may be preferred or discouraged. In fact, the value that we place on any given action may swing it out of the right or wrong category. Lying is wrong, but lying to save 1,200 Jews from the Holocaust is the very reason we consider Oskar Schindler a good person. His supererogatory act of lying to his own government and his own political party (for which he had been a spy), compounded again and again, was wrong, but the moral value that we place on life means that lying with the intent to save even one life, let alone 1,200, becomes at least a preferred option, if not an obligation.

Sunday, August 11, 2013

Budget concerns: Science versus Military R&D

The following was my side of a Facebook conversation that I have slightly edited into a single document. It was in response to a graph that compared the budget of the military to researchers and the 50-year NASA budget. (Click here to see the graph.)

Veteran Affairs

While I do agree that the combined Science budgets are ridiculously small and the Military spending is obscenely large, Veteran Affairs/Services should never be lumped in with the Military Budget. It directly benefits the civilian population, mainly by providing health care to Veterans with service-related injuries. Those Veterans would otherwise represent a major, continuing drain on any purely civilian insurance and health care provider. That is, of course, if the Veterans can even get coverage (which generally they can't) or will ever be able to afford that health care, given the average 10% drop in lifetime earnings associated with Military Service.

It also deceptively includes the GI Bill. The money moves through the VA's hands, but it didn't come from the government. The GI Bill is financed primarily by Active Duty Service Members who pay/buy into the system. It is made sustainable by the fact that most Veterans never collect. The GI Bill held the distinction of being the only Government fund to continually run a surplus, regardless of the economy. The Post-9/11 GI Bill has sped up the payout, but it is still a mostly self-sustaining system.

Thursday, June 20, 2013

Entrepreneurship

(This paper was written for a UMUC class I took in 2006, and is presented in that state, for whatever it is worth today.)

            While it may be true that life is what you make of it, finding out just what you want to make of your life can be the greatest challenge that a person could face.  For those like myself who go down to the bay and see "the tall ships," settling down and marking out one course seems like a foreign idea; however, at some point skills have to be accounted for and life experiences mustered so that one clear path can be found to take you over the next horizon.  Plotting my course through the rest of my life is full of many options and many interesting ports to drop anchor in for a while.  One such option continues to resurface time after time:  owning my own entrepreneurial enterprise.
            In this paper I hope to show how drive, determination, education and risk-taking can lead to a fulfilling life as a small business owner and entrepreneur.  I will examine what traits an entrepreneur must have, what major challenges small business owners will face, and how small businesses can use "global thinking."

Sunday, June 9, 2013

Embedded Journalists

(The following is a speech that I gave in 2010 in a college class.  Most of the audience was 18-25 years old, although there were several non-traditional students as well.)

          The topic I’m going to talk with you about might seem a bit dated. It was really brought to the public consciousness when most of you were around 12-15 years old, so it is something that many college-aged people basically grew up with. What I’m talking about is embedded journalists. To give everyone a little perspective, when I was twelve I watched the nightly news for months as Operation Desert Shield built up steam. I was awed by photos of the line of ships, one every mile from Norfolk, Virginia to the Middle East. 

          When it became Desert Storm, I watched in wonderment as our journalists, our war correspondents, picked apart every aspect of the operation they could get their hands on. Burned into my mind are the images of familiar faces sitting at impromptu news desks with the darkening twilight sky highlighting the flare of flak and the streams of anti-aircraft fire. I can remember the first time I saw a video of a missile flying through a window or a bomb dropping down a vent-shaft. Twelve years before that, some of you might remember being glued to the TV watching the Iran Hostage crisis, the Energy crisis, or maybe the Soviets' invasion of Afghanistan.

Monday, June 3, 2013

Culture: An Evolutionary Tale

          While the development of culture was essential to forming familial-like bonds that led to the creation of state-level societies, adaptability of the ruling body was the single biggest determining factor in the continuation of any ancient empire.

          Adaptability of societal structures has driven the progress of cultural evolution. From the harsh Paleolithic plains of Africa to the height of Roman civilization, the ability of a culture to adapt to both external and internal changes played a major role in its survivability. Unlike other social species, the human ruling class directs societies, at least in part. Out of self-interest and the special levels of social intelligence found only in great apes, the leading individuals have guided the great cultures into cataclysmic battles for not only their own survival, but also the very culture they shape. Through the horrors of war, human societies found either survival in glorious victory, or they disappeared into the pages of antiquity.

          The duality of culture and warfare seems at odds, although the two often walk the same path. Cultural views, like those of some religions, sometimes condemn violence, and the dogs of war starve without cultural backing and a leash. Still, the greatest pinnacles of human progress have come at the end of both sword and pen. However, the greatest falls came from either the tip of a hired blade or a sip of venomous wine.

          Before the great ancient states fell, they first caused numerous other, lesser states to fall. The heads of those states failed to see the approaching storm and devise a stratagem to save their own necks. So, it may be tempting to conclude that adaptability in warfare is the paramount factor in the survival of a society, but without careful examination, that conclusion would be premature.

Monday, May 27, 2013

A Long, Cold Walk

          I once took a very long walk in Japan on one cold winter's evening. The clouds slumped low over the rooftops of the small towns tucked away in the tight little valleys ringed by slumbering, leafless trees. I was trying to go somewhere for the first time, by a train route I had never seen, far enough away from the cold, rainy Tokyo sprawl that, little by little, the English characters that wayward Western travelers use as beacons of hope in a kanji world had all but disappeared.

          As the train cut through the crisp air, the saggy clouds gradually revealed that a skyline of gnarled branches had completely replaced the semi-urban one most gaijin only see and come to think of as all Japan is. With a melodic hum, the train came to rest at a stop that the modern world had forgotten. A dim, unshielded bulb flickered on and off as the gray clouds gathered here and dispersed there, tricking the electronic eye into thinking the dying day had sighed its last breath. 

          The old wooden station sign's paint had long since started peeling away making the shallow shadow cast by the confused light bulb the only means to distinguish the nearly-ancient characters. Was it my stop or not? I couldn't tell, but the crackling speaker began to chirp out the happy little local melody that told me I had about 10 seconds to decide. 

          In Japan, it is usually easier to wait for the next train than it is to try to catch one back to a station once passed. So I stepped out. As the electric song of the train's engines sounded out the departure of the heated safety of civilization, the cold wind whipped my face red in a matter of seconds and sucked my warm breath up and away in a swirling mass destined to find its own way in the monochromatic sky. 

          "When you get to the station, use the west exit and turn left at the first road. Keep walking until you come to a Kōban (police box) and turn right. Walk for about five minutes and you'll be here," were the directions I had been given over the phone, confirmed, and then memorized several hours before, and I dutifully followed them. But instead of a warm home to enjoy a pleasant evening, beyond the kōban I found endless snow-covered fields ringed by slumbering, leafless trees.

The Distant Shores of Memory

          In a land on far-distant shores, in a place the locals call Nippon, lies a small rocky beach sprinkled with black sand dancing in salty pools. In the short distance where the endless horizons of the ocean kiss the ground, a tiny fishing village, untouched by the passing of time, sits just out of reach of the quickly marching masses of millions of Japanese men and women in Tokyo, always moving to the future.

          Standing on the brine-covered beach, a time-weathered fisherman quietly hums an old festival tune to the soft purring of his long-time angling partner, Neiko, a gold and white striped tabby with fur matted by the pungent tailing of his last meal. With a flick of the wrist and a swish of the line, the old fisherman casts off the rough rope securing his small boat keel-skyward in the black sand. As Neiko jumps clear with a decidedly annoyed yowl, the man feels the dry wood of the hull in his hands and the planks smoothed by years of use, but deep in his heart the man knows that the boat, like himself, isn't quite ready for the fire-pit.

          The grinding of sand on wood temporarily drowns out the rhythmic cracking of the waves on the rocks as the man hauls the ship knee deep into the water warmed by the noon-day sun over the shallow bay. The deep scraping in turn gives way to the slow lapping of water and wood, a sound of perfect harmony for those who go down to the sea.

          Through the foggy glass of memory, the man and his cat could see deep into his own past, watching his father and his father before him set sail on the fertile waters. He knew from the generational stories he heard from his mother’s lap, this was the way of his family, his people, and from a time long since forgotten, his countrymen as well. Despite the rapidly changing world around his tiny beach, far from the reach of millions of Tokyo men and women always marching to the future, Neiko, the old fisherman and his boat remembered the old ways from ages past.

Monday, May 20, 2013

Smoke and Water

Fading Memories by James Zike
Written June 2008 

         First watch… Always the first watch in the middle of the night, in the middle of the ocean with nowhere to go and almost nothing to do. Well that’s a lie; there was a huge amount of work to do but not anything fun. At least the stark fluorescent lights help all the pasty night-checkers forget the nothingness of the ocean at night just outside every hatch and doorway. Such was my life on board the aircraft carrier USS Kitty Hawk.

          She was a fine ship back when the shipbuilders laid her keel, but forty odd years later, time and the tides had taken their toll on her. Rust hung over the haze gray hull like streamers around every vent-port and catwalk. Tendrils of corrosion slowly crept their way over her once impressive face, but that’s the way of these things, I suppose. One day she’s a proud and able icon of American strength and determination, the next a relic of ages overstaying their welcome.

          Of course, back in spring of 1999 no one could have even imagined that the passing of two other icons would keep the Hawk around. I certainly wouldn't have guessed it. In fact, I would have guaranteed that the Hawk would not be cruising back to port, and that I'd be going home with wet feet by night's end. However, at midnight, affectionately called “balls” by most sailors thanks to the “00:00” time entry on the deck log, I was still more concerned with playing paperboy.

Saturday, May 18, 2013

Truth Until Relativity: Is Cultural Relativism Applicable Today?

Note to the reader: this paper was written in 2009 and is not a very philosophic review of Cultural Relativity. Nonetheless, I think it is interesting enough to revisit.

Philosopher: When is truth false?
Anthropologist: When the belief of truth is relative.

          The 19th century anthropological theory of Cultural Relativity has swayed a great deal of thought and research, despite its sweeping philosophical implications. The idea that truth is relative flies in the face of philosophical reason, but it still may provide a neutral viewpoint for anthropologists studying cultures that do not share common morals. As humanity enters the 21st century, the great debate between the two disciplines continues, and even after 200 years of deliberation, Cultural Relativism may have a valid place in academia today if restated in such a way as to resolve its philosophical issues.

          In an article for The Stanford Encyclopedia of Philosophy, Chris Swoyer states that Relativism is a group of theories that include a base notion that many aspects of truth-bearing statements are dependent concepts “relative to language, culture, or biological makeup” (Swoyer). Swoyer goes on to say culture may significantly alter the way a person perceives and interprets information.

Tuesday, May 14, 2013

Cause and Effect: Humean Doubts and Kantian Answers


          To understand David Hume's criticism of the idea that we can know, in a robust and philosophical way, that there is a connection between what we call causes and effects, we must first examine how he thought our minds related to the world. Unlike the Rationalists who came before him, Hume was skeptical that reason and intuition were all that we needed for knowledge. For Hume, the first contact that we have with any object (if it exists at all) is the appearance the object presents to our senses, so that the first thing that we are aware of is an impression that we have.1 Humean impressions are not simply limited to our sensual perceptions of the potential objects around us, but are also of every possible thing that we might experience, including our own internal mental processes, like emotion-states and first-order desires. Impressions are not just what we see, feel, taste, etc., but how we feel, what we want, and what motivates us. In short, impressions are the way that we first experience everything.

          From those impressions, content is directly added into the mind and forms ideas that share the same content. Hume thought this was a matter of common sense, anyone could see that while reflecting upon the painful experience of touching a burning hot object, we almost feel the same pain, but with less force than if we were actually touching something hot.2 To Hume, the impression had a strength to it that could never be matched by a mere idea, but impressions only differed from ideas in strength; the content was copied directly into the idea exactly as it was in the impression.3

          Once a series of impressions has formed a series of ideas, the imagination tends to form perceived connections between the ideas, and those connections can be evaluated according to any number of relationships they bear to each other.4 The comparative work is not a function of perception, but of imagination, and as such, it cannot involve working with impressions, but only with ideas. Of all possible relations, Hume thought that they fell into seven broad groups: resemblance, identity, space and time, quantity, quality, contrariety, and cause and effect.5 From the comparison of ideas, the imagination then sorts the ideas in a way that allows us to make sense of the impressions we receive.

Friday, May 10, 2013

The Breath of the Stoic God


     To readers from Christianized cultures, the claim that the divine is a physical being which directly, causally interacts with the matter of the world can seem odd. Church traditions are drawn from Jewish beliefs, mixed with Platonic and Aristotelian conceptions of the divine as immaterial. It might cause those familiar with that line of thought to see the Stoic conceptions as weaker than the “self-existing, eternal, supremely good being, separate from and independent of the world, all-powerful, all-knowing, … creator of the universe,” which became the monotheistic God.1 However, the pantheistic conception of the Stoic God aligns with the Stoics' strict materialism, completely avoiding the host of issues packed into supernatural concepts, and is strengthened by the appeal that God is a part of the physical processes of the universe—a direct link in the causal chain of events.

     One would be remiss to not point out that the philosophic reason for seeking the truth behind the concept of the divine was different in the ancient world. Modern theists attempt to show that their concept of God is a necessary part of the world through a series of arguments meant to demonstrate and justify their conclusion. This was not the goal of ancient religious philosophers. Each school sought its own first principle, the explanatory force that caused the empirically observable world around them (with at least one exception of the Epicureans, who did not associate the first principle with the divine). They did not start with a religious book and then try to justify that position, but rather the trend was to start from the functioning world around them and ask the question, “what sort of thing could have caused this world?”

     While this decidedly teleological approach is similar to William Paley's watch, it does not mean to show that the universe is, or is like, a purpose-built machine, constructed one cog or gear at a time by a great watchmaker.2 The Stoics did not presuppose a conception of the divine as modern design arguments tend to do, but started with the four elements, a common belief in all schools of philosophy at the time, and sought to show how the world that actually is could become organized using them. Diogenes Laertius traced out the process whereby the two principles in the universe, the active and the passive, combined in a “seminal fluid” in the form of “water via air” in a way that reorganized matter into the four elements, which he called the “spermatic principle of the cosmos”.3 Laertius uses this principle, along with his concept of God, Zeus, mind, and fate being the same, part of the active principle, to explain how the world becomes organized, like a biological process in which the active changes the passive into a new form like itself, which he calls “an animal, rational and alive and intelligent … in the sense that it is a substance which is alive and capable of sense-perception”.4

Thursday, May 9, 2013

The Heart of Men and the Direction of God

“The PREPARATIONS of the heart in man, and the answer of the tongue, is from the Lord. … The Lord had made all things for himself: yea, even the wicked for the day of evil. ... A man's heart deviseth his way: but the Lord directeth his steps.”1

There may be no more enduring philosophic problem than that of human free will. We seem to think that we are ultimately morally responsible for what we do, and yet by way of excuses we can find reasons why one might not be responsible. Some of those reasons seem to be so strong that they are either taken as Gospel or appear to be metaphysical facts of the universe.

It might be a fool's errand to try to fix a date to the beginning of the free will debate, but surely the threads of the debate can be found in ancient texts, both religious and philosophic. For instance, Plato held that the will of a person comes from the rational portion of the three-part soul, and is properly used to keep base desires in check.2 Essentially, in this view, as long as reason governed desires, a person was acting freely.

Wednesday, May 8, 2013

Forgetting The Dead


(A note to the reader: this paper was written from sources that were made available in class, and as such, not all of the sources are publicly accessible, and I did not have full information to construct a proper bibliography.  Where possible, I've tried to include links to comparable sources.)

fallen autumn leaves 
memories of days that passed 
wait for winter snow 
(Lindsay Zike, personal collection, 2013)

While no event at any point in history is so fully documented as to know every possible fact, the twentieth century introduced several key innovations that changed the way people interact with memory and history. Photo albums, rolls of film, video collections, and scrapbooks around the world hold a greater number of clues to personal and collective memories than at any previous age. Using those kinds of primary sources, a group of University of Illinois Historians and students spent a year “explor[ing] 'the fate of the twentieth century'” by casting a wide net that pulled in the perspectives of the famous victors and the defeated poor with no special regard for race or political associations, which culminated in Imagining the Twentieth Century, a “frankly unauthoritative history” (Charles C. Stewart and Peter Fritzsche, pg. viii). Despite the broad scope of the project, the photographs and essays that dug deep into likely forgotten collective memories still missed several major events.

Thursday, March 21, 2013

On Fictions


By James Zike
8 March 2013

A Treatise of Human Nature
or
Project Gutenberg ebook
As modern readers, we often bring with us a large amount of intellectual baggage that tends to cloud our judgments.  At times, it can be difficult to separate modern usage and meanings of words from historic works that use those terms in ways we are no longer accustomed to.  When David Hume uses the term “fiction”, it may be tempting to import our concept of fiction as fake, false, or untrue into a work that does not support that usage. 
Hume demonstrated that certain common understandings about objects in the world are not correct and amount to nothing more than “fictional” accounts, but with his ontological statements, he did not intend to answer any metaphysical questions.  That is to say, that Hume’s use of the term “fiction” did not imply falsity or impossibility.  A careful understanding of how he organized his system of knowledge based on empirical means might place his fictions back into the realm of metaphysical possibilities in a way that preserves how he used them throughout his works.

Thursday, February 28, 2013

The Memory of Fear: Scarcity and the Consumer Culture


            The fear of scarcity during the Great Depression gave rise to diverse social changes that drove American progress in new directions.  A child growing up in the middle of the Depression may not have been aware of the hardships and difficulties their own parents faced, hardships which would have seemed much harsher given that memories of the Roaring 20s and the pre-Great War economic boom were still fresh in adult minds.  Events that adults use to mark the passage of their lives are not as easily accessible to children because “general and historical conceptions play only a secondary role [in memory]: they actually presuppose the prior and autonomous existence of the personal memory”, and a child’s personal memories are limited in both scope and the meaning of events (Maurice Halbwachs, "Historical Memory and Collective Memory", pg. 58-59).

A Field for Three

Standing in a Wisconsin field, 12-year-old Vera Whaley (later Zike) likely had no point of reference for the general suffering of 1937 as she posed for a photo clutching what would become her most prized possessions: two dolls named Maize and Patsy.  Her beaming smile tells a story that almost seems to conflict with the well-trodden path of the historic narrative; her picture speaks of the happiness of a child.  “Parents and children each have their own interests,” and the world of adults is to a child “an unknown land” (Halbwachs, pg. 62). 
            However, the adult world of the time doubtlessly caught up with her parents, as economic hardships forced her father, a horse trainer, to move the family from her birth state of Iowa and later drift multiple times between Wisconsin and Michigan in the 1930s in search of work—a common Depression-era story.  Had she not had access to “ready-made reference points” from history and collective memories external to her own perceptions, she would only have recalled the entirety of the Depression in child-like terms and “images of lesser events” (Halbwachs, pg. 58-59).  The personal memories she relayed to her own children were of the dissatisfaction of moving to new towns with new schools, forced to leave old friends behind; but one could hardly say that she remained unaware of the greater historic significance, because she had reference points to fix her personal memories to collective and historical memories.