Monthly Archives: March 2014



It’s strange to think that I have had a million thoughts endlessly provoking me for months, which prompted me to write “Mediations” just to get them out of my head, and now, with a few posts under my belt, I’ve got nothing. NOTHING! And so, I go back to a fundamental principle of writing and begin just writing something…anything. Free form, I believe, is the technical term. It’s not working.

And then, my eldest daughter, fourteen, walked in the door and, without a word, reminded me about legacy. You see, she is preparing to go to prom. I’m not ready. How on earth did she grow up so fast? It seems as if it were just yesterday I was putting frilly bonnets upon her head, or that, when I tucked her in for the night, she told me all of her “secrets” for the day. I am no longer privy to these things, but they were tender and sweet moments I cherish, especially when she still looks at me with those same eyes every now and again.

It was this strong emotion and difficulty in realizing that I must, for now, let that little girl go and embrace the strong, independent woman she is becoming that reminded me of a conversation I once had with someone else very close to me about legacy, what it means to me and how it relates right back to my children—all of them (I have three).

Presently, the simplest, most watered-down definition of the term legacy is either a monetary gift or bequest of personal property, or else something (or some action) that a person from the past left behind. In my mind the two senses are quite similar; however, I’m not compelled to look deeper into their differences or discuss their similarities because, frankly, the latter is more in line with my thinking here, mainly because it is how we, as a society, seem to define legacy, usually on a grander scale.

Most of us, including me, will never do anything so great as to leave a large enough legacy behind for anyone to stand up and take notice. But, sometimes, it’s as simple and naked as an unvarnished rocking chair—my great-grandmother’s for example.

My mother rocked me in it as a baby, I climbed on it, and played with it (dolls were often given time outs in it, especially by my little sister, who would tell them to “sit up correctly”). It was solid wood with extremely worn varnish and made a soft, quiet, and soothing creak as it rocked. The arms were so loose they often came unhinged. And it wasn’t particularly comfortable to sit in for long periods of time. At some point the chair was taken apart, sanded down, fixed, and stained a much prettier, darker cherry color, and eventually it was passed down to me. I rocked my first child in it until it rocked no more. Five generations connected by one chair until we finally had to put it out of its misery.

On a broader scale, the legacy of our great-grandparents wasn’t just in a chair or some piece of furniture they passed down, it was in the actions they took in everyday life that, as a consequence, gave us a life free from the tedious labor they endured with little to no complaint (so I’m told—I surely don’t know this firsthand).

They were immigrants who started their lives in this country as pioneers and then farmers; many of the wives were teachers, and, in some cases, others later became merchants. As the generations matured, we became political scientists, lawyers, professors, and even politicians. Further down the line we became activists, designers, musicians, writers and yet still more teachers. But I, and plenty of our generation, never had to work nearly as hard, never had to learn the differences among various chemicals to combat rampant weeds, or plant, plow and harvest a field, or, in the case of one set of my great-grandparents, light up and watch a cash crop burn.

Don’t get me wrong. I am NOT saying there is anything wrong with being a farmer. It is good, honest, hard work, even more so now with the whole “farm to fork” movement (in my humble opinion). What I am saying is that I am grateful for the ones who came before me, who worked the land and taught the children, who left a legacy profound enough that I do not have to do physical labor. Because, frankly, I am not built for or cut out to do hard, physical labor. The point being that even in the smallest advancement of future generations a legacy is left behind, whether good or bad, even if we are not the Martin Luther Kings of the world and whether or not we actively choose to acknowledge it. It’s there, and its existence is evidenced in any progressive succession we make.

I often wonder, however, whether it was what they intended all along, or whether they just did what they had to do in that moment and era to survive, and in doing so indirectly and consequentially contributed to forward progression, or at the very least provided more options for the next generation to choose from. Did they have time to ponder, to dream, about where their children or their children’s children would end up because of a choice made twenty, thirty, fifty or more years ago?

My quest to leave some type of legacy has been pursued with intent and purpose since I was quite young. It began in my early teens and, I suspect, was the product of having dealt with death and dying. But, as in Jerry Maguire, “let’s not tell each other our sad stories.” At about the same age my eldest daughter is now, I became very interested in who my parents were when they were “young,” because, let’s face it, when you’re a kid, your parents are always old; that is, until you look back at their lives and see photographs of their past. In addition to seeing photographs, I also wanted to hear stories about the things they did (i.e., how they got into trouble) and what they imagined.

Knowing I was interested in such things, even at such a young age, I figured that someday my own children would be curious about my life and so I began collecting evidence of and chronicling my youth in photographs. I even have a few pieces of clothing from the past that I now wonder why I bothered to hold on to—they are no longer fashionable and aren’t even retro chic, they’re just old, and are collecting dust, but, once again, as is customary with me, I digress.

As I matured and began to have children of my own, legacy began to morph a bit. Rather than just tangible evidence preserving my youth, it became more about building character (including my own), providing a better life for them, and sharing whatever wisdom I had gained so they could be better people and live a better life than I have. It was, of course, also a little bit about leaving some type of bequest or personal property. That is, a decent piece of real estate they can live in (something with character, and a freakin’ built-in buffet, okay!), to take care of and enjoy free and clear (that is, to essentially live rent free) and to pass on down the line (we’ll see how far we get—this is getting harder and harder to come by these days).

Ultimately, it is in the actions taken to preserve such a gift that your legacy is born and it is preserved in the stories that are told because, while you will eventually be dead and gone, your spirit will live on in those tales shared within the confines of those walls or about that one tangible piece of something that holds a memory for someone.

While I am certainly no saint and have nothing particularly great or profound to offer this world on a grander scale, I will leave this existence one day knowing that what little I could do with the cards life has dealt me was met with a greater purpose and intent—for my children (and maybe a scandalous-type tale or two to tell to the grandkids one day).

There you have it—my take on legacy, a piece of it anyway. Sleep tight.



My goal was never to spend much time writing about music or things related to it, and I don’t usually pay much attention to the WP daily prompts; however, one caught my eye as it posed a very intriguing question I could not resist. Oddly enough, I had just posted something in a “music” category—coincidence?

The question was this: “What sort of music was played in your house when you were growing up?” Instantaneously it brought a memory to the forefront that I will never forget. To the best of my recollection I was between the ages of three and five.

Looking back, my mother had a pretty decent record collection, but one particular record had the misfortune of spending too many days in the hot trunk of a car one summer. Despite being warped, it still played flawlessly; the needle followed along the grooves, never skipping a beat, round and round, slowly moving up and down in line with the wave of the warp.

Listening to that record evoked an awe in me. The music was about as rock-n-roll as it could get to a child that age, watching the record wobble and wave as the table turned was mesmerizing, and I was captivated by the album cover too. I stared at it, pored over every detail, and was fascinated by, and a little disturbed at, the idea that a man would walk barefoot across what I perceived to be hot blacktop.

The album: The Beatles’ Abbey Road. That was the type of music most played in my home as a kid.


Bassist in Purple

The closest one could ever be to a fly on the wall is as a concert photographer. I was in my mid-twenties the first time I ventured into that existence. It was an adventure. Coincidentally, the very first photos I ever took were, by pure chance, at First Avenue for a friend who happened to be playing that night with his band Alphatonic. Frankly, looking back, I don’t know what those guys saw in my photos from that night. But whatever it was they saw, and seemed to like, it only encouraged me to continue. For better or for worse.

Through the next several years, I had the pleasure of meeting and photographing a number of very interesting and talented individuals, and I often had the privilege of participating in shenanigans and/or just being in places others were not authorized to be. This allowed me to be privy to a lot more than the average fan. It also aided my role as a photographer: being an observer.

As an observer, I got to see how a band was born (periodically how they died or divorced), how they constructed their originals or deconstructed their favorite covers, how they prepared for a show, the tremendous amount of blood (often literally), sweat (definitely literally) and tears shed for a performance, as well as the tedious, meticulous and speedy precision required for setting up and tearing down gear.

There were lively and heated debates about various idols and inspirations—there were even trick questions you would be a blasphemer for daring to answer incorrectly (think Airheads and their question about Van Halen—yep, that happens in real life folks, the trick question that is, not the holding up of radio stations but that could be debatable if they’ve made some radio play on Loud and Local or KFAI—I jest, seriously, I scoff at you for even thinking otherwise).

However, the ultimate observation I was ever privy to was the emotion laid out before my eyes, emotion I don’t think they were even aware they had given me permission to see. I don’t know if any of them really knew how differently I saw them, and often it wasn’t through the camera lens.

I’ve heard thousands of origin stories—every musician has one and most are ready and willing to tell it to you (including me—but that’s for another time). However, regardless of who it came from or what level of local success they achieved there was one story that remained the same—they were nothing in terms of success unless and until they “made it.” Whatever that means.

Generally I think they meant a contract with a known label and prime-time radio play, but mostly I believe it was the desire to do what they really loved, above all else, and still be able to pay the rent. That is, not having to be on a Ramen noodle diet, not having to worry about whether the tour van is going to break down or whether they have enough gas to get to the next gig. Maybe even some competent roadies.

In all their tribulations, I didn’t see failure or mediocrity, nor did I hear the musical flubs they claim to have made on stage. What I did see is that they were doing it; whether or not they saw or believed it themselves, they were doing what they loved and making it work—every weekend (especially in the summer), occasionally on weeknights (even if they had a “real” job to go to in the morning). Most of them were well known within subcultures of the local music scene; some even had a label and did more extensive touring. Even if their larger crowds and popularity were mainly within the metro area, on home turf, they were still living a dream to me.

In large part, many of them became inspirations in various forms. I didn’t see the “local” in what they were doing—I saw rock stars. I saw intelligent, creative, talented and passionate people whose skills were highly coveted by many. And when they were finished with a performance I often saw caring, compassionate people whose only real façade was that they wanted you to think they didn’t give a fuck.

Sure, I’ve met my fair share of conceited musicians, and their level of technical skill is probably all that, but not always. Regardless, there is a certain intimacy to a band’s performance, and their music will just resonate with you in some way. Those I have photographed have all, at some point, made me wish I could step inside their shoes—to be able to play guitar or drums (more likely drums, because I have always had a thing for drums) as well as they do, and to be up on that stage, under the lights. While the experience of performing is sometimes surreal, the experience of witnessing that performance is often just as surreal for the observant observer.

Knowing someone in the band personally makes that experience much more real because it humanizes the performers, especially for those who are close enough they can reach out and untie the guitarist’s boot, knowing full well he can’t do shit about it in that moment and without fear of getting punched in the face later.

Hmmm, perhaps this is the origin of my origin story after all!

“Sleep tight for me, I’m gone.”


The Thinker

As with my previous musing, I have been thinking about this next thought for quite some time, and for anyone in my inner circle it is nothing more than a written version of the word vomit they were initially subjected to. That being said, considering how easily we can be persuaded and/or manipulated by associative advertising, I often wonder: can we really think for ourselves? I mean really think for ourselves? With all sincerity, I would love to believe we can, but I also have to wonder whether even the strongest of wills or minds among us are as independent in thought as we’d like to think we are.

Case in point: for as long as I can remember, families have had informal gatherings in the backyard with a charcoal or gas grill, where the male head of household typically puts flame to slabs of protein. It’s just what we do, as people, as Americans; however, this ritual is merely the product of a very successful ad campaign. Excuse me? Yes, that’s right, an ad campaign. But I don’t think most of us question this ritual or even consider its origins. I certainly never did—until a few years ago, when I was seeking remodeling ideas for my 1955 rambler.

In the post-World War II housing boom, builders scrambled to get homes built for returning soldiers and their families. The Depression was over and it was a new era: the general public no longer had to tighten the purse strings, the 40-hour work week was born, which allowed more time for leisure activities on weekends, and union contracts provided paid vacations. (A Remodeling Handbook for Post World War II Houses, pg. 5).

The ideal homes being built were ramblers, designed to enhance the pleasure of this new-found leisure time with modern electrical appliances, large backyards, and often a sliding glass door that led out to a patio, which had a grill, of course.

Because Americans apparently just didn’t know how to relax and have a good time, they had to be taught, and so, here in the Twin Cities, the Minneapolis Tribune published a magazine article every Sunday entitled “How to Have Fun in Your Backyard” (1952). It is from this that we get the concept that Dad, or at least the male head of household, should be cooking steaks outside on the grill—because it’s “manly.” This idea, along with the modern appliances, was used to sell homes as “Futuristic” and “Very Different.” It was later defined as the epitome of the “American Dream” (A Remodeling Handbook for Post World War II Houses, pg. 5).

Sixty-four years later, we still participate in what is now a custom (with or without the rambler-style home). I am NOT saying this is a bad thing. Hell, I quite enjoy firing up the grill, hanging out on a nice summer or fall day (preferably when the mosquitos aren’t rampant), and letting the kids play in the comforts and convenience of my own space. What I am saying, however, is that this idea, as wonderful as it is, was not our own as individuals. It was an idea developed in a very smart advertising campaign designed for the sole purpose of selling homes. It was a way to get people to associate a relaxed way of life with that style of home and large yard. This movement was so successful it is the reason the Twin Cities suburbs grew as they did.

Just when you thought you were smart enough to avoid being manipulated by the advertisers (is the sarcasm, in Jaws-like fashion, dripping off the page yet?), you realize you’ve been living an ad campaign your entire life. Perhaps the question isn’t so much whether we can physically think independently, but whether we can escape manipulation or influence from outside sources long enough to truly think independently.

Unfortunately, I am not sure that we can. Perhaps this is where my “disappointed idealist” turned cynic comes into play. Because I certainly used to think I was smart enough not to be fooled—turns out I have been—on a number of occasions and to various degrees. And, I’m not the only one (so, this fact doesn’t mean I’m not nearly as smart as I thought—or maybe it does).

For as many times as we claim we can see through the barrage of advertisements, and claim that if, and only if, we are truly interested we will seek further information to make an “informed” choice, there are plenty more things we have bought into because we were persuaded in some way to do exactly what they wanted us to do. There are many products we use in everyday life (e.g., deodorant, teeth whiteners, certain prescription drugs, cosmetics and/or astringents) only because the need was manufactured. And somewhere along the line from that 1950s new-leisure era, we became consumers who bought the fabricated “American Dream” they sold, and we practice it religiously.

Or, perhaps this is just the midnight rambling of a mind that, much like Jerry Maguire’s ill-gotten mission statement, was up too late and/or ate bad pizza. Tonight, this is as good as it’s going to get. “Sleep tight for me, I’m gone.”



Nothing chaps my hide more than realizing I have been lied to in a big way. Even worse is when I realize I allowed myself to be duped. Everyone lies to some degree, whether to themselves or others; however, the central purpose here is neither to debate how honest we are or are not, nor the morality of truth-telling among individuals, nor to define the small lies we all tell, but to point out ways in which we are lied to on a grander scale—more specifically, by companies through advertising.

My husband is currently taking a college course in advertising. While discussing a particular topic in class, he remembered that one subject related to business advertising got me more fired up than usual (he was such a sport, listening to my discombobulated introspection and participating in some pretty thought-provoking discussions while I was finishing up my bachelor’s degree in philosophy). That subject is the act of a company and its advertiser fabricating a need they suggest you really ought to be worried about, but not too worried, because they can provide you with a solution: their product. It generates a lot of revenue and is very successful.

I realize that blogs have rules and that typically a blog post is no more than a page, maybe a page and a half; unfortunately, my thoughts about the ethics of advertising cannot be so condensed (hopefully I haven’t lost you to something more interesting already). Because I tend to be a bit rebellious (as I age this is becoming less so, but she still fights for existence), I’m going to post the term paper I wrote about associative advertising aimed at the sexes, which defies the norm for this format due to its length.

Please note, the below was written in approximately 2011, and since then Dove has launched a new ad campaign in which they appear to be attempting to be a socially responsible company by encouraging women to appreciate their bodies as they are. Please also note, as I wrote the below I was just coming off of a very intense legal program, and so my mind was geared towards the legality of things. Good luck.


I will be discussing associative advertising used to market products and boost profits and will specifically present a case study in which women have been shamed and manipulated into believing they should have certain insecurities about their bodies but have a way out of feeling those insecurities—buying a particular product. My position on this subject is that this type of advertising is unethical; however, I will discuss aspects of variables that come into play with advertising in general, such as legality versus the morality of said advertising and seemingly ethical messages used to persuade social thinking. I will first summarize the case study and then discuss the main ethical issue of losing autonomy. While I will focus mainly on the effects on women, I will briefly discuss the tactic used on men as in today’s society both sexes are being targeted.

Advertising Cures for Unrealized Insecurities

On April 14, 2011, Slate Magazine posted an article, “The Cure for Your Fugly Armpits,” that shed light upon the advertising tactics of various companies marketing products to women, in which they targeted various female body parts, created something women should be worried about, and then offered them the cure: their product. The most recent case in point is Dove, a company widely known for marketing hygienic products such as soaps, moisturizers and deodorant.

Dove’s new ad campaign suggests to women that their armpits are not smooth enough and that the unsmooth underarm is something they should be worried about: “If it’s news to you that this part of your body is not so hot, Dove says you’re in the minority, citing a survey in which 93 percent of women said they ‘think their underarms are unattractive.’” (Copeland, 2011). Dove is not the only culprit in inciting irrational fears in consumers. In fact, it is a historical concept (going at least as far back as 1920, according to the article) wherein manufacturers have led women to believe they would be old maids; that they were not beautiful enough to hang on to a man, either because they were too flawed or because another, more beautiful woman (utilizing the product, of course) would steal him right out from under them; or that their duty to please (i.e., have sex with) their man was or would be hindered by the very flaw the manufacturers were ultimately selling a cure for.

Often the companies will create an imaginary condition that must raise concern—Listerine, for example. Listerine began as a “surgical antiseptic,” but in the ’20s the company hired a chemist who claimed it could be used as a mouthwash; thus the term “halitosis” was born, giving rise to the fear of bad breath by echoing an official-sounding medical term that doesn’t really exist. Their ad copy coined the old adage “always a bridesmaid, never a bride.” Lysol, on the other hand, an antibacterial spray now used to disinfect the air and various surfaces, was once marketed to women for a totally different usage: birth control and douching. At this point, not only did women have to worry about having flawless skin and faces to secure their beauty, and about germs in the mouth causing “halitosis” and its foul odors, but they now also had to worry about (1) the responsibility of preventing pregnancy and (2) thinking their intimate parts were hideously smelly, with severe consequences: “The unfortunate truth was that as a contraceptive, Lysol was ineffective, not to mention dangerous. Improperly diluted, it burned and blistered the vagina, and in some cases even caused death.” (Copeland, 2011)

In today’s society, it’s not just women being targeted with false insecurities and cures for them. The recent Old Spice commercials, for example, suggest that a man need worry his woman will run off with a more muscular, attractive, better-smelling man, and that while he will never look like the guy on television, the smell of the cologne will sweep her off into exotic fantasies of such a man while she remains faithful to him. And let’s not forget Viagra, which markets itself to older men by claiming a man must always be ready to have an erection at any given moment, and to prolong that erection, so as to sexually satisfy his female partner because that’s what she really wants—essentially suggesting that he is inadequate without their “cure.”

What’s This All About?

All of the above are perfect examples of what John Waide considers “associative advertising” wherein a particular company persuades the general public, regardless of whether it is their target market, into believing that if they purchase product “X” they will smell better, look better, be popular, sexy, etc. This tactic works by showing the viewer images unrelated to the product itself which allows the viewers’ unconscious mind to “associate” the product with that image—the beach and a Corona beer, for example. The viewer will unconsciously perceive that drinking a Corona beer will be as relaxing as sitting on a warm, sunny beach without the advertiser making any concrete claim to it.

In the case of marketing particular beauty and/or hygienic products to women, the immorality of the advertisements comes from a company looking to increase its sales with little care for its consumers, such as for potential physical and/or emotional harms. In return, the consumer gets to feel insecure about her body and appearance and eventually, if not immediately, to lose her autonomy when peer pressure to look or smell (or not smell) a certain way sets in. Associative advertising is not strictly limited to marketing a product—it can be an effective tool to “market” an idea, such as not drinking and driving, not texting while driving, or the idea that smoking cigarettes is extremely bad for the smoker and everyone around them, by associating images of death and/or destruction. What makes this effective is that it propagandizes people into making social change (e.g., people sneer and make offensive comments to others because the behavior or look is no longer socially accepted).

The criteria for associative advertising are set out in such a way that “the advertiser wants people to buy….largely independent of any sincere desire to improve or enrich the lives” of said people, “identifies some…deep-seated non-market good for which the people in the target market feel a strong desire,” such as “friendship, acceptance and esteem of others,” for which the “desire for the non-market good is intensified by calling into question one’s acceptability,” like hygienic practices, and often the consumer receives only “partial satisfaction to the non-market desire.” The case study in which women used Lysol, which in the end physically harmed their intimate parts and was not effective as birth control, despite the advertiser’s claims, is a perfect example of a company having no compassion for its consumers, and, unfortunately, “it is quite common for advertisers in the U.S.A. to concentrate their attention on selling something that is harmful…” (Waide, 1987)

Legal vs. Ethical

The mere fact that a company such as Lysol sold a product under a false premise (i.e., that it was an effective contraceptive and douche) should be appalling enough. Major pharmaceutical companies push drugs by inciting fear in the public, making them insecure about a minute anomaly and vague symptoms, which then gets people to pester their doctors (or doctor shop) until they get the new drug, a “magic” pill that, as it turns out, caused more harm than good and in some cases even caused death; hence the numerous class-action lawsuits. However, one man’s morality is not another’s, and in this country what some (or even all) consider unethical or immoral may not necessarily be illegal, nor would we necessarily want it to be.

Advertising is a form of speech and each company has a legal right to inform potential and/or existing consumers about their product. If we believe the government is not in the business of “protecting us from ourselves,” which is essentially handing over our autonomy to it, we cannot make associative advertising illegal, despite the immorality of it. We can, however, hold companies legally responsible to be more honest in their advertising by invoking basic contract law in that if a company makes a claim that product “X” will do “Y” and doesn’t then they have breached their implied contract by not delivering a “good according to the stated terms.” (Machan, 1987) Unfortunately, companies have good attorneys in their pockets and can find a way around this by making implicit suggestions rather than an outright statement or promise of said good, thus I don’t think combating the advertisers via contract law is sufficient.


One should wonder whether today’s definition of beauty, our idea of how life should be lived in general, and our hygienic concerns (i.e., whether we think certain things stink because we are conditioned to believe so) are all the product of a long campaign of associative advertising. Considering the high rates of plastic surgery (ranging from simple Botox injections to major surgical lifts and tucks), I would venture a guess that they are—that associative advertising is defining who and what we think we are, which is not autonomous at all, because it is the advertisers making those definitions for us, and it is no easy task to risk becoming a social outcast by choosing not to adopt the concepts as advertised to us. And because associative advertising dictates definitions of who we ought to be with its products or ideas, along with changing socially accepted ideology, we have little to no choice, which makes the practice unethical despite its legality.


Part I—What is a Cynic?

George Carlin-Cynic

Today, when people think about the word “cynic,” they typically understand it to be negative, as it is often used with that connotation. To some degree the accusation has merit. According to Merriam-Webster, the modern definition of a cynic is:

“a person who has negative opinions about other people and about the things people do; especially: a person who believes that people are selfish and are only interested in helping themselves.” Or, “a faultfinding captious critic; especially: one who believes that human conduct is motivated wholly by self-interest.”

However, it is also a Greek philosophical ideology which holds that:

“Virtue is the only good and that its essence lies in self-control and independence.”

Ultimately, cynicism is the idea that one must control various appetites and emotions (often lumped into “appetites,” but commonly referred to as “passions”) in order to attain the appropriate level of various virtues. The definition of virtue is often misconstrued and very convoluted itself. For the purposes of this particular blog I will not address what virtue is—otherwise, I will be up all night—we’ll save it for another. Being a cynic also means that, those who embrace the label believe that human beings are motivated by self-interest (the merits of which are also very debatable on its own). While it may seem entirely negative, it is not as a whole. A cynic understands the humanistic flaws we all carry. We humans are not perfect, nor will we ever be. In understanding our faults, the cynic can see the holes in certain theories where others can be blinded by their own confirmation biases. That does not mean, or even that I believe, that a cynic is more correct than any other particular way of thought. I merely point out that, while others see ALL negativity, others see it as being more realistic—embracing the fact that we have biases, that, for example, we cannot possibly know the “mind” of God because we, ourselves, are not “all knowing” (which is 1/3 of the definition of how we define the Almighty), and choose to focus on what we, as an individual actually have control over. Now, in the wake of political furor as of late I have seen a number of people post things online that speak to the idea that one ought to “think for themselves” rather than letting mass mainstream media dictate to them and have made accusations that those who do not do this are “sheeple.” That sentiment alone speaks to the idea that “Cynics [seek] self-sufficiency and [reject] the social and religious values of civilization. Group thinking is thought of as herd-thinking.” (emphasis added). 
If that thought or idea has been in your head or come out of your mouth, well then: how very cynical of you!

Unfortunately, cynicism is a little more complex than what I have described, because there are branches of it, sort of like spinoffs from your favorite television show; the further down the line it goes, the more complex and distorted from its original path it becomes. For example, some hold that seeking out individual pleasures that are not harmful to others, even if fraught with self-interest, is a good thing, and if it is good then it is virtuous. This falls closely in line with cynicism, but because it can be argued that this thought removes the “self-control” piece, it becomes hedonism instead.

I have often been accused of being a cynic and have decided to embrace it. I do value my independence immensely, and I strive for self-control over my passions and appetites (some days this is easier said than done, especially when it comes to CHOCOLATE!). However, as mentioned above relative to complexity, I am not wholly a cynic all the time in everything. I confess, there is a hopeless romantic lurking in me somewhere.

My father once told me that a “cynic is a brokenhearted idealist.” I do not know whether that idea is wholly his own or borrowed from someone he was quoting. Regardless, I think he may be on to something. For if a certain ideal is held closely in the heart and mind of an individual, it could quite possibly be heartbreaking, especially for a hopeless romantic, to realize that the ideal is not what it seemed and/or that the world just cannot live up to its expectations.
That includes the idea that “Virtue is the only good and that its essence lies in self-control and independence.” Consider that humans, now and through the ages, have had an extremely hard time defining and agreeing upon what virtue is, and seem to redefine it as it suits us in a given time (my long-held belief is that truth is only a matter of dates, but I digress). We are self-control challenged every single day (anyone who denies struggling to control various passions and appetites is lying). And we don’t always know what independence really is, or what to do with it (independence tends to be in the “eye of the beholder,” that is, subjective). So it would make sense that a “brokenhearted idealist” would become more negative in their reasoning upon realizing that their idea of what is good, moral, ethical, or virtuous turns out to be something else entirely.

As the saying goes, “if the shoe fits.” And so with cynicism: just as I wear many hats, I have many shoes, and I will wear it. With that I bid you goodnight. “Sleep tight for me, I’m gone.”

P.S. Apparently, after a bit of research, it was George Carlin who claimed cynics were disappointed idealists. I think “brokenhearted” is more fitting.

Don’t forget to find me on Facebook and “like” my page.