The New Birds of Prey in Modern Retail: A Sector of Dysfunctional Organizations

In a dysfunctional organization, the shared pathology fortifies its defense mechanisms with an obstinacy that appears rock-solid. Lines such as “Unfortunately, the product cannot be returned” can be seen as part of an eggshell that seems durable until it is cracked open. By analogy, cracking the egg entails parsing the sort of lines typically dished out to outsiders. Let’s take a look.


First, the word cannot is used incorrectly in the sentence, for a store’s return policy is not a law; whereas a business is subject to a law, no such requirement pertains to a company’s own policies. To treat the latter as tantamount to laws is essentially to vaunt the self-importance of the company, the store, and even the employee enunciating the policy-law. “Your policies are not laws; of course you can make an exception, even if your supervisor is the person who can make it.” The word cannot serves the organizational dysfunction by giving the impression externally that the company is more than it is (i.e., a state of sorts with its own laws). At the employee level, the device is essentially a power-grab—a distended or exaggerated urge to control others being a typical symptom of insecurity born of underlying weakness.


Substituting may not for cannot gives the customer at least an implied opening that, from the standpoint of the organizational pathology, could prompt him or her to push through the defense mechanisms and trounce the core weakness. Were an “upper” manager even to consider such a linguistic change, an underling would likely contend that every customer would be returning items. The fallacy in this reasoning goes by the wayside in the machinations of the organizational dysfunction. Indeed, ignoring or dismissing logical fallacies is itself one of the defense mechanisms! Thus, arguing with such an employee or manager is apt to be an exercise in futility.


Notice the “exchanges accepted” instead of “we accept exchanges.” The passive voice extends to “are revised” and “are shipped,” suggesting an underlying mentality of weakness. Also, the forcefulness of the “must” (as one might expect of a law) is belied by the inherent subjectivity of “mint condition.” Finally, the blood-red color, as well as the black background color, suggests a certain passive-aggressiveness in line with the use of “must” for what is actually a policy (rather than a law). (Image Source: omgmiamiswimsuits.com)

Second, the use of the passive voice also hints at the underlying weakness in the dysfunctional organization because the action is sidestepped. “We do not accept returned items” highlights the action (i.e., the refusing) of the company’s employees and managers, whereas “cannot be returned” omits the actor entirely! The latter phraseology matches the insecurity that naturally manifests out of weakness. Besides implying that the power of the actor to act is in some way compromised or enervated, the passive voice hides the actor and thus protects him or her from being confronted. The underlying fear here is that a head-to-head clash would not end well for the actor, given his or her own and shared (i.e., organizational) pathology.


Third, just as the choice of the word cannot and the use of the passive voice both involve a manipulatory fabrication (i.e., lying with a hidden agenda), so too does the addition of the unnecessary adverb, unfortunately. For the stealth actor (i.e., the non-supervisory, non-managerial employee), the policy is hardly unfortunate; otherwise, the policy would not be “on the books.” Even if the adverb is intended to refer to the customer [that is, even if the policy’s formulator or the employee mouthing it is referring to the customers]—as though the fact that the customers would have to keep ill-suited products were unfortunate—the policy itself indicates just how much its formulator and implementers really sympathize, especially since the formulator can change the policy. In other words, the use of the word is a lie designed to give the customer the false impression that “the store” really cares and that unfortunately the policy cannot be changed (and by whom?).


Even the tactic itself in such a line is a lie, in that the pathogens are utterly unwilling to tolerate the very same tactic directed back at them. A customer wanting more than a glimpse of the sickness need only reply, “Unfortunately, any refusal to accept back the deformed item will have to be turned over to small-claims court.” Even though the particular employee would have no involvement in such legal proceedings, and thus no rational reason to bristle at the customer’s stated policy/law, he or she would be too accustomed to dictating terms to customers to let that privilege lapse without at least a spike in anger and an attempt to regain the upper hand. “You are free to do so,” an employee might retort, as though he or she were granting or allowing the freedom. In fact, the implication is rather arrogant, again as if the company were akin to a state rather than a mere counter-party in a commercial exchange.

The hidden agenda becomes apparent once one realizes that the statement is duplicitous, or redundant, as the customer obviously already knows that he or she has the liberty to sue. The employee knows this, of course, either consciously or unconsciously, and is not really informing the customer that he or she can go to small-claims court. “I don’t need you to tell me I can do what I’ve already told you I know I am free to do,” an astute customer might retort in turn.

The exchange of words is really a control battle stemming from an organizational pathology’s attempts to defend itself against potentially interloping intruders. The threat is of course overstated—hence the exaggerated intent to “batten down the hatches” to weather the perceived storm. Yet the “new birds of prey”—Nietzsche’s label for those among the weak who cannot resist their urge to dominate (even the strong)—are not content merely to defend, for they must have the upper hand in order to feel sufficiently protected.


Social Media Marketing: The Social Element as an End in Itself

In the religious domain, some people struggle with the inherent incongruity of acting selflessly while believing that the righteous are rewarded in heaven. Resolving this oxymoron in practical as well as theoretical terms may come down to “one hand not knowing what the other hand is doing.” Whether innate or a “learned skill,” disentangling a practice from any hope of reward can be applied to social media marketing. This application is easier said than done, especially in a culture of greed saturated with opportunism, where seizing every opportunity is a strong norm and custom. Indeed, the underlying question may be whether a “strong personality,” once well-ingrained, is able, not to mention willing, to “park itself out back” even if only for a much-needed break.


Gary Vaynerchuk, the author of several books on social media marketing, preaches a two-step approach, which can be characterized as the marketer becoming a native on whichever (social media) platforms he or she operates and then consummating the (ultimately) desired transaction. Ideally, the selling fuses with becoming a native (or being recognized as one), so the two phases are “phased” into one.


Crucially, being able to come across as a native is not the same as “going native.” Whether in business or government, putting up a front in order to be perceived by the masses as one of them is not the same as being one of them. Even though Vaynerchuk emphasizes the need to respect the nuances of a given social media platform (e.g., values and mannerisms), he may be interpreted by some readers as maintaining that presenting the appearance of respect is sufficient to “become” a native, at least for marketing purposes. In other words, a marketer need not “go native”; going through the motions is sufficient as long as the other participants believe that the entrant is satisfying their social or informational objectives.  


Unfortunately, learning how to come across as a native in sync with a platform’s distinctive “cultural” mores and norms may be too short-sighted not only in terms of “going native,” but also in terms of achieving marketing objectives. In fact, the approach itself may be too self-serving—too close to those objectives—to render the marketer a native. Positing a distance between engaging the social element and being oriented to making the sale, Vaynerchuk advises that in contributing to the social or informational dialogue at the outset, “you’re not selling anything. You’re not asking your consumer for a commitment. You’re just sharing a moment together.”[1] The experience shared is essentially an end in itself, eclipsing any further motive, such as selling a product or service.


It may seem rather strange to find a marketer oriented to exploiting any opportunity “just sharing a moment together” with electronic strangers as an end in itself; serial opportunists in an enabling cultural context are used to treating other rational beings as mere means at any opportunity, rather than as ends in themselves (i.e., violating Kant’s Kingdom of Ends version of his Categorical Imperative). Vaynerchuk may undercut his own depiction of the shared experience as sufficient unto itself by reminding his readers that the “emotional connection” they “build through [participating in social or informational dialogue without selling, so as to “become” a native] pays off [when they] decide to throw the right hook [i.e., make the sales pitch and consummate a transaction].”[2] With such a payoff in the offing, I doubt that a marketer oriented to “maximizing” any opportunity to tout, brag, or hard-sell would just share an emotional connection in a moment without being motivated by, or at least mindful of, the hidden agenda.

A “stakeholder model” approach to social-media marketing. This framework is inherently self-centric, whereas a web-like structure would be more in line with “shared experiences.” Both frameworks are distinct, and yet can be managed, or related, such that neither encroaches on the other unduly.
(Image Source: irisemedia.com)

As difficult as it may be for a marketing personality simply to share a moment with another human being—especially a stranger narrowly glimpsed through electronic means—Vaynerchuk rightly situates the feat as a requirement for “going native,” and thus, ironically, for being able ultimately to make the sale. In the context of authentic social and informational reciprocity on a given social-media platform, a wax figure easily stands out. Even so, all too many marketers come across as stiff, or contrived, in social media, as if self-centeredness and lying advance rather than detract from sales. Hence, I suspect that a rather intractable marketing personality, and a related and thus enabling culture such as that of American business, stand in the way of business being able to fully integrate social-media marketing.[3]

Just as it is difficult to fall asleep while still trying to do so, marketers have trouble not letting their marketing hand know what their other hand is doing. At the very least, managers overseeing marketing would need to permit and even encourage the marketer(s) tasked with social-media marketing to spend time online without worrying about having to sell anything (even oneself). In hiring such marketers, managers ought to highlight rather than sideline those applicants who enjoy being on a social media platform.



[1] Gary Vaynerchuk, Jab, Jab, Jab, Right Hook (New York: Harper, 2013), p. 22.

[2] Ibid., p. 23.

[3] The fixation on using any opportunity to sell one’s wares is exemplified by CNBC’s Jim Cramer’s choice of response when another host mentioned on March 14, 2014 that Jim had worked that weekend at his restaurant. Rather than share the moment by regaling his colleagues and the viewers with a tale of something enjoyable from his weekend at the restaurant, he remarked, as if by script, that he had worked that weekend because “we were trying out a new chicken sandwich” and a new drink. The sheer contrivance belied any semblance of authentic passion, as might be realized in relishing simply being in his restaurant (e.g., the atmosphere) and later telling people about it instead of selling as if it were an end in itself. Underneath the obsession with getting as much as possible out of any opportunity is greed, a motive and value that knows no limitation. Ultimately, it is a well-worn groove that keeps marketers from “going native” and thus from being able to fully inhabit social media.


Meteorology vs. Astronomy: Is It Spring Yet?

Advancing clocks an hour ahead to Daylight Saving Time conveniently announces itself as the easy-to-remember Spring Forward. Advancing democracy in the Middle East in the early years of the 2010s proclaimed to the world the Arab Spring. Advancing global warming foretells earlier springs encroaching on softened winters. Yet even as spring blooms in the sights of the popular press, the media, quite stunningly, often stumbles over when the season begins. The groundhog is no help, differing from year to year on whether spring begins four or six weeks from February 2nd. Astonishingly—and in no small measure my impetus in writing here out of no less than dumbfounded curiosity—even television meteorologists play to the popular ignorance, willingly succumbing to the common practice of taking astronomical “spring” as meteorological spring too. The “professionals’” declaratory tone alone reveals just how certain human beings can be even of presumed knowledge lacking any real foundation. Sadly, this mentality of assertion, having become so widespread in modern society, is virtually invisible to us; and yet the shrillness of the epistemological missionary zeal reverberates from no less than modernity’s default: faith in one’s own use of reason. In this essay, I present the first day of spring as a case in point rather than make the entire argument.


Sometime during the first week of March 2014, as yet another front of frigid Arctic air charged southward through North America, various weather personalities on television newscasts reveled in the apparently startling fact that spring was just two weeks away. Viewers looking out at snow-covered landscapes as far south as Kansas City could marvel at the return of nature’s colors and smells so soon. Most years, the grass there is green by the Ides of March.


Even as the popularly broadcast juxtaposition made for good copy, meteorological spring in the Northern Hemisphere had already come—that is to say, with regard to weather and climate. According to the U.S. National Oceanic and Atmospheric Administration, “(m)eteorologists and climatologists break the seasons down into groupings of three months based on the annual temperature cycle as well as our calendar. . . . Meteorological spring includes March, April, and May; meteorological summer includes June, July, and August; meteorological fall includes September, October, and November; and meteorological winter includes December, January and February.”[1] Therefore, by the first week of March 2014, spring had already arrived as far as the weather is concerned, even as television meteorologists were publicly pointing to March 20th as the first day.
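To make the NOAA grouping concrete, here is a minimal sketch in Python (my own illustration, not part of the original essay; the function and dictionary names are assumptions) that maps a calendar month to its Northern Hemisphere meteorological season:

```python
# Meteorological seasons per the NOAA grouping quoted above:
# three whole months per season, keyed to the annual temperature cycle.
MET_SEASONS = {
    "spring": (3, 4, 5),    # March, April, May
    "summer": (6, 7, 8),    # June, July, August
    "fall":   (9, 10, 11),  # September, October, November
    "winter": (12, 1, 2),   # December, January, February
}

def meteorological_season(month: int) -> str:
    """Return the Northern Hemisphere meteorological season for a month (1-12)."""
    for season, months in MET_SEASONS.items():
        if month in months:
            return season
    raise ValueError(f"month must be 1-12, got {month}")

# Any date in March, even March 1st, already falls in meteorological spring.
print(meteorological_season(3))  # -> spring
```

By this convention, every date in March, including March 1st, already counts as spring, no matter where the Earth happens to be in its orbit.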

Even calling so much attention to the first day, as if the northern climes of the contiguous United States would suddenly return their fauna and flora to their other half-lives on that day, is horribly misleading. Assuming that the meteorologists were well aware that spring weather data begins on March 1st of each year in the U.S., the next-most plausible explanation may be found in the lazy assumption that it is easier to go along with a popular misconception than to expend the effort to stare one in the face and overcome its stolid inertia head-on (the excuse being not wanting to cause confusion).


As a result, Americans are left with the incredibly incongruent “expert” assertion that summer begins not with Memorial Day, but just a couple of weeks before July 4th, on June 21st of each year. Essentially, we are to believe that summer begins in the middle of summer! That such a logical and experiential absurdity can long endure in spite of evidence to the contrary is itself evidence of just how much cognitive dissonance human beings are willing to endure in the face of declarations from perceived expertise. In other words, an erroneous or outdated status-quo societal default has tremendous hold even in the age of (rationalist) Enlightenment (i.e., from the fifteenth-century Renaissance period).


Lest it be said that the enabled popular misconception arose spontaneously, ex nihilo, the basis of the confusion lies in the rather stupid decision to apply the names of the meteorological seasons (i.e., fall, winter, spring, and summer) to the four quadrants of the Earth’s orbit around the sun. Whereas the meteorological seasons are based on the annual temperature cycle applied to the calendar, “astronomical seasons are based on the position of the Earth in relation to the sun.”[2] Due to the tilt of the planet, solar energy in the Northern and Southern Hemispheres is maximized in different parts of the planet’s orbit. To label a certain interval of the orbit as “spring” is not just highly misleading; the label is a category mistake, for the climatic seasons on Earth do not exist in the void of space.[3]


Astronomy is distinct from weather, even though the two are related (i.e., not disparate).
(Image source: NASA)

Put another way, astronomical “spring” in the Northern Hemisphere refers to the portion of the Earth’s orbit from the point at which the sun’s vertical rays hit the Earth at its equator (on the “spring” equinox, usually on March 20th or 21st) to the point at which the vertical rays are on the Tropic of Cancer (the furthest north the vertical rays go, on the “summer” solstice, usually on June 21st). In fact, the summer solstice is better understood as the high point rather than the beginning of summer. That is to say, the sun reaches its highest arc in the northern sky on June 21st, which is neither the pinnacle nor the beginning of summer in terms of temperatures.[4]
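The astronomical convention, by contrast, hinges on orbital dates rather than calendar months. The sketch below is again my own illustration; the 2014 boundary dates are approximate assumptions, since the actual equinox and solstice shift by a day or so from year to year:

```python
from datetime import date

# Approximate 2014 boundary dates, assumed here for illustration only.
VERNAL_EQUINOX_2014 = date(2014, 3, 20)
SUMMER_SOLSTICE_2014 = date(2014, 6, 21)

def in_astronomical_spring_2014(d: date) -> bool:
    """Is a 2014 date within astronomical 'spring' in the Northern Hemisphere,
    i.e., from the vernal equinox up to (but not including) the summer solstice?"""
    return VERNAL_EQUINOX_2014 <= d < SUMMER_SOLSTICE_2014

# Early March 2014: meteorological spring has begun, astronomical "spring" has not.
print(in_astronomical_spring_2014(date(2014, 3, 5)))   # -> False
print(in_astronomical_spring_2014(date(2014, 3, 25)))  # -> True
```

Run on a date in early March, the two conventions disagree: meteorological spring has already begun, while astronomical “spring” is still weeks away, which is precisely the gap the television meteorologists paper over.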
 

In short, the piercing pronouncements on the public airwaves of the beginning of spring (and then, three months later, of summer) ring hollow. Nevertheless, the meteorologists who trumpet the good news do so year after year, as if deer caught in a car’s headlights (or speaking to such deer!). Perhaps the fix is as simple as renaming the four parts of the Earth’s orbit so they are not likened to climatic seasons. The puzzle would doubtless still present itself as to how nonsensical claims can long endure as a societal (or even global) default, taken for granted in a way that strangely wards off reason’s piercing rays and those of our own experience. Something is oddly off in how human beings are hard-wired.



[1] National Climatic Data Center, “Meteorological Versus Astronomical Summer—What’s the Difference?” National Oceanic and Atmospheric Administration, June 21, 2013 (accessed March 9, 2014).

[2] Ibid., italics added.

[3] As another example of a mislabeling that could have been expected to trigger much confusion and even false claims, the three law instructors from Harvard who founded the law school at the University of Chicago at the beginning of the twentieth century should have known better than to replace the name of the bachelor’s degree in law, the LL.B. (i.e., the bachelor of laws), with a name implying a doctorate (the J.D., or juris doctor). The actual (professional and academic) doctorate in law is the J.S.D., the doctorate in juridical science, of which the LL.B., or J.D., along with the LL.M. (master’s), is a prerequisite and thus not possibly a doctorate in itself. A doctoral degree must be the terminal degree in a school of knowledge, have comprehensive exams in a discipline of said knowledge (graded by professors rather than an industry regulatory body), and include a significant work of original research (i.e., a book-length study, except in a quantitative or scientific field) that the candidate defends before a committee of faculty. Yet how many Americans correct an American lawyer who declares himself to be a doctor? The same goes for the M.D. as well (a program of survey courses followed by a couple of years of seminars—the typical substance of a bachelor’s program), and yet how many physicians and surgeons presume themselves entitled to use the doctoral title (Dr.) even as they dismiss the valid appellations that holders of the Ph.D., J.S.D., D.Sci.M. (doctorate in the science of medicine), D.B.A. (business), D.D. (divinity/theology), and D.Ed. (education) use as per the rights and privileges of these doctoral degrees? Meanwhile, the general public goes on grazing as if the snow were green grass.

[4] The word solstice in English comes from the Latin word solstitium, which combines sol (sun) and stit- (from sistere, to make stand). In other words, the sun is made to stand (highest) in the Northern Hemisphere on June 21st of each year. Nothing is thus implied about any beginning; rather, the implication is that of a pinnacle or high point. Yet even in this sense, meteorological summer is different, for its high point in terms of temperature comes in mid to late July.
