As I have stated in an earlier blog post, and as anyone who knows me will attest, I am a voracious consumer of periodicals. My Esquire habit is legendary. I have maintained long-term relationships with a host of other magazines. I enjoy sitting down to the Sunday paper printed on actual paper. I never met a newsstand I didn't like. And now, as the Internet progresses through its second decade as a mainstream media outlet and the American economy grinds down on consumers, advertisers and publishers, the roster of periodicals is shrinking at an alarming pace, with some venerable names among the casualties. Last week, my postal carrier delivered the final edition of Gourmet. Over the past month much has been made of the demise of the grande dame of culinary periodicals. One of the more recent articles on the subject can be found here: http://www.pbs.org/mediashift/2009/10/did-the-web-kill-gourmet-magazine299.html#comment-161299.
The author makes a number of very valid points regarding Gourmet's dwindling utility: more home epicures are downloading recipes straight from the net, printing them out for use or even bringing their laptops directly to the counter; the recipes themselves were not practical for more austere times and tastes; the net provides multiple opportunities to seek advice and information from specialists within narrow culinary fields; and the companion website(s) didn't fulfill consumers' social media needs as well as other sites do. I don't disagree with any of these points. Apparently Conde Nast didn't have any answer for them either; ergo, “au revoir, Gourmet.” But I don't believe that Gourmet's fate should be viewed as a sign that print magazines are best consigned to the recycling bin of history.
I am adapting to the digital age. I have the Times Reader installed on my laptops. I read any number of other periodicals that have taken the time and energy to create Blackberry-legible sites. I have flirted with the notion of owning a Kindle, Nook or whatever Plastic Logic decides to name their e-reader. I readily admit that the delivery of information is greatly enhanced by many of the features available on Web 2.0. Hyperlinks, embedded video, RSS and sharing via social media are all, to borrow a term from my 80's post-adolescence, awesome. They bring a vividness to the consumer that classical print media will never be able to replicate. However, electronic media's eminently more environmentally friendly and efficient consumer experience is also the source of one of my biggest problems with it. When it's off your screen, there's no telling whether it will ever be seen again. Sure, a number of sites archive their materials, but it's just not the same as being able to look back at an old magazine (for those like me who happen to have separation issues with their back issues), where you always know the article is going to be there.
And then there's the issue of the ads. So long as web-based publications remain (largely) free to the consumer, there will be ads. But these ads are generally designed to be portals – eye-catchers intended to entice the consumer to click through to the advertiser's main site. Magazine ads are forced to make their entire sales pitch on the page, leaving little for the consumer to do other than invest a few seconds perusing whatever the message happens to be. Therefore, these ads are reasonably calculated to capture the consumer based upon what the advertiser (and the good folks at Sterling Cooper or whoever has their account) believe is important and engaging at that particular moment in time. Accordingly, print ads are like snapshots of society, and magazines are the shoebox in the closet that holds them all, even if it's only dragged out at Thanksgiving.
Can print media be scaled back? Certainly. There are many areas of opportunity for information to be dispensed in a more efficient, enviro-friendly manner. Does this warrant a clarion call for the entire print media industry to move toward a digital format? Absolutely not. Magazines and other forms of print media are too valuable a history-freezing asset to discard in their entirety. The challenge, of course, is to preserve certain forms of print media in such a way as to justify the expense of production as well as the value to the consumer, who may be forced to pay a significantly higher premium (no more $8 annual subscription rates for a magazine that retails for $4.95 an issue at the newsstand) for the privilege of portable media that doesn't require a battery or an Internet connection. Magazines are windows (note the lower case “w”) onto society which, like any other classic architectural element, deserve some degree of preservation as technology paves the way for new and more sophisticated structures.
Tuesday, October 27, 2009
Thursday, July 23, 2009
While you were out shoe shopping, Amazon . . .
This has me excited: http://twurl.nl/cb2e0o
As my lovely wife will readily attest, I have been flirting aggressively with, if not blatantly lusting after, a Kindle. As with most gizmos in their early releases, I have maintained a disciplined approach – waiting until the technology moves a little bit higher and the price points a little bit lower. The rollout of the Plastic Logic Reader, however, may put that discipline to the test. What excites me most about this latest iteration of the e-reader is not just its ability to present PDF documents in a handheld form far superior to the microdocs I can view on my Blackberry, but its promised full-blown wireless platform utilizing AT&T’s 3G network. What this means for me is that I can not only carry around my usual inventory of newspapers, magazines (and books, too) but also receive and read documents and access other resources via the web. That capacity alone converts this particular e-reader from merely a fun and convenient device into a tool for which I have a legitimate business use. And that means it becomes something I can write off as a business expense. I suspect I am not the only one thinking along these lines, and that is why I believe this development instantly raises the stakes in the e-reader market.
Your move, Imelda, er, Amazon.
Wednesday, July 22, 2009
More Moon Musings
The other day writer Tom Junod stated that he “does not think humans will step foot again on the Moon, much less Mars, not just in what’s left of (my) lifetime, but ever. Sort of a bummer, put like that, but it just emphasizes what an unlikely miracle the original walk was.” I must say that I find myself pretty well aligned with his sentiments. Notwithstanding NASA’s stated intent to move forward with the Constellation project (which is supposed to return a four-astronaut crew to the Moon by 2020), my skeptic’s sense is that on January 1, 2021, Gene Cernan will remain the last person to have walked the lunar surface.
Constellation was announced in 2004, which means the project was given a 16-year timeline. Keep in mind that the U.S. space program took less than 12 years to go from launching its first satellite to Neil Armstrong’s giant leap for mankind. For the past 30 years, the program has focused on the Space Shuttle, a craft which reaches altitudes roughly 240 miles above the surface of the Earth; the Apollo astronauts had to travel nearly 1,000 times that distance to reach the Moon. Although we keep our astronauts in space far longer than we did during the Apollo days, we sure don’t send them very far. And I have to wonder whether the reason there hasn’t been more of a push to send people back to the Moon, or a more aggressive emphasis on sending people to Mars, over the last 30 years is that the risk vastly outweighs the utility. Did the Bush administration feel that a new manned moon mission would once again electrify the country as Kennedy’s push for lunar exploration did in the 1960’s? Different times; much different mindsets.
In this era of rapidly unfolding technology – especially in robotics, artificial intelligence and nanotechnology – it seems to me, admittedly a very unscientific person, that unmanned missions to the Moon and Mars (assuming these missions have cost-sensible value in the first place) are the least risky and thus the most likely way to move forward with space exploration. I cannot believe that something drastic changed between 1974 (when NASA officially ditched plans for the final four proposed manned moon landings) and 2004 that would compel us to get back to the Moon. Furthermore, given that we have exponentially better technological knowledge and resources at our disposal than we did in 1958, I am struggling to understand why it would take us longer – 16 years versus less than 12 – to get a human back to the Moon than it did over 50 years ago, when we were starting virtually from scratch. Perhaps I am missing something here.
Mars? Not a chance. The Mars Rover program has been very successful; however, it’s important to keep in mind that those buggies landed on the heels of four colossal failures: the Mars Observer, which fell out of radio contact in 1993 and was presumed lost on approach to the planet; the Mars Climate Orbiter, which burned up in the Martian atmosphere in 1999; and the Mars Polar Lander and Deep Space 2 probes, which both crashed into the planet later that same year. The overall landing success rate for unmanned Mars missions is so poor that there is simply no way to justify sending human beings there anytime in the foreseeable future. Not in our lifetime and likely not in any forthcoming lifetimes. If it is going to take 16 years just to repeat what we were able to do 40 years ago with less than a cell phone’s worth of computing power, then I have my doubts that we’ll be able to swing a trip of anywhere from 150 to 1,000 times the distance of the Apollo missions anytime soon.
If there is value to be had from additional studies of materials on the surfaces of our two closest, non-gas-choked heavenly neighbors, then our focus should be on getting as many research tools up there as possible and getting to work. But rather than figuring out how to get those tools up there while simultaneously sustaining human life and ferrying it back to Earth, we should concentrate on durable and reliable unmanned methods of putting those tools to use.
In the meantime, it’s nice to see we have made significant advances in the area of space-toilet repair. Baby steps, America.
Tuesday, July 21, 2009
To the Moon, Walter!
Pardon the pun, but I marvel at the cosmic happenstance of Walter Cronkite dying a mere three days before the 40th anniversary of man’s first steps on the moon. Between coverage of these two events, I’d venture to guess that there hasn’t been that much black and white footage floating around the airwaves at any one time (outside of TCM, that is) since the Nixon Administration. The broadcast remembrances of Uncle Walter and the celebrations of the Apollo 11 mission which have aired over the last several days brought back quite a few memories from my youth (granted, I was only 1 during the first Apollo moon landing, but I certainly remember a couple of the later moon shots). They also caused me to reflect on the fact that mine is the last generation that will remember Walter Cronkite as America’s anchorman, and will also likely be the last generation that will remember when a human being set foot on the moon. I will address that last point in a subsequent blog entry. Today I would like to talk about the fact that I am apparently getting old, if not downright crotchety, and the reason I say this has much to do with the death of Walter Cronkite and with how today’s young adults may perceive those glorious days of July 1969.
This past Monday I was listening to a couple of radio personalities in their late 20’s or early 30’s talking about the death of Walter Cronkite (whose name they continued to mispronounce as “Conkrite”). It was fairly clear that they didn’t have a full understanding of who he was and what he meant to American journalism at a very pivotal point in its history. They were trying to pay some sort of respect to a man whose name they had doubtless heard from their parents and elders but for whom they had no recognition beyond that.
When I was a kid, Walter Cronkite was a familiar face at our nightly dinner table. I suspect he was the dinner guest of many American households in the 60’s and 70’s. Back in those days you had a choice between CBS, NBC and ABC for your nightly news. If you wanted to go “unconventional” in your television news viewing, you waited a half hour and watched MacNeil/Lehrer on PBS (and that option didn’t even present itself until the mid 70’s). For the better part of his 19-year run at the desk of the CBS Evening News, Cronkite owned the airwaves. This dominance was all the more remarkable considering the luminaries with whom he was competing for viewership in a small and undiluted market. He managed to beat out Chet Huntley, David Brinkley and John Chancellor at NBC. Even more remarkable was the parade of nine different anchors and co-anchors that ABC tried to put up against the man, a group which included Howard K. Smith, Harry Reasoner, Frank Reynolds and Peter Jennings (twice).
I really don’t know how to provide a modern-day analogue for Walter Cronkite. That makes it very difficult to explain to someone under the age of, say, 30 exactly how Walter Cronkite influenced us – how someone could actually earn the title “Most Trusted Man in America” in an era when the country was in the clutches of phenomenal distrust and cynicism. We have reached a point where information is transmitted in real time via the Internet, and in many cases the term “news” is almost an oxymoron. I am reluctant to call the transmission of unfiltered, unverified information “news.” In my lexicon, “news” is a consumable commodity like food. And like food, it should be carefully inspected and rid of impurities before serving. And the server should be able to take pride in what he or she is presenting, knowing full well that what is brought to the table has been prepared properly and is of the highest possible quality. It saddens me to think back to just a few years ago when Dan Rather, Cronkite’s successor at the CBS Evening News, was disciplined for failing to live up to this relatively basic standard when delivering an investigative report on former president George W. Bush’s service in the military reserves. Talk about a squandered inheritance!
So, how to describe Walter Cronkite to someone who simply cannot envision a world without the Internet and with only three choices in television news? I don’t know that it can be done. After all, we are living in challenging if not outright frightening times this very day. We have our own Vietnam with which to contend. We have economic strife just as bad, if not much worse, than that which this country endured during Cronkite’s watch. There is concern we are on the eve of what could be a devastating influenza pandemic. There is constant Sturm und Drang in Washington – perhaps not on the scale of Watergate, but certainly myriad reasons to distrust what our nation’s leaders are up to at any given time. The problem of distrust is certainly raging in a handful of state capitals throughout the country. To whom are we turning en masse every night to walk us through it? Jon Stewart? Perez Hilton? The answer is that there is nobody to whom we can possibly compare Walter Cronkite. Therefore, Walter Cronkite should remain the standard against which all broadcast journalists are measured, and that means it is essential that his work and his ethics continue to be studied and emulated.
And now, to the moon, or at least the topic of Apollo 11. On the same day that one radio show was discussing the passing of Walter Cronkite, another radio show’s on-air staff was discussing the planting of the American flag on the moon and their impression that it was an act of arrogance. I found this to be an astounding observation, one which lacked historical perspective. What was apparently being overlooked (or perhaps not even considered) was the fact that the USA of 1969 was a much different nation than the USA of 2009. The planting of an American flag on the moon’s surface represented far more than some sort of galactic imperialist mindset or a claim of global superiority.
What the young hosts of the radio program did not take into account was the fact that on July 20, 1969, the United States was a nation divided by many generational, racial and political factions. We were still licking our wounds from a scarring 1968. The moon landing gave most of the country an opportunity to pause for a moment and feel good about itself. While there were certainly critics of the program at that time, it was pretty clear that most of the nation took extreme pride in its country, if only for that day. Arrogant? I don’t view it that way. Nobody ever (seriously or officially) attempted to “claim” the moon in the name of the United States. It was simply a way to mark that we were there. As a matter of fact, the plaque that the Apollo 11 astronauts left on the moon states, "Here men from the planet Earth first set foot upon the Moon July 1969, A.D. We came in peace for all mankind." Given that the USA was embroiled in a space race with the Soviet Union at the time of the moon landing, the words chosen to commemorate our nation winning that enormous leg of the race were not the least bit boastful, when clearly the opportunity existed to exploit the moment.
We should not be quick to confuse national pride with arrogance. There is no shame in taking pride in the great accomplishments of our country and our fellow citizens. As with all facets of our lives, we need to find an appropriate balance and temper our behaviors accordingly. Likewise, we should not jump to the knee-jerk conclusion that every flag-waving opportunity is an effort to cram America down the throats of others or ourselves. If and when the time ever comes to close the book on the United States, there is a significant chance that the moon landing will stand out as among its greatest achievements, if not the greatest. We should not tarnish that memory by making it out to be anything other than what it truly was – a great day to be an American. And that’s perfectly okay.
Monday, July 13, 2009
Food for Thought
On a day when much of the focus in Washington was on Supreme Court nominee Judge Sonia Sotomayor, President Obama decided to roll out yet another of his administrative appointees. And when I say “roll out” I mean no disrespect to the visibly portly Dr. Regina Benjamin, Surgeon General designee. After all, I am roughly 50 pounds over what I am told is an acceptable weight for my height. But then again, I have not been nominated by the President of the United States of America to be the public face of American health.
It doesn’t take much research to come to the conclusion that ours is an obese nation, and this appointment shows absolute fidelity to our President’s promise to craft an administration that “looks like America.” Unfortunately, we’ve been told over and over again that America could stand to lose a few pounds and that obesity is the gateway to a vast array of debilitating if not downright lethal health conditions. Will an overweight Surgeon General be taken seriously when she addresses the issue of obesity, arguably our most urgent – and curable – health crisis?
Moreover, why aren’t diet and nutrition at the forefront of the government’s discussion of health care and general wellness? We have a President’s Council on Physical Fitness and Sports; however, we have no President’s Council on Diet and Nutrition. Not only are the terms “diet” and “nutrition” absent from the Council’s name, it would appear they are also absent from its agenda. A recent visit to the Council’s website (http://www.fitness.gov) revealed little in the way of healthy eating tips other than nutritional issues directly related to exercise. Diet and nutrition are, for some unexplained reason, the province of the Department of Agriculture – a Cabinet-level body, the secretary of which actually holds a superior position to that of the Surgeon General. However, my research shows that it has been nearly three years since the Secretary of Agriculture addressed the issue of diet and nutrition in a public broadcast forum.
Wouldn’t it make sense to combine the oversight of matters concerning diet and exercise within the same governmental entity? Wouldn’t the USDA be better served by allowing it to concentrate on food production and safety issues, especially in this era of monthly meat recalls and E. coli and salmonella scares? At this moment, the Obama administration is in the process of cleaving a number of consumer protection functions from the Federal Trade Commission and creating a new, more specialized agency. To me, it seems just as logical to do the same thing with those departments and sub-departments that address dietary and physical fitness matters. Don’t confuse this notion with advocacy for big government telling us how to eat and exercise; these operations are already in existence, and there is no reason to believe they are in any jeopardy of being eliminated. There may also be some opportunities for expense savings and improved efficiencies through reorganization. Perhaps a more coordinated, comprehensive effort to promote good exercise and eating habits, subject to singular oversight, would help make a dent – any dent – in a number of the symptoms of our national health care crisis.
In the meantime, the proposed face of American health has a double chin. I cannot imagine the scorn that would be heaped upon the Senator who called her weight into question during the confirmation process. I do believe, however, that it would be appropriate to ask her how she intends to address diet and exercise as the avenue to improved American wellness. I would love for her response to include a strategy in which she leads by example. Perhaps that’s asking too much of my governmental leaders, but I’m still erring on the side of optimism.
Pass the Doritos.
Friday, July 3, 2009
Et Tu, Nokia?
I’m funny when it comes to the seasons. As I write this we are two days into the month of July, not even a fortnight past the summer solstice, and yet I am already thinking about autumn. Yes, technically the days are now getting shorter; however, the thermometer here in St. Louis continues to register temperatures more akin to the threshold for the neutralization of salmonella than those that would inspire me to call up “Autumn Serenade” on the iPod (from the album “John Coltrane and Johnny Hartman,” I should add). Perhaps it’s because the Canadian Football League season kicked off this week. Perhaps it’s because I received a fire pit from my wife and kids for Father’s Day and can’t wait until there is a justifiably cool evening to use it. Perhaps it’s because I’m just not a huge fan of summer. Whatever the reason, I am already thinking about autumn. And because I am thinking about autumn, it comes as no surprise whatsoever that I am also thinking that cell phones are a contributing factor to the present state of the U.S. economy.
Makes sense, yes?
In the event I may have lost one or two of you, please allow me to explain. The advent of autumn marks that time of year when we all start receiving the chain email about this year’s incoming college freshmen and all of the things they take for granted that the rest of us are probably just now getting around to figuring out how to use without instructions from a Complete Idiot’s Guide or a ten-year-old. Well, I really don’t want to talk about them today. I want to talk about this year’s incoming college seniors, those individuals who, for the most part, are going to spend one final, glorious year enriching their minds and perhaps damaging their livers prior to walking, Wile E. Coyote style, straight off a plateau into the abyss that will be the 2010 employment market. All of the talk of “green shoots” and sustained stock market rallies which consumed much of the last six weeks seems to be receding once more in favor of reports of higher-than-anticipated jobless claims. We have yet to see the true fallout and residue from the GM and Chrysler bankruptcies. A number of banks once thought to be on the mend are once again starting to send out distress signals. We are not out of the woods and may actually still be working our way into the thick of it. In any event, even an imminent reversal of the present situation won’t gain traction in time to save the Class of 2010. Those poor neophytes are going to find themselves competing for entry-level positions with the more experienced last-in/first-out layoff victims from other companies and with the talented but undercapitalized folks who possess vastly superior skills but no access to startup cash. Anyone out there who wants to lament how desperately they wish they were still 21 should catch an episode of CNBC’s “On the Money” (sadly demoted from a weeknight staple to a Saturday night dead-zone filler).
But let’s not focus so much on the future of the collegiate Class of 2010. Let’s take a look at where they’ve been. Assuming that the average college senior was born in 1987 or 1988, we should take a peek at the world in which they came to maturity. I am unscientifically pegging 1994 as the year in which cell phones became a mainstream staple. I say that for a couple of reasons: first, that’s the year I got my first cell phone, and second, that is when the (relatively) compact handhelds really started to outnumber their antennaed brick predecessors. Don’t look for footnotes; I’m winging it. The advent of the truly portable, relatively affordable and unobtrusive cell phone changed our social dynamic in radical fashion. I submit that Cell Phones 2.0 (we shall consider the black-bag and car-console-mounted era Cell Phones 1.0) represented an untethering of American society to a degree not achieved since Henry Ford and his ilk made automobiles affordable and plentiful. In 1994 most of these present-day college seniors would have been in first grade. As these youngsters moved through the ranks of elementary school, cellular phone technology advanced at a far faster rate. By the time the kiddos had reached middle school, cell phone and electronic mail technologies had merged like some kind of high-tech Reese’s Peanut Butter Cup, giving consumers the ability to communicate in oral or written form anywhere they could get a signal. By the time our soon-to-be job-hunting friends were getting their driver’s licenses, it was not uncommon for cell phones to provide a gateway to the Internet. From this point forward, college graduates will be from generations who have never had a need to stay put for any reason.
Before I’m cast as some sort of cave-dwelling Howard Cunningham, entirely content to maintain a home-work-home routine (with an occasional Leopard Lodge meeting thrown in for good measure), let me be clear: I have no issue whatsoever with the liberation that cell phones have provided. They are an incredible convenience and have undoubtedly saved many lives in emergencies. And I will be the first to admit that it’s nice to be able to get out of the office without forgoing access to important calls, emails and documents. As a matter of fact, it is for that reason that I believe we are more productive workers.
So where’s the downside? Notwithstanding all of the blessings that cell phones have bestowed upon us, I have to wonder whether the convenience of a super-mobile society is creating more expense. If we no longer have a need to be at home in our free time, we are clearly spending more money. We are buying more gas; we are eating out with more frequency (that $12 grilled chicken Caesar from Il Bagno would cost you about $2 at home); and we are participating in a number of activities for which there are associated expenses. There is nothing inherently wrong with getting out of the house and being an active member of society; however, I often wonder whether we are becoming the victims of our own freedom, getting out and about a little too much – because we can – and spending more money in the process. While this increased consumerism has helped the economy in certain respects, it seems fairly clear that we were spending where we should have been saving and were overusing the plastic.
We are in our second summer of recession, and much of the focus on the American consumer has been on the drastic change in spending behaviors. The personal saving rate, long in the negative range, stood at a 15-year high of 6.9% as of May 2009. The term “staycation” is now catalogued by Webster’s and is once again being tossed about the daily news the way a beach ball is apparently being tossed around the backyard rather than the oceanfront this year. We are eating out less, going out less, finding more to do around the house (and more to do to the house, as sales figures at places like Lowe’s would suggest). Perhaps it’s an oversimplification, but I think the fact that people consider staying home the avenue to an improved domestic balance sheet is validation that we have been abusing our freedom of enhanced mobility. Just because we can go out doesn’t mean we have to go out. Now I'm sounding like every 16-year-old's father.
And the punch line to this post is that I started writing it on my laptop at a Starbucks in between emails and a couple of calls. I never said I wasn’t part of the problem . . .