Posted on 5 August 2014 | No responses
Boston! Another nice statue, in the Public Gardens.
via Tumblr http://ift.tt/UTHaMC
Posted on 5 August 2014 | No responses
Boston! Nice statues.
via Tumblr http://ift.tt/UTHc7a
Posted on 5 August 2014 | No responses
via Tumblr http://ift.tt/UTH9s1
Posted on 20 July 2014 | No responses
via Tumblr http://ift.tt/1rAuKEP
Posted on 8 July 2014 | No responses
I spent probably three hours on the front porch tonight. The first half was spent enjoying a premium cigar and a dram of Scotch, whilst reading the sundry news of the day. The second half was spent on the phone, catching up with my friend Duane.
I was struck by two things.
The minor thing was a door-to-door visit by the incumbent candidate for my district for county commission. She stopped by, we chatted briefly, she moved on. Left a favorable impression — she’s a somewhat middle-of-the-road Democrat. I don’t see much door-to-door campaigning in my area, so her personal touch was appreciated. This area is pretty much a solid Dem lock, so if I have to pick among three Dems for the job, I’ll end up picking the one who actually asked for my vote.
The major thing was the runners.
My neighborhood is infested with mid-to-late 20s grad students and professionals early in their careers. They sometimes party, but never obnoxiously. They run. A lot. I see them all the time from my office, which overlooks the road.
But I noticed that these young, fit things go on “runs” that … well, they’re short. They festoon themselves in tech apparel, hook up their iPhones to their armbands, do their stretching exercises on the sidewalk and look for all the world like they’re about to embark upon a half marathon — and then they’re home in about 15 minutes or less.
In my day, when you went for a run, you ran. When I lived in Kentwood, I’d lace up my shoes, thrice weekly, at 10 or 11 p.m. and wouldn’t get home until after 8 miles ticked off the odometer (54th/Division south to 60th, east to Kalamazoo, north to 44th, then back).
Kids these days. They’re not as tough as they used to be.
Posted on 6 July 2014 | No responses
Fourteen years ago this Thursday, I began employment with Spectrum Health, and this week I celebrate an altogether different kind of opportunity within the company as the result of a promotion. This new direction in my career prompts some reflection on how younger workers get from Point A to Point B.
But first, gather ’round kiddies, because grandpa has a story ….
I applied to Spectrum Health in the spring of 2000 on a bit of a whim, as yet another company to which I could shotgun my resume. Those early working years were a bit chaotic. I started in 1994, at the tender age of 16, working for Meijer Inc. as a grocery bagger, eventually moving to roles as a cashier and as a service-desk associate. I worked for the company for five years at two different stores; in the middle, I also spent two years working for the now-defunct Michigan National Bank.
In mid-1997 I left both Meijer and MNB and began a series of gigs with various temp agencies. Some of them were literal day jobs while others (like a year-long stint doing quality assurance for one of Tower Automotive’s metal-stamping plants) had a bit more substance. By early 1999 Frey Foundation hired me out of a temp assignment, but internal restructuring led to my departure in the spring of 2000. Not long thereafter, I began work on the golf course, but that kind of job really isn’t a steady year-round opportunity — not during Michigan winters, anyway.
I was still pursuing my bachelor’s degree so I needed something flexible. At Spectrum Health, the hospital’s Resource Center functions like an internal temp pool, so I was hired in July 2000 to do part-time, on-call secretarial work. I could note my availability and then be scheduled for work within my preferred timeslots. At first, I got short-term assignments: A few weeks doing medical-records filing for Peds General, a few weeks supporting process-improvement initiatives for Periop, etc. I eventually landed a pair of concurrent longer-term assignments, one doing weekend intake for Care Management and the other doing donor-records processing for the hospital’s foundation office.
That Care Management assignment led to a transfer from the part-time/no-benefits job in Resource to a full-time, full-benefits job supporting the department director. Over time, my role with Tracey evolved from secretary to data analyst. By 2006, I was a measurement and evaluation specialist in her area, coordinating various data-analysis efforts related mostly to hospital-based case management and care transitions. I earned the Certified Professional in Healthcare Quality credential that year and joined both the National Association for Healthcare Quality and the American Statistical Association.
Health care, as an industry, isn’t stagnant. Between 2006 and 2012, our division underwent substantial change — with our entire mission and org chart sometimes being rewritten semiannually. But by 2011 I was appointed the team leader of the Revenue Cycle Informatics group, a passel of nine analysts servicing the registration, scheduling and coding areas of the facility.
In 2012 our CFO left and the new guy had different ideas. My team was disbanded, Tracey moved from the hospitals to the medical group, and one colleague and I were involuntarily transferred into the I.T. department to help staff a new business-reporting group. Fiscal 2013 was the year I had six separate formal supervisors in the payroll system. I stuck it out for nearly a year, but by June 2013 I had applied for, and was granted, a position as a medical informatics consultant in the quality-improvement team at Priority Health. PH is Spectrum Health’s insurance arm — same corporate CEO but otherwise a different world altogether.
Last week, my boss promoted me, so when I head into the office tomorrow, it’ll be as the new manager of quality improvement analytics for Priority Health. Six members of the team will report to me and I got a nice little raise out of it.
During my time with Spectrum Health, I’ve enjoyed other work, too — I was a newspaper editor at the Western Herald in the mid-2000s and I’ve run a part-time communications consultancy since 2008. This year, I’ve joined the board of Caffeinated Press, Inc., a local micropublisher of books and (eventually) literary magazines.
Yet a 14-year journey from part-time secretary to department manager in the same large organization isn’t a small thing. As I look at where I am right now, I can share several valuable lessons for early-career professionals plotting their own long-term trajectories.
- Be smart about what kinds of work you do in your late teens or early 20s. Why bag groceries or flip burgers when you could get an entry-level or summer job doing something closer to your intended career path? If you are interested in veterinary medicine, work as a “gopher” at the local zoo. If you want to be a dental hygienist, work as a dentist’s receptionist or file clerk. Even if you can’t find an ideal entry-level position, getting something close enough can help differentiate applicants for their first real full-time jobs. As a hiring manager, given two academically similar newly minted statisticians, I’ll hire the one who worked as a data-entry tech for a marketing agency before I’d hire the fry guy from Burger King. Working in many different settings for a temp agency also makes sense — it’ll increase the list of industries and settings you can say that you’ve encountered; this diversity of experience makes for a more well-rounded applicant. It’s never too early to think about the place and nature of your earliest job experiences.
- Expand your resume with your extracurriculars. Things around the periphery of a person’s working life matter. Volunteer. Do exciting things that earn awards. Write stuff that gets published. Earn industry certifications. Invest in hobbies that lead to credentials or outcomes you can note on your resume. I’ve had several discussions with recruiters specifically about my amateur radio license and scuba certifications because they set me apart from others even if they weren’t directly related to the job at hand. Such items aren’t obvious and are often overlooked, but they paint a picture of a person who achieves goals outside of the office — a subtle but important signal for employers. Diversify the skills and experiences you can share with employers to fill in the white spaces of an early-career resume.
- You only need to be a half-step smarter than the next smartest person in the room … Always learn more than you need to know about the kinds of work that you do. Experts say that continuous learning matters, and it does — but it matters insofar as you can justifiably claim the expertise to be invaluable to others and to remain current on industry trends. But there’s a point where you can be too smart: Earning a Ph.D. for a job that can be done by a B.A. may make it difficult to get your foot in the door. Over-specialization can lead to stereotyping that ultimately leads to a loss of opportunity. Learn enough to be broadly useful but don’t get so specialized that you become a permanent niche player.
- … but being a know-it-all is a surefire career killer. One of the chief lessons I learned from Tracey was that even though I often already knew the answer to a problem within five minutes of starting a meeting, I’d earn more goodwill with colleagues by subtly guiding the conversation and letting someone else claim the “aha! moment” at the end of the discussion. Such a strategy proved more prudent than asserting a solution up front and leaving others to feel embarrassed about not getting there as quickly as I did. Few acts inspire such deep but silent resentment as being made to feel stupid by an overconfident whippersnapper. The smart folks usually nurture understanding within a group, instead of wielding their erudition like a poison-tipped stiletto.
- Keep your commitments. If you say you will do X activity on Y date, do it. Even if blowing the deadline doesn’t matter and even if you have to work late to get stuff done, do it. Younger workers, in broad relative terms, lack the urgency and punctuality of folks currently in senior management ranks. Earning a reputation as being someone who only sporadically meets agreed-upon targets will kill a career almost as fast as a sexual-harassment allegation will. Never fail to meet expectations.
- Fit in with your targeted peer group. Dress and speak the part of your peer cohort. If your intended peer cohort is several ranks higher than you, aim for that level. The guy who wears sloppy jeans and ill-fitting Hawaiian shirts on Casual Friday — even if everyone else at his level does the same — puts himself at a cultural disadvantage if he aspires to break into the group who dress as if Casual Friday is a misguided sop to the underlings. Speak carefully and respectfully, without gossip and without betraying confidences. Think five times before saying something catty. Structure criticism in terms of opportunity instead of defect. Praise others in public and in private. Comport yourself like an executive.
- Never say no. Responses like “that’s not my job” or “sorry, I can’t help you” are never acceptable. Instead, outline the things you actually can do to help. Whether the task is as trivial as shepherding a customer’s phone call, or as complex as negotiating deliverables when someone cashes in an IOU chip, the right answer is to help frame expectations about what you can and can’t do, and on what timeframe. Skilled practitioners can give a “yes” that’s effectively a “no” through a respectful process of engaged and honest level-setting of expectations.
- Brush up on your psych and communication skills. Master the awesome power of behavioral economics and human psychology, and leverage that understanding (Maslow’s Hierarchy, the Keirsey Temperament Sorter, etc.) to speak to people in a way that resonates with their specific needs and inclinations. Effective interpersonal communication is the chief way people fertilize relationships that ultimately nourish careers. For example, you wouldn’t explain the failure of an automated report in the same way to a tech-savvy perfectionist boss as you would to a more forgiving boss who has trouble with the Intertubes. Study basic readings in human psychology and apply that newfound knowledge to your interactions within the workplace.
- Network. I’ve sat on too many hiring committees to believe that the best resume always wins. Opportunities are given to people, by people, so expanding your circle of fellow professionals you “know, like and trust” is vital to getting ahead. For example, my volunteer work within the National Association for Healthcare Quality and the Michigan Association for Healthcare Quality gives me access to a large number of peer professionals I can speak to about best practices, new ideas or approaches to vexing problems — and since we’re all working for different companies, we aren’t really in competition for anything nor do we have to manage hierarchical relationships. I’m acquainted with a heck of a lot of vice presidents and directors of quality at health systems across the country; if I am in the market for a higher-level role in five years in a different state, you bet your bottom dollar that I have “binders filled with people” who would recognize my name when my cover letter hits their inbox. And, significantly, vice-versa. Get involved early and engage vigorously with various national, state and local industry affiliation groups and connect with people at conferences.
- Play chess when others play checkers. Being successful in the long run requires solid strategic-thinking skills. Managers, it’s often said, handle day-to-day operational tasks, whereas leaders think about what tasks will be necessary five years down the road. Especially in fast-paced, rapid-turnaround environments, thinking about long-term improvements proves more a luxury than a daily requirement. Yet it’s vital to encourage strategic planning. What good does it do a company, for example, to buy a new, pricey software app that only runs on Windows 7, when you know that in the next 18 months, the company will migrate to Windows 8? A respectful voice pushing against the fierce urgency of now, demanding that attention be paid to the ironclad demands of tomorrow, engenders more respect than the go-along kid who blindly follows others down avoidable dead-ends. Younger workers can employ what-if hypothesizing to introduce new ideas into a short-sighted project plan.
I look forward to this new chapter in my career that begins this week, yet I cannot help but wonder where I’d be today if my first dozen years as a working adult had been managed with greater care.
Posted on 15 June 2014 | No responses
Last weekend, whilst enjoying the sundry delights of the Pentwater Palace, my fearless friends and I trekked along one of the trails at Ludington State Park. I brought my camera (a Nikon D3100) and brought out for the first time my new Nikkor 55-300mm lens. A few highlights are shared, below.
Posted on 15 June 2014 | No responses
Much ado was made a few weeks ago about the European Union’s judicial determination that individual Europeans have, in broad strokes, a right to be forgotten on the Internet. Google protested, but now must honor requests to remove search results about a person from the E.U. at that person’s request.
Google, for its part, is suggesting passive-aggressive compliance — by following the directive strictly but publicizing that it removed the results and linking to the request to remove them. In other words, by shining an even brighter flashlight on the material intended for removal.
All of this comes back to two important questions:
- Who “owns” information about a person?
- To what extent can a private person control the release of information about himself?
The first question might appear before U.S. regulators sooner rather than later. The Federal Trade Commission launched an inquiry into data brokers and lawmakers are increasingly skeptical of the breezy privacy practices of these companies. The second question is murkier: Public records are public records, but to what extent does a private enterprise enjoy the right to profit off aggregating and publishing public records? Does the right to free speech mean the right to restrict dissemination of speech if the subject of that speech demands it?
When the E.U.’s decision hit the media wires, the response was predictable. Data brokers argue that it’s better to be served relevant ads than irrelevant ads, so consumers shouldn’t worry about what’s going on behind the curtain (never mind folks who don’t want to be served ads at all). Companies, in general, are increasingly reliant on large-scale data analysis to refine consumer targeting, so giving people the chance to opt out of that targeting directly affects their bottom line.
I believe that my information is my information, and that the only companies entitled to use my information are those I’ve elected to do business with. I’ve never conducted business with a data broker, so the data broker has no right to profit off the sale of information about me that it compiled through surveillance I didn’t authorize and wouldn’t consent to. As such, I support regulation that eliminates or tightly controls consumer data-sharing among companies, along with transparency requirements, strong limits on what kinds of information can be collected, and a consumer’s right to amendment or deletion.
The question of the right to be forgotten is more intriguing. Let’s say Bob writes a nasty blog post about me. Google indexes it and serves it up when someone searches for my name. What is to be done? Bob may be entitled to say nasty things, provided it doesn’t cross the line into defamation, but why should Google have a right to make that information easily discoverable? Don’t I have the right to have negative material affecting my reputation more difficult to discover? Google’s argument is a variation on the meme that “information wants to be free.” Bollocks. Google makes money on selling search results, so it doesn’t want to harm its core business, principle be damned. Bob can write what he wants to write, but Google has no First Amendment right to make that information discoverable, such that it trumps my right to avoid inappropriate public disapprobation.
It may be true that there’s no such thing as privacy in the digital age, but there’s something to be said about the effective privacy that comes from information obscurity. Bob publishing mean things about me is what it is, but I have a vested interest in not making Bob’s vitriol the first thing that pops up on search results about me. Making some things more difficult to casually uncover is probably a reasonable middle ground between victim’s rights and free-speech rights. Certainly, Google’s perspective that it’s entitled to link everything/everywhere is much more philosophically controversial than its defenders care to admit.
In any case: There’s a trend afoot to turn consumer data into a commodity. Fine. Then let’s regulate the data brokers and companies like Facebook and Google as if they’re utilities.
Posted on 26 May 2014 | No responses
A liberal looks at the country and, in his eagerness to immanentize the eschaton, rejects well-functioning tradition for want of some high-theoretic World State. A conservative looks at the country and, in his eagerness to restore long-abandoned traditions, rejects much scientific and cultural progress for want of Duck Dynasty. Yet a healthy body politic needs both visions; liberals and conservatives are merely opposite lobes of Uncle Sam’s lungs, diseased though each may be in its own special way. Lose one to cancer, you lose a lot.
Lose both, though, and you lose everything. The Zombie Apocalypse test is apropos: What really matters after catastrophe strikes? Think of an event like Hurricane Katrina, when public order in southern Louisiana was shaky for several weeks and ordinary survival became a genuine ordeal. In such a climate, does anyone really care about “trigger warnings” or carbon footprints or into which cathole the transgendered person gets to pee? Almost all of the current causes célèbres of the Left are what kids these days call #FirstWorldProblems. The issues that progressives adore are so irrelevant to life on the lower rungs of Maslow’s Hierarchy that it’s a wonder so many people invest so much time into advocating for so little substance.
Yet in that Katrina situation, the Right isn’t appreciably better. The preppers hide in their bunkers while the guys with guns take stuff from the guys with yoga mats. If public order is a long way off, you’re much more likely to end up with a descent into strongman-led tribalism, with a pecking order directly related to what you can contribute to the group in terms of rare skills or biceps size.
And therein lies the rub. Neither conservatives nor liberals currently articulate a comprehensive worldview that successfully encapsulates the value of ancient knowledge and antique skills, with a respect for the sundry joys of High Culture and a sophistication for harmonizing new insights with old wisdom. Today, we can afford to obsess about Facebook offering dozens of gender options. Tomorrow, when the Zombie Apocalypse comes, those same people who eagerly set their Facebook genders to “Cis Woman” or “Transmasculine” are unlikely to survive a week without dying of dehydration, injury or human-caused trauma. Today, we can afford to let conservatives be the voice of anti-elite sentiment. Tomorrow, when the Zombie Apocalypse comes, those same people who disdain higher education will be the first to chuck the last copy of War and Peace on the fire when the menfolk return with a fresh kill of some endangered species.
We might get lucky; we might get a world that looks like Falling Skies, with a healthy balance between warrior and academic leading the group. But we might end up with Lord of the Flies, instead. It scares me that I can’t tell which scenario is more probable.
We could, perhaps, console ourselves with the belief that the Zombie Apocalypse — a term of art, of course, for any great civilizational catastrophe — won’t occur. But such consolation is empty given the sprawling narrative of human history. The May edition of the estimable First Things included, as a feature article, “The Great War Revisited” by George Weigel. It is a masterclass narrative in a magazine that, itself, sets the high bar of literary merit.
Weigel recounts the willful blindness of world leaders in 1914. No one could quite believe that the stability of the Westphalian system could collapse so quickly and so completely, so they acted as if it couldn’t.
Consider. On January 1, 1910, Tsar Nicholas II ruled an ancient, vast, autocratic Russian empire. Kaiser Wilhelm ruled a powerful, prosperous Germany freshly ambitious after Bismarck’s consolidations a generation before. Emperor Franz Joseph ruled the elegant if creaky Austria-Hungary — since 1848, no less. The Ottomans were in control, albeit tenuously, in Istanbul and had been for more than half a millennium. The Qing Dynasty ruled a decrepit China through a monarchy with roots two millennia old. America was quiet and uninterested in foreign affairs, with William Howard Taft presiding over a prosperous, growing but inward-looking country.
On January 1, 1925 — a mere 15 years later — the Romanovs were decomposing in a shallow grave while the Soviet Union crushed internal dissidents on Stalin’s orders. Germany was a shambles, the harsh Peace of Versailles spreading misery among Germans of every stripe and depriving governments before Hitler of any real, legitimate power … thus sowing the seeds of the next major war. Austria and Hungary were cleaved apart, and the last Ottoman Sultan, Mehmed VI, had been deposed while Ataturk began his secularizing work (potentially sparking the tinder of later Islamofascism, to boot). The KMT was consolidating control in a democratic China while Japanese forces, still stung by the aftermath of the Russo-Japanese War of 1905, had correctly gauged the exhaustion of the West and plotted accordingly. The United States, after Woodrow Wilson’s collectivist war policies and internationalist exhortations, was enjoying the Roaring Twenties under Calvin Coolidge. And families across the world were still coping with the devastation wrought by the Spanish Flu pandemic of 1918.
All the things that looked so permanent in 1910 had been laid waste over five years of war and a decade of ill-managed peace. An entire generation had bled to death for naught on the fields of Europe, and others — India, Japan, China — took notice. The suicide of the West took some time, but each slice of the wrist was unmistakable —
- The sinking of the Titanic (1912) — we began to doubt scientific progress
- The Guns of August (1914) — we went to war because we couldn’t find a reason not to
- The battles of the Somme, Verdun and Passchendaele (1916-1917) — we killed millions knowing it was futile
- European acquiescence to Hitler’s invasion of Czechoslovakia (1938) — we looked away from evil
- The Yalta Conference (1945) — we let Stalin get his spoils without a fight, condemning millions
- The Counterculture (ca. 1968) — we stopped being serious about shared culture
- The War on Terror (ca. 2001) — we over-reacted to a minor threat, then under-reacted to major threats
Imagine being a normal person born on January 1, 1890. You saw the entire world change before you greeted your first grandchild. You were born into a world without widespread automobiles, powered flight or amenities like indoor plumbing or electricity; as a child, you likely heard stories from your parents of the Civil War, the taming of the American Frontier and the era of tall ships. You lived through the Great War and World War II and the Cold War. If you lived to the ripe old age of 80, you died after seeing a man walk on the surface of the moon.
Think about that.
History is replete with moments in time where everything changed within a generation and old truths and new ideas fought bitterly for supremacy. The Great War was such an inflection point. So was the political upheaval of 1848. So were the Napoleonic Wars a generation earlier and the French Revolution that lit their fuse. So was the Reformation, starting with the 95 Theses posted in 1517 and persisting through centuries of wars of religion in Europe. So was the discovery of the New World in 1492. So were the Crusades. So were the crowning of Charlemagne, the Mongol invasions, the collapse of Rome and Constantine’s conversion to Christianity.
So why do we persist in thinking that such an earth-shattering event can never again occur? Why must we be so un-serious about the future that we can relish small-potatoes political idiocy as the world smolders while waiting for the tinder for the next world-historical dislocation?
Today’s domestic politics isn’t up to the task. Neither the Right nor the Left can articulate a coherent vision for what the world ought to look like next week, let alone a century hence.
Some of today’s more enlightened pundits — I’m thinking especially of George F. Will and Peggy Noonan — correctly note that the race for 2016 is hamstrung by both the Republicans and the Democrats lacking a consistent and comprehensive message about what they want for America. Debates currently focus on irrelevant personalities (Bill Clinton, the Koch Brothers) or on issues that aren’t really significant in the grand scale of things (marijuana legalization, the minimum wage). We’re back to small-ball politics.
But while politics is about legislative agendas, ideology is about the big picture. And on that front, all the main ideological voices in America lack a conceptual coherence that applies with equal validity and rigor to life on a college campus as well as life in a post-apocalyptic village. Ideology requires a conception of the human condition that applies regardless of any individual human’s specific condition. It requires a nuanced teleology. Ideology shapes politics, so with ideologies in disarray, it’s no surprise that our politics follows suit.
Progressive ideology spends so much time on harmonizing complex identity relationships that the framework it’s built upon cannot endure in adverse material conditions — what works in faculty lounges at Berkeley won’t work in a rural farming community in Nebraska, and certainly won’t work in a long-term survival situation. It fails the test of universal relevance. Conservative ideology lacks coherence on the big questions of life and human relationships; half of engaged conservatives appear quite willing to live within Leave It to Beaver and eschew politics entirely while the other half can’t figure out if it’s for or against the NSA, for or against starting council meetings with an invocation to Jesus, for or against vaccines. The libertarians fail to concede that humans are social animals, and that eusociality imperfectly squares with contractarian principles, so they seem like the rump at a linguistics conference that really, really wants you to believe that Esperanto is a logically superior alternative if only people would abandon their native tongues and give it a chance.
(Sneaky thought: You know who actually nails the big picture effectively? Catholics and Jews, and non-radicalized Muslims.)
I want conservatives, in particular, to advance a coherent framework that tells me what kind of America we aspire to in the year 2114. Don’t recite policy — recite the principles that policy will be shaped by. That framework will give a compelling, universal why as well as a specific answer to the tough questions we prefer to elide:
- If human life is precious, will we abolish the death penalty when we abolish abortion?
- Which is better: A well-reared child attached to two same-sex parents, or a poorly reared child of two opposite-sex parents?
- Under what circumstances will we invade a sovereign state? To acquire resources? To avert genocide? Never?
- Can we require children to be vaccinated over parental objection, for diseases that could devastate large populations?
- Does human destiny reside in the United States, across the globe or among the stars?
- What should be in the public square, versus entirely private, versus private but subject to government monitoring?
- To what degree should individual risk be socialized?
- What is the purpose of a well-lived life?
- Is society stronger with a Judeo-Christian worldview, with a secular worldview or with a Greco-Roman ambivalence about religion?
- To what degree should a person be required to know how to change a tire, raise a garden or build a fire in the backcountry?
- What is the point at which we agree that the gulf between “have” and “have not” is too wide to tolerate?
- How do we balance libertarian autonomy with the stabilizing power of society’s little platoons, without rendering either useless?
- At what point does market inequality amount to de facto duress for the economically disadvantaged?
- What is the proper response to a person who is biologically female but professes to be male in gender?
- To what degree are people free to make choices that may not redound to their long-term advantage (smoking pot, eating too many cheeseburgers, avoiding dental exams, driving without a seatbelt, etc.)?
We can hope that the Zombie Apocalypse never comes, despite history’s ample lessons. But while we maintain this foolish hope, will we think prudently about what kind of life ought to persist between our cyclical catastrophes, or will we duck our heads in the sand and continue to pretend that today’s hot-button social issues really do have meaning?
Posted on 17 May 2014 | No responses
A Brief Respite …
via Tumblr http://ift.tt/1o10IZh