What the NY Times missed about “Rizzoli & Isles”

Earlier this week, New York Times TV critic Mike Hale reviewed the opening of the seventh and final season of the TNT original cop show, “Rizzoli & Isles.”

Here’s a relevant snippet:

On television, as in life, comfort food comes in all sorts of flavors. There’s the tart apple pie of “NCIS,” the solid corned beef and cabbage of “Blue Bloods,” the wacky loco moco of “Hawaii Five-0.”

“Rizzoli & Isles,” which begins its seventh and final season on Monday night, is in the TNT section of the menu. Like those other shows, which are on CBS, it’s good reheated. But it’s lighter and easier to eat with one hand while doing other things. It’s the thin-crust pizza of prime time.

That makes it the type of show that doesn’t generally garner much attention when it announces that its run is ending. (The final 13-week season will bring the show to a more-than-respectable 105 episodes.) A lot of people will notice, though, when “Rizzoli & Isles,” a formulaic buddy-cop drama — with the twist that the buddies are women and one’s a medical examiner — goes away.

I’m more charmed by the series than Hale is, and I accept his point that “Rizzoli & Isles” is light fare, a far cry from “House of Cards” or even “Orphan Black.”  But he misidentifies the lookalike shows that bring what’s interesting and distressing about “Rizzoli & Isles” into focus.

The real consideration set should be the other Holmes & Watson pairings, where an eccentric genius teams up with a normal-but-tough partner.  By this definition, the peer group for “Rizzoli & Isles” is “Castle,” “The Mentalist,” “Elementary” and “Psych.”

When you look at this group, the thing that immediately pops out is how much more eccentric the male geniuses are allowed to be than Maura Isles (portrayed by Sasha Alexander).  Rick Castle, at least at the start of the series (ignore the stupid, audience-betraying finale from last month), was an impish, annoying horndog who happened to be preternaturally insightful.  Holmes in “Elementary” is a recovering drug addict who might be on the autism spectrum.  Patrick Jane in “The Mentalist” was morally ambiguous at best: a con man who was only working with the cops to get revenge for his slain family.  In “Psych,” Shawn Spencer is another con man, in this case a hyper-observant young man who pretends to have psychic powers as a gimmick for a detective agency.

So what, in comparison, is so eccentric about Dr. Maura Isles?

She’s usually over-dressed for her job as a medical examiner.

Oh, and she is a tiny bit socially awkward when not talking about science.

That’s it.

When you look at “Rizzoli & Isles” in this peer group, you quickly see how much narrower our conception of women’s eccentricity is than our conception of men’s.

Unless, of course, the woman is playing a villain.  Then you can get Alexis Carrington, Cruella de Vil, or any of the other snarling, scenery-chewing bad girls.  Think of the delight the camera takes in all the previews and stills from Margot Robbie’s forthcoming portrayal of Harley Quinn in “Suicide Squad.”

A crazy man can be a hero or a villain.

A crazy woman is a villain.

I’ll keep tuning in, but just once I’d like to see the showrunners of “Rizzoli & Isles” give Sasha Alexander the kind of acting material that Nathan Fillion, Simon Baker, Jonny Lee Miller and James Roday regularly received in their shows.

Where “Lucifer” on Fox goes off the rails

“Lucifer,” the new midseason replacement show on Fox, doesn’t trust its audience.

Episode #8 aired last night, and at this point the show is a basic police procedural with a celestial crisis (something bad will happen without Lucifer working as Hell’s CEO and main jailor) lurking vaguely in the background.  Tom Ellis is charming as Lucifer in a performance that is virtually a cover of Tom Hiddleston’s Loki from the Marvel movies.  Lauren German plays the supermodel cop who serves as his straight woman.

The debilitating, series-limiting problem with “Lucifer” isn’t that the show is offensive to Christians (it is, but that’s no surprise), nor is it that the show has the same narrative engine as “Castle,” “The Mentalist” and “Elementary” — what NPR’s Linda Holmes calls “The Adventures Of Mr. Superabilities And Detective Ladyskeptic.”  (I wonder if Lauren German, Stana Katic, Lucy Liu and Robin Tunney get together over tea to practice humorless looks on beautiful faces.)

There’s no way that any TV show could capture the sprawling story from Mike Carey’s “Lucifer” comic book (inspired by a Lucifer cameo in Neil Gaiman’s “The Sandman” comic book).

Instead, “Lucifer” fails because it tips its hand the moment the story opens: we know that Lucifer Morningstar really is the devil, that he left Hell because he got tired of playing a role in God’s plot, and that he moved to Los Angeles (the city of angels) to open a piano bar, where he meets a gorgeous female detective and they start an investigative partnership.

The missed opportunity with “Lucifer” was that the show could have been a tightly focused “is he the devil or is he nuts?” exercise, one that could have lasted five seasons.  (I predict that “Lucifer” will run out of story steam in Season Two.)

The show would have been more interesting and sustainable if, every time Lucifer demonstrated unusual abilities, there had been two equally valid interpretations, only one of which is that he isn’t human.  Lucifer’s conversations with his brother angel Amenadiel could have been shot on different stock or with a filter to suggest they might be hallucinations.

This twist would not have affected the police procedural plots, the crimes of the week that Lucifer and Detective Chloe Decker solve, but it would have made the sustaining relationships more interesting.

The entire show, in other words, could have been more like Lucifer’s relationship with psychiatrist Dr. Linda Martin, in which she does not believe that he is the devil but agrees to work within his metaphor in treatment.  If the audience had doubts about Lucifer’s devilish status, then Dr. Martin’s torrid sexual relationship with Lucifer would then have been more ethically dodgy — and more dramatically interesting — since she would have done something wrong rather than been mesmerized by a supernaturally sexy being.

If the audience didn’t know whether Lucifer was the devil in retirement or just a mad human, then that would have made all the personal stories that come out of the narrative engine compelling.  In other words, the series could have played the Hamlet card: is the prince merely putting on an antic disposition or has he truly gone nuts?

But it’s not fair to saddle “Lucifer” with Shakespeare-sized expectations.  Instead, the series should have taken a page from the movie “The Sixth Sense.”  (Spoiler alert, although for a movie from 1999 I should hardly have to warn you.)

At the end of that movie, the revelation that Dr. Malcolm Crowe (Bruce Willis) is himself a ghost, one of the dead people that the little boy can see and hear, casts everything the audience has just watched into a new, complex and compelling light. 

What’s important, though, is that even without that final twist “The Sixth Sense” is still a great story.  We don’t need the twist to have loved the movie, but it makes us reengage with the story as it ends (see notes).

In my alternate-reality version of “Lucifer” (the one where Spock has a beard), Chloe’s relationship with Lucifer would have been vexed by her having to triangulate growing affection against a fear that he’s either a whacko or the devil in retirement… neither of which is a high recommendation for a boyfriend you can bring home to meet Mom.

A five season arc might have gone like this:

Season One: we meet the characters, and Lucifer and Chloe forge a partnership with potential.

Season Two: Chloe thinks that Lucifer is human and nuts, but she falls for him anyway.  She spends part of the season trying to figure out his real human identity but cannot.  They consummate their relationship.  The season ends with Lucifer doing something impossible.

Season Three: Chloe now believes that Lucifer really is the devil, and she is trying to figure out if sleeping with him has eternal consequences.  The season ends with Lucifer demonstrating mortal frailty (e.g., he gets shot, cliffhanger with him in the hospital).

Season Four: Lucifer, recovering from his wounds, begins to doubt his own story.  He refuses to talk with Amenadiel because, like the audience, he thinks that Amenadiel is a hallucination. 

Season Five: everything comes to a head.  The really weird stuff (e.g., Donald Trump is a presidential candidate) gets even weirder, and from the “he’s really the devil” perspective the gates of hell stand agape and it’s really time for Lucifer to get back to his real job.  Meanwhile, from the “he’s just nuts” perspective Chloe and Lucifer are faced with their need to move ahead or move on.  The last moment of the last episode finally reveals the truth, with no ambiguity or “Dr. Sam Beckett never leaped home” betrayal.

I’ll keep watching “Lucifer” for another few episodes to see where the plot is going, but I fear that the answer is “to hell in a handbasket.”

Notes:

Don’t miss Linda Holmes’ piece that includes “The Adventures Of Mr. Superabilities And Detective Ladyskeptic.”  HT to Alan Sepinwall for linking to this; his own review of “Lucifer” is brief but strong.

Digression #1: “The Sixth Sense” is the opposite of “Fight Club” (1999, and another unnecessary spoiler alert), which lies to the audience throughout because the Brad Pitt character turns out to have been a movie-long hallucination by the Edward Norton character.

Digression #2: I’ve long been fond of literary explorations of the devil, from Milton to Gaiman, including Jeremy Leven’s “Satan: His Psychotherapy and Cure by the Unfortunate Dr. Kassler, J.S.P.S.,” Glen Duncan’s “I, Lucifer,” and a terrific version in Robert Heinlein’s novel “Job: A Comedy of Justice.”

Cross-posted on Medium.com.

A glimmer of hope for CBS’ “Supergirl”

This TV season’s new “Supergirl” TV show confuses me.

Over on the CW, producer Greg Berlanti has nailed both “Arrow” and “The Flash,” but where those shows feel fresh and exciting, “Supergirl” is forced and whiny.

“Supergirl” oscillates between action and soap opera, rarely integrating the two. There’s too much talking among the characters but not nearly enough wit — the show is the anti-Buffy. Every character is earnest to such an extreme degree that I wonder what jokes get cracked in the writers’ room. Even Calista Flockhart’s media magnate Cat Grant can barely get a snicker out of her celebrity lifestyle.

But in the most recent episode (#8, “Hostile Takeover”) I saw a glimmer of hope.

Warning: SPOILER ALERT!

Towards the end of this episode — which whiplashes between Supergirl fighting renegade Kryptonian villains led by her evil aunt Astra and trying to stop the ouster of her boss, Cat Grant, from her media company — Ms. Grant finally figures out that her long-suffering assistant Kara Danvers is also Supergirl, and proves it when she demands that Kara take off her glasses.

Thus, in one deft move “Supergirl” does away with 78 years of unbelievable super-tradition: the notion that anybody with more than that many IQ points would be fooled by a pair of glasses.

Suddenly, Supergirl’s secret has been revealed to a media tycoon who must wrestle between her desire to help Supergirl and her desire to drive viewership to her network. Suddenly, Supergirl’s secret identity is more than merely at risk; it’s a leverage point against her all day, every day.

Now this conflict has potential. I just hope that Berlanti/CBS exploits it.

In fairness, the Super Family is hard to narrate. Even a relatively depowered version of Superman or Supergirl is so, well, super that it’s hard for merely mortal criminals to do anything that poses a challenge. That, I think, is why the show has aped the back story of “Man of Steel,” where a good Kryptonian goes mano a super-mano against many bad Kryptonians: it’s the only way to have a fair fight.

It’s also boring.

The structure of “Supergirl” sets Kara’s “why can’t I be a real girl?” soap opera against the alien of the week arriving in town from a defunct Kryptonian prison, complete with slightly upscaled Doctor Who makeup.

There’s no real challenge in any of the aliens, and no season-long “big bad” (like Tom Cavanagh’s complicated and compelling Harrison Wells in the first season of “The Flash”) to absorb the audience’s attention, which means that the interest has to come from the soap opera.

For those of you curious about this show, the pilot reruns tonight.

[Cross-posted on Medium.]

The Girl in the Spider’s Web isn’t terrible, isn’t great

Over the weekend I zoomed through the new David Lagercrantz novel, The Girl in the Spider’s Web, which is the not-written-by-Stieg-Larsson sequel to the Millennium Trilogy that started with The Girl with the Dragon Tattoo.

I’ll start with some thoughts about the book itself — so you have your spoiler alert — but I’ll wind up this post with some thoughts about the aesthetics of ephemera and vice versa.

About the novel: It’s a good gulp-it-down novel, quickly plotted and dark in similar ways to the Larsson books (although not nearly as dark as Larsson’s third, which sucked the light out of the room where I was reading it).

But the book feels unnecessary. After the riveting revelations about Salander’s childhood in Larsson’s third book, The Girl Who Kicked the Hornets’ Nest, there’s not much left to say about Lisbeth Salander’s past, and any changes to the character in service of a future would risk betraying the readers who want more of the same. This is a terrible trap for a novelist.

Lagercrantz couldn’t escape the trap, so he has reduced Salander to a series of narrative functions, rather like what happened to Sherlock Holmes in the Holmes stories written by others after Conan Doyle’s death (and there are thousands). In most of these stories, Holmes is a pastiche of narrative-advancing tricks (he deduces that Watson has been to the horse races from a bit of straw on Watson’s shoe, causing gullible Watson to be astounded yet again) rather than a character who interests the reader in himself. With the exception of Nicholas Meyer’s The Seven-Per-Cent Solution, talking about Holmes as a character is like talking about Batman’s utility belt as a character — it’s not all that useful.

In Lagercrantz’s post-Larsson world, Salander is an angry superhero, superhacker, and protector of innocents who bursts onto the scene regularly, makes things happen, and then disappears.

The Girl in the Spider’s Web is a misleading title for this book, since Salander is never caught, never motionless, never the prey despite being hunted— she is the predator.

I don’t regret reading the book — despite my sense that it serves the publisher’s greed rather than the readers’ need — but I probably won’t read the next one, and I’m sure there will be a next one.

The aesthetics of ephemera: Perhaps more importantly, I don’t regret reading the book last weekend— my satisfaction index will never be higher than just a few days after its August 27th release date. The longer I wait, the more information from the world will trickle in to spoil my fun.

This isn’t just true of The Girl in the Spider’s Web, of course. The reason that a movie’s lifetime economic success usually is a function of its opening weekend is that the water cooler conversation about a movie is at its frothiest after opening weekend. 

I love to see movies (particularly popcorn movies) opening weekend — although I rarely get to do so — because that’s the moment of maximum potential for having that explosive moment of connection in my own head to other movies and works, and it’s also the moment of maximum potential for having fun discussions with other people about the movie and its broader context.

But the longer I wait to see a movie, the more likely I’ll hear something about it that will diminish that connection-making pleasure for me. I’m not talking about classic “the girl’s really a guy!” plot spoilers, although those suck. Instead, I’m talking about those trying-to-be-helpful hints that come from people who’ve already seen the movie. “I’m not going to tell you anything, but you have to stay all the way to the end of the credits: it’s really cool!”

This is a horrible thing to say to somebody about to see a movie you’ve already seen, since it means that the viewer will detach from the climax of the movie early in order to focus on the extra coming after the end.

The ephemera of aesthetics: We don’t have good language to talk about this phenomenon, the very short half-life of the water cooler effect on how we experience culture.

We’re good at talking about the work itself, the creation of the work, the background and previous efforts of the creators of the work.

But we’re bad at talking about how we are a moving point in time relative to the work, and how satisfaction decays with some works but deepens with others.

For example, I’ve been a fan of Lois McMaster Bujold’s Vorkosigan series for about 20 years now, and they merit re-reading. I see new things in the characters, the plot, and her writing when I revisit the books. Although Bujold’s books are masterfully plotted, I can’t reduce my satisfaction with her books to the plot, and this is good.

Lagercrantz’s book is entirely about the plot: at the end of the story all the energy has been released from the plot, a bunch of the characters are either dead or narratively exhausted, and Salander will need to be dropped into a new situation to exercise her narrative function.

Some sorts of aesthetic experience, then, are fragile in Nassim Nicholas Taleb’s sense of fragility and antifragility.

Plot is fragile. Character is not inherently fragile, but for a character to be antifragile that character must exceed the needs of the plot in which the character is embedded.

Ironically, inside the world of The Girl in the Spider’s Web Lisbeth Salander is indestructible: nothing stops her. Meanwhile, for this reader the experience of reading about Salander’s latest adventure is soap bubble ephemeral.

Pop.

[Cross posted with Medium.]

High Fidelity, Pillow Talk, The Music Man: on technology and on ideas that rhyme, but then don’t

Hey, this looks like a piece about old movies, and it starts out that way, but it’s also about how to think about technology. I even throw in a little Douglas Adams at the end.

Ideas can rhyme like words do. When words rhyme, the rhyme helps us position ourselves inside a poem: we know we’re at the end of a line when the rhyme happens. Rhyme’s spatial nature makes us pay attention to a similarity that doesn’t mean anything most of the time. “Bed” always sounds like “dead,” but we only notice — we only think it means something — when and where both words come at the end of nearby lines in a poem or song.

When ideas rhyme, the rhyme helps us position ourselves inside a story, fictional or non-fictional. Ideas that rhyme* are building blocks for analogies.

Here’s what I mean: one night some years back on a driving trip with my then-eight-year-old daughter we took in a show at the Oregon Shakespeare Festival. The classical plays were all too dark (Macbeth) or esoteric (King Henry VIII) to choose as a little girl’s first introduction to Shakespeare, so instead I took her to The Music Man thinking that this 1950s musical set in the early years of the twentieth century would be more approachable for my twenty-first century kid.

Boy, was I wrong. The gap separating her everyday experience from The Music Man was no narrower than if I’d shown her the Thane of Cawdor’s ancient Scotland. River City, Iowa, after all, is so sleepy that when the Wells Fargo wagon arrives the whole town breaks into celebratory song. Try that today and the UPS guy will floor the big brown truck.

Little about The Music Man resonated with H since the plot depends on information paucity: con artist Harold Hill is able to fleece town after town because the towns don’t have ready communications with each other; Marian the Librarian has to send a letter to Gary, Indiana to determine when the music conservatory there was founded… and then she waits for most of the play to get the answer. There may be trouble in River City, but there’s not a lot of information.

In contrast, at eight H had grown up in an information superabundant world with Google, email, hundreds of TV channels, infinite options online, books aplenty, music streaming from every direction, and frequent Amazon deliveries to our doorstep although not yet by drone. Today, with mobile and social media there’s even more.

When you’re working with ideas that rhyme you can understand stories that don’t bear a lot of resemblance to your own life because there are structures buried inside the narrative ecosystems that look a lot like the structures buried inside your own life.

When we saw The Music Man, H had just finished second grade and was about to enter third. The main reason H could appreciate The Music Man at all, which she did, was that going to elementary, middle and high school today is a lot like living in River City. The shapes of the experiences rhyme with each other: the same people milling about the same claustrophobia-inducing place doing the same stuff day after day… just with the addition of a barbershop quartet.

What interests me is when ideas stop rhyming. As a researcher and futurist, I’m always on the lookout for cultural narratives that stop making easy sense as everyday life changes. I use those transitions to dig into how our lives are changing as technology and culture do an evolutionary dance with each other. 

This isn’t just a matter of having lived your entire life with Batman as a dark movie character (Christian Bale) rather than the sunny TV one (Adam West) your parents remember with happy smiles— the sort of thing that the Beloit Mindset list captures each fall. My kids have no trouble watching the 1960s Batman TV show with me and understanding it, even though they think I’m ridiculous for loving it.

Instead, I’m talking about stories that dwindle into inaccessibility because the audience no longer shares enough context with the story to understand it without footnotes— like trying to rhyme “bed” and “guava.” Indeed, the very presence of footnotes is a clear sign that a story belongs to the past more than the present.**

Pillow Talk is a perfect example of a movie that no longer makes easy sense, and one that stopped making sense over the last decade. The idea that stopped rhyming is the party line, in which multiple houses or apartments shared a single phone line because more people wanted phone service than the phone company had lines to provide.

Even back in 1959, when Pillow Talk (the first Rock Hudson and Doris Day movie) hit the theaters, party lines were already on their way out.

But when I was growing up in the 1980s, the absence of party lines in my parents’ house didn’t stop me from understanding the movie. Pillow Talk made sense because I had to deal with the similarly shaped frustration of never being able to have a private conversation. It was impossible to chat with a girlfriend and not have other people in both houses picking up the handsets every few minutes. So Pillow Talk was just like my house, only Jan Morrow (the Doris Day character) had to deal with complete strangers interrupting instead of clueless parents and malicious little brothers. The shapes of the experiences rhymed.

But those ideas no longer rhyme today.

It would never occur to now-teenaged H to use the house line to call her friends, and even if she did the likelihood of somebody else picking up the seldom-used house line is small. H has an iPhone, and since she doesn’t know any of her friends’ numbers by heart she would have to look at the address book in her iPhone in order to punch the number into the house line, which is absurd. Moreover, H wouldn’t want to use her mouth to make noises to communicate with her friends in the first place: that’s what texting, Instagram and Snapchat are for. If H does want to chat in real time, she’s more likely to use FaceTime or Skype.

The telephone is for talking with grandparents.

The plot of Pillow Talk revolves around technology barriers, but the technology in question no longer plays a role in the lives of children today. Universal Pictures couldn’t remake Pillow Talk unless the studio decided to set it in the same period when it was written, which would be pointless.

Which brings me to my disheartening realization last night after I stumbled across High Fidelity on Netflix and sank with a happy smile into that delightfully written, perfectly-directed, amazingly-cast film. (And is there a better soundtrack in movies?) 

The realization? My kids won’t understand High Fidelity. Oh, I’ll try to show it to them, and perhaps the charming performances will suck them in, but it’s unlikely because the ideas don’t rhyme.

Want to know why? Just look at this timeline…

1995: the novel High Fidelity by Nick Hornby hits bookstore shelves, which means it had been completed sometime in 1993 or 1994. It’s a love story set in a world of used vinyl record shops run by monkish musical obsessives. It is a world that is about to die because…

1997: the first MP3 player is released.

1999: Napster makes music sharing (and piracy) effortless, and shifts the musical unit of measurement from the physical album or CD to the individual song in MP3 form.

2000: High Fidelity the movie, starring John Cusack, comes out; Pandora Radio launches, bringing streaming radio and music discovery to the world.

2001: Apple iTunes and Apple’s first generation iPod come out, taking MP3s mainstream; H is born.

Some of the key moments in the movie (ahem, spoiler alert… although if you haven’t seen the movie already then how did you make it this far in this post?) don’t make easy sense if you’ve grown up with today’s technology.

Sure, there are still record stores out there that cater to music snobs, but not many, not one in every neighborhood and every shopping mall — and rummaging around the record store with your friends on a Saturday afternoon isn’t an activity for today’s teens the way it was for previous generations.

In High Fidelity, as they hang around Championship Records waiting for customers to stumble in, many of the interactions among Rob, Dick and Barry concern musical trivia: which group first performed which song, who did what on which record. Today, Google would answer all such questions.

Rob using a stack of quarters to call his ex Laura over and over again from a rain-drenched phone booth outside her new beau Ray’s apartment is dramatic and emotionally charged. Today, good luck finding a pay phone. Rob would simply hit redial on his smartphone while sitting at Starbucks.

The mix tapes Rob makes for Caroline the music reporter and for Laura were time-draining labors of love in the 1990s, but today the process is instantaneous and weightless. “Here, I spent hours selecting, organizing and recording this tape for you” becomes, “here, I spent minutes selecting and seconds creating this Spotify playlist for you: it’ll take you longer to listen to it than it took me to make it. Have fun!”

In the vinyl days, when you heard about a new band or a new song you had to find a disc-shaped object somewhere, either in a store or at a friend’s house. When cassettes came along you could copy things, but you could only do it in real time, which was a drag.

Teens today love music just as much as they ever have, but their musical challenges are about filtering rather than access. There’s instant streaming for just about everything, but the challenge is figuring out what to listen to, whether it’s free and where to find the best deal if it isn’t.

High Fidelity — a movie that, sigh, I still think of as relatively new — is more distant from my kids’ automatically understandable experience than The Music Man. How weird.

Watching stories move into the rear-view mirror — when ideas stop rhyming — is the flip side of watching technologies move into the unremarkable mainstream of our everyday lives, where of course I can reach my wife with a stupid question about where the laptop charger might be hiding, because she has an iPhone and I haven’t thought to look under the bed.

Douglas Adams, in a 1999 blog post called “How to Stop Worrying and Learn to Love the Internet,” articulated this brilliantly:

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on, but you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilization as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

With rhyming ideas, we can tweak this into a slightly different three-fold story:

1) everything that’s already in the world when you’re born is just normal;

2) anything that stops being relevant to everyday life between then and before you turn thirty (fax machines, typewriters, cameras, VCRs) takes up space in your garage and makes you smile with nostalgia when you run across it;

3) anything that stops being relevant after you’re thirty is a painful sign that you’re going to die sooner than you want to and that your kids don’t appreciate how good they have it until you gradually realize that this is just like when you rolled your eyes at your own parents when they nattered on about black and white TV and wringing out laundry and mixing food dye into margarine and that sort of thing so just relax because it happens to everybody.

Still, though… High Fidelity is a terrific movie. Too bad my kids won’t get it.

[Cross-posted on Medium.]

Miscellaneous notes:

* My notion of “ideas that rhyme” is similar to but not identical with my old teacher Stephen Booth’s description of “ideational rhyme,” which he works out in detail in his edition of Shakespeare’s sonnets.

** Simply putting the words “bed” and “guava” into proximity with each other might provoke readers with too much time on their hands to think about how these two things actually do go together. If this is the case, then for heaven’s sake please share with me how you think they go together, because that sounds fascinating, and then please go read Donald Davidson’s 1978 essay “What Metaphors Mean,” because it describes how our minds create meanings out of these sorts of comparisons rather than discovering meanings that are already there.

Stewart, Cosby, Williams: Tough Times for U.S. Comedy

Take heed, sirrah, the whip.
   King Lear to his Fool

Jon Stewart’s farewell episode of The Daily Show last night proved joyful rather than sad as dozens of people whose careers took root and bloomed under Stewart’s watch turned up to celebrate and — despite his resistance — to thank him.

For the under-30 crowd, last night was their May 22, 1992: Johnny Carson’s last episode of The Tonight Show.  Unlike Carson, Stewart has no plans to disappear from public life; more dissimilar still, Stewart is universally reported to be a great guy rather than a jerk.

No reasonable person can fault Stewart for wanting to do something new after 17 brilliant years, but it’s a stabbing loss to nightly political commentary and to comedy.

Funny people abound in U.S. comedy — and I’ve now reached my tautology quota for the day — but in different ways we’ve lost three icons in the last year, Stewart the most recent.

Bill Cosby was the second: like Stewart, Cosby is alive, but since Hannibal Buress put the spotlight on Cosby’s history of sexual assault last fall, all the joy Cosby had brought us over the decades tastes sour.  Don’t get me wrong: Buress was right to do it, and it’s a shame on us all that nobody took the alleged attacks on women seriously until a man said it.

And I mourn the loss of the joy.  For most of my life, Cosby’s voice hasn’t been far from my inner ear.  Just this morning I found myself thinking about an early routine called “Roland and the Roller Coaster,” but then frowned as all the stories of his assaults on women rolled into my mind. 

I’ve heard stories of Cosby’s infidelity since I was in high school.  One of the dubious privileges of growing up in L.A. is knowing a lot of celebrities and their kids.  I was in a play with the kid of a famous woman who knew Cosby well.  I don’t know how it came up — I must have been merrily quoting a Cosby routine — but the kid said, “you know he cheats on his wife all the time, right?”  I don’t remember having an intelligent response beyond “oh.”  Even then, infidelity struck me as an issue among the people directly involved rather than the public’s business.

I remained a Cosby fan, and his observations intertwined with those of George Carlin as a running commentary in my head.

Now when I hear Cosby’s voice in my head I change the mental channel with a flinch.

It’s the second time that I’ve found myself dancing across the minefield of my own responses to Cosby: the first was after the mysterious 1997 murder of his son Ennis just a couple of miles from where I grew up.  After that, I couldn’t listen to any of Cosby’s routines about his kids, and particularly his son, without sadness. 

But I still listened. 

Not anymore.

Next week brings the one-year anniversary of the third and most grievous loss, the suicide of Robin Williams.

A friend stumbled across LIFE magazine’s tribute issue to Williams at a garage sale and bought it for me, as she knew I was a huge fan.  I’ll read it on Tuesday, on the anniversary of his death, but I haven’t been able to open it yet.

I had the privilege of seeing the incandescent Robin Williams perform live onstage three times and saw or listened to him numberless other times.  The speed and depth and genius of his wit will never leave me.  His 2001 appearance on Inside the Actors Studio with James Lipton was the most astonishing display of mental gymnastics that I’ve ever seen.

Darkness always lives in comedy, and when the light is that bright the simple math of it says that shadows must go deep.  I wish I could have done something for him, even though we never met.  I understand this but I still can’t accept it: the funniest man in the world killed himself.

Dustin Hoffman captured the unfathomable, unacceptable, incomprehensible nature of Williams’ suicide in an unguarded moment during an onstage interview with Alec Baldwin that later became a June episode of Baldwin’s wonderful Here’s the Thing podcast.  Hoffman was talking about Lenny Bruce, and how Bruce didn’t prepare set material.  The only other person Hoffman could think of who was like Bruce was Robin Williams.  As he said the name, Hoffman broke down in a sob that hit him like a lightning bolt from a clear blue sky, and it took him several seconds to collect himself.  I cried too.

Good luck, Jon Stewart, and thanks. 

Bill Cosby, I wish you were as good a man as you are a funny man, although that’s a tall order.

Robin Williams, rest in peace.  You deserve it.

[Cross-posted on Medium.]

Michael Wolff’s Just-Released Book is a Puzzler

The dust-jacket of Television Is the New Television: The Unexpected Triumph of Old Media in the Digital Age describes Wolff as a man with unparalleled access to powerful figures in media, and the book as something that will change the reader’s thinking. Moreover, it frames Wolff as an archly bitchy writer with enemies who would like nothing better than to see his vital organs recycled to serve the more deserving.

None of this is the case.

The vast majority of Wolff’s writing is a sober and thoughtful (if limited and unsupported) account of the power dynamics in media at the moment.  It’s a useful snapshot of what’s happening right now, which, reduced to tweet length, is: “Print and digital media companies are all turning to video to create brand-advertising-worthy products, but TV is hard to make on the cheap.”  This is not an insight that changes my thinking.

The book has mild flights of interesting speculation: for example, that without sports, digital media companies like Google or Facebook will never truly be able to compete with broadcast and cable companies, but that the digital companies have neither the stomach to write the big checks nor the narrative skills to create the product the right way if they did.  However, I didn’t take a single note or make a single check mark in the margin as I was reading (rare for me).

The occasional outbursts of temper — heralded on the back cover as one of the book’s guilty pleasures — are mean-spirited cheap shots rather than Oscar Wildean performances of acerbic wit.  Rather than a frightened editor red-lining a bad-manners manifesto to prevent riots, as I read the book I imagined a desperate editor pleading with Wolff to make it just a little bit more cruel, please, Michael, whereupon Wolff, with a fatigued eye roll, would throw in something nasty like adding a dash of salt to soup.

Nonetheless, if you work in the digital media industry, then you should read Wolff’s book immediately for two reasons.

First, old media people — who long for the less-complicated days before the arrival of the internet with its staggering complexity (and, yes, these folks are still around and in positions of power) — will wave this book about with a dash of glee and a little Rumpelstiltskin dance.  The book panders to the vanity of television people in a way that should make the team at Merriam-Webster update their definition of “sycophancy” with a new example.

Second, if you wait even a week the book will be less enjoyable because it will be dated.  Ben Jonson eulogized his pal Shakespeare by saying Shakespeare was not for an age but for all time.  Wolff’s book is of the moment— just the moment.  With each passing development — such as 1) when the government did not allow the merger of Comcast and Time Warner Cable or 2) Verizon’s acquisition of AOL, both of which happened as the book was already in press — the picture it paints of the media world becomes more stale.  This book will be on the remainder shelf soon, by which point it won’t be worth reading.

It’s a quick read, and one that pulls a bunch of things together neatly.  Anybody working at the collisions of video and the internet and advertising will find it handy. 

But it’s in the bubble: parochial, like that famous “View of the World from 9th Avenue” New Yorker cover.  Wolff’s imagination of what is important lacks scope: the internet isn’t important in the history of our species because of what it does for media but because of how it empowers people to communicate with each other — even if what they talk about tends to be television.

Speaking as a researcher, editor, and writer, I have to end this note with frustration.  Wolff is a columnist, not a journalist.  He cites not a single source, gets not a single person on the record, and has not a single footnote directing the reader to where his many assertions can be supported.  Television Is the New Television is a collection of long columns: occasionally insightful, informed by numberless conversations with figures in the community, and with a short half-life.

[Cross-posted on Medium.]

Looking Back on “The Fall Guy” — an Aria of 80s Sexism

The Lee Majors-crooned theme song from his old TV show “The Fall Guy” snuck into my head this morning.  It’s a stumper as to why or how this happened, and it proves only that I watched way too much TV in my youth.

The series (about a stunt man who is also a bounty hunter, with two young assistants) ran from 1981 to 1986 with a staggering 19.9 rating.  The theme song (which ran a full minute and forty-one seconds) got stuck deep enough in my cranium that some odd collision of neurons brought it back up to consciousness.

So I went to YouTube, repository of all video ephemera, and found the theme song in seconds:

The first two lines (four seconds) of the “Fall Guy” theme song reveal a lot about the show:

Well, I’m not the kind to kiss and tell,

But I’ve been seen with Farrah.

Nobody under 40 will get the reference: Lee Majors was married to Farrah Fawcett in 1976, when she hit it big with “Charlie’s Angels.”  He was already a big hit with “The Six Million Dollar Man.”  They separated in 1979, divorced in 1982, but apparently remained on stable enough terms for him to include the reference and for her to cameo in the 1981 pilot.

What’s interesting about the reference is that in the first moments of the series (and each episode) the viewer is reminded that the star of the show — not the character, the star — has been famous for quite a while, played other characters, and had a famous marriage to another star.  From the first moment, in other words, the series deliberately blurs the line between star and character.

This is somewhat rare for movies and television.  Ordinarily, our default definition of “immersion” is to think of losing ourselves completely in a story where we forget that the character is played by a person with a life.  Instead, “The Fall Guy” credits — and the series’ habit of getting celebrities to make cameo appearances — articulated a different sort of immersion that included both characters and performers.

Back to the theme song: this morning I sank into my chair to watch the opening credits on YouTube… and found myself surprised by an aria of sexism across at least three dimensions.

Dimension #1: the theme song, “The Unknown Stuntman” (lyrics by series creator Glen A. Larson, Gail Jensen, and David Sommerville), talks about how the singer performs deadly stunts with actresses, only to watch as the actresses fall into the arms of their leading men rather than into those of the brave stuntman.  But the singer is a chauvinist, claiming that he has “never been with anything less than a 9” and bragging that “I never spend much time in school, but I taught ladies plenty.”  It’s not a surprise that the ladies decline to spend time with him.

There’s also a weird anti-logic in the song in which the male actors are valuable enough to merit stunt doubles but the actresses do their own stunts.

Dimension #2 isn’t about the theme itself, but about the history of the performers named in the song: Farrah Fawcett, Bo Derek, Sally Field, Cheryl Tiegs, Raquel Welch, Robert Redford and Clint Eastwood. Of them all, actor-directors Redford and Eastwood still have vibrant careers, while the actresses are all either dead or largely retired.  Career longevity for women is criminally shorter in Hollywood than it is for men.


Dimension #3: in the opening credits, there are many, many images of Lee Majors and co-star Douglas Barr in various action sequences, but only one image of the third stuntperson-cum-bounty-hunter, Heather Thomas, who wanders through swinging doors in a tiny bikini.

While “The Fall Guy” was never my favorite show, I did have fond memories of it… memories now complicated.  Allegedly, there’s a movie version in development with Dwayne Johnson attached. 

I wonder if they’ll keep the song.

[Cross-posted on Medium.]

Is Netflix moving away from binge viewing?

American culture’s long nightmare has ended. Breath can release from empurpling faces across this mighty land: we now know Chelsea Handler’s next move. This morning, Variety broke the story: “Netflix Announces Chelsea Handler Talk Show to Debut in 2016: Comedienne to create new talk-show format and specials for streamer, after seven-year run on E!”

How will it work? Can Chelsea Handler retain an audience in an on-demand video environment?

Handler has forged a witty and wicked brand for herself as talk show host, comedian and author, but “Chelsea Lately” isn’t appointment viewing — it’s just always on, in continuous rotation, and reliably snarky and gossipy about the day’s celebrity news. Her books are the same: I’d never order one on Amazon, but I’m happy to look one over when I’m bored at the airport.

Most engagements with Netflix are in the long tail, with a short, fat head of original content like “Orange is the New Black” and “House of Cards,” and Netflix famously releases an entire season at once to promote binge watching.

Binge watching takes advantage of our human love of story and endless desire to see what happens next. That dynamic doesn’t operate in talk shows.

Or does this mean that Netflix is getting into the business of timely watching? Do they want to own 11:00pm Monday through Friday (the current slot for “Chelsea Lately”) with new and original content that will get stale as the hours tick on? 

If so, then why? Is this merely to drive subscriptions with experiences viewers can’t get elsewhere? Are they going to dip a cautious toe into advertising-supported programs… which the talk show format supports perfectly?

Note: This isn’t the first time we’ve seen this idea surface: back in 2010 when Conan O’Brien was fired from “The Tonight Show” he flirted with bringing a new talk show to Xbox Live before landing on TBS.

[Cross-posted on Medium.]

From the Archive: Why does “on demand” feel so… demanding?

A kind tweet today from my friend David Daniel reminded me of this post, first published October 1, 2006. A look through my site found it a casualty of a domain transfer, but the always-useful Wayback Machine at the Internet Archive brought it back from the dead. Original version (with original comments) can be found here, and a subsequent story by CNN can be found here.  I don’t remember where I coined the term “TiVo Guilt,” but this at least is where the thinking comes from…

Having the best of the media world at my fingertips via cable VOD, TiVo, DVD, the internet and the metric ton of videotapes still lying around my garage can be a drag.

A couple nights back I cleared out the episodes of “House” piling up in my TiVo because they had ceased to be a special treat that I was saving for myself and started to feel like a homework assignment I’d forgotten to turn in. Similarly, my wife and I have had the many-many-Emmy-winning “Elizabeth I” in TiVo since April — April! We’ll never watch it, but we can’t bear to raise the white flag.

TiVo works best for short delays… watching something 15 minutes after it starts so I can blaze through without commercials,* or later that night, or the next night, but not much after that. I saw the second episode of “Studio 60 on the Sunset Strip” on Tuesday night and loved it… almost as much as if I’d seen it on Monday night when it aired.

A short delay preserves that sense of sobytiinost — the “eventness” or connectedness with other people that I’ve talked about before — but eventness doesn’t last long. Like a delicate radioactive isotope, it has a half-life measured in mere hours.

The longer a piece of media lies around unseen, the more my satisfaction index declines. To be more precise, as a piece of media lies around, I find the prospect of watching it less satisfying. If I actually hurdle the barrier of my lack of desire to watch something once it’s no longer new, then I’ll probably enjoy it, although not, I suspect, as much as if I’d watched it when it was newer or live.

This is not just true of media that I can get in my home. With big, tentpole movies that have had expensive marketing campaigns and been drilled into my consciousness, I’d better see them close to the opening weekend or I’m going to wait for it to surface on HBO. If I’m only dimly aware of a movie — or if I’m seeing just about anything with my daughter who brings to the theater her own eventness — then this dynamic doesn’t happen. And as a corollary, with a sleeper like “Keeping Up with the Steins” the expectation — the sense of being about to be in on something that the unwashed multitude hasn’t heard about — brings its own anticipation.

Late at night, beached on the couch with the dog at my feet while the rest of the family sleeps, the last thing I want to do is make a well-informed, thoughtful, right-thinking decision about how I’m going to spend the next hour. That all sounds like such a commitment: I just want to relax. Dredging something from TiVo’s bowels means that I’ll have to make judgments and decisions: Should I have recorded this? Should I cancel the season pass? Is it more important that I watch this thing than that other thing? It’s supposed to be entertainment for heaven’s sake… not a test of my skill as a TiVo user.

The Year Five data from the Center for the Digital Future found that Americans now log onto the internet with no objective… just to spend time having fun. That’s a key distinction between self-improvement and self-renewal — between an ought-to-do and a want-to-do. If I wait too long, TiVo transmogrifies the latter into the former.

Comments, please?

* To all my brand marketer friends out there… yes, I admit it: I use TiVo to skip the commercials. Sorry, y’all.