Reflections on the Baltimore Riots

Let me preface this by saying I didn’t sleep much last night, so my ideas are all half-baked. I am tempted to blame my insatiable curiosity about the Baltimore riots on the industry I work in, but it may be the other way around. Regardless, I could not look away from Twitter last night despite the horrible images, violent videos, and – worst of all – the oversimplified commentary coming from people around the country.

I knew immediately how I didn’t feel about the riots: I wasn’t excited or vindicated by the people I saw taking the streets with violence, nor did I completely condemn those involved. But until now, I haven’t been able to articulate how I do feel.

It came to me this morning. A friend and I spoke about the tragic events unfolding with all the perspective of one restless night.

“I don’t feel like I can have an opinion because I’m not there,” she told me.

For context, this friend lives in Charleston, South Carolina. You may remember that North Charleston recently made national headlines when the New York Times released a video showing a white police officer shooting a fleeing, unarmed black man.

I, on the other hand, am in Rochester, New York. Yesterday I was in court covering the trial of a young black man who shot and killed a white police officer during a foot chase.

Maybe these incidents, taken at face value, are not directly related, but they and others are all connected. Those of us in other cities across the United States watched as Baltimore burned. We experienced emotion across the spectrum from heartbreak to frustration to anger, and some of us wonder if it could happen here.

We live in cities where income disparity and racial segregation draw lines criss-crossing through our neighborhoods, but we look at places like Ferguson, North Charleston, and Baltimore as if they are worlds away.

I don’t know the answer to the deep-rooted, systemic problems that plague cities across this country – how do you end something like poverty? – but I know that burying our heads in the sand and saying NIMBY is only making matters worse.

When the fires are put out and the rioters go home and the media frenzy that has swarmed around these tragic events finally quiets, we cannot forget how this started, because what happened in Baltimore could happen anywhere; in fact, it already is.

UPDATE: Here are some people who say stuff and write things better than I do:

Jelani Cobb’s Baltimore and the State of American Cities

Ta-Nehisi Coates’ Nonviolence as Compliance

Sentiment Analysis

I’ve been following some cool work coming out of the University of Rochester. I did a story for WXXI here, about how researchers in the computer science department teamed up with researchers in the medical department to create software that could help track your mood and monitor mental health.

I spoke with Jiebo Luo, an engineer, and Vince Silenzio, a psychiatrist. To most people, the most notable aspect of their research is the selfie-for-your-health-y video feature. Users take a vid of their faces with a front-facing camera and a computer tells them what they’re emoting. The software reads cues like facial tics, micro-expressions, and other nonverbal signals. It can gauge your heart rate by analyzing your forehead. It can register the dilation of your pupils. And, of course, it can read whether or not you’re smiling. (Victims of BRF beware.)
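I haven’t seen the actual code behind Luo and Silenzio’s software, so here is just a minimal sketch of the general idea, assuming off-the-shelf OpenCV and its stock Haar cascades. The “count the smiling frames” rule is my own stand-in for illustration, not their method.

```python
# Toy sketch (not the U of R software): grab frames from the front-facing
# camera, find a face, and check for a smile with OpenCV's stock Haar cascades.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)          # 0 = default (front-facing) camera
smiling_frames, total_frames = 0, 0

while total_frames < 300:          # roughly 10 seconds of video at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    total_frames += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = gray[y:y + h, x:x + w]
        # A detected smile inside the face region counts this frame as "positive."
        if len(smile_cascade.detectMultiScale(face, 1.7, 20)) > 0:
            smiling_frames += 1
cap.release()

print("smiling in %d of %d frames" % (smiling_frames, total_frames))
```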

The less-sexy part of this software, which Luo says could potentially be turned into a smartphone app, has nothing to do with videos. The program also analyzes its users’ social media use. Everything from how fast you’re scrolling to the kitten picture you reposted is judged. It also analyzes your network, and puts a value on every link that connects that network, from friendships between users, to likes, to posts or comments.
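To picture what “putting a value on every link” could look like, here is a tiny made-up graph sketch. It is my own illustration, not the researchers’ model: each friendship, like, or comment carries a crude +1/−1/0 sentiment weight, and the people and numbers are invented.

```python
# Hypothetical illustration of a valued social graph -- not the researchers' model.
import networkx as nx

g = nx.DiGraph()
# Each edge gets a crude sentiment weight: +1 positive, -1 negative, 0 neutral.
g.add_edge("you",   "alice", kind="friendship", sentiment=+1)
g.add_edge("alice", "you",   kind="like",       sentiment=+1)
g.add_edge("bob",   "you",   kind="comment",    sentiment=-1)
g.add_edge("you",   "carol", kind="repost",     sentiment=0)

# One (made-up) way to summarize how your network "feels" toward you:
incoming = [d["sentiment"] for _, _, d in g.in_edges("you", data=True)]
print("average incoming sentiment:", sum(incoming) / len(incoming))
```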

Right now, the program only registers emotions as positive, negative, or neutral. As far as algorithms are concerned, every interaction you have with your social media has a value of +, -, or n.

So let’s break this down:

Let’s say you’re strolling around Facebook town and you see a picture of a kitten that you just have to share on your wall. The computer would analyze the picture. (Fluffy, baby animal? Pinkish hues? Bright light? Value: Positive.) You added some text with the picture: AWWWWW! (Value: Positive) You get a ton of likes and a few smiley emoticons in the comments section (Positive, positive). Odds are, your day is going pretty sweet.
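Just to make that walkthrough concrete, here is a toy tally of the same made-up signals, where each one is scored +1, −1, or 0 and the sum stands in for your day’s mood. The signal names and scores are mine, purely for illustration.

```python
# Toy version of the kitten-picture example: every interaction is scored
# +1 (positive), -1 (negative), or 0 (neutral), then summed.
signals = {
    "image: fluffy baby animal, pinkish hues, bright light": +1,
    "caption: 'AWWWWW!'":                                    +1,
    "reactions: a ton of likes":                             +1,
    "comments: a few smiley emoticons":                      +1,
}

score = sum(signals.values())
mood = "positive" if score > 0 else "negative" if score < 0 else "neutral"
print(f"day score = {score:+d}  ->  {mood}")   # day score = +4  ->  positive
```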

Mental Health and Social Networks

For someone battling mental health issues like depression or bipolar disorder, this information could really help them get a handle on what’s going on inside their heads. It could also help bridge the gap between a patient and a clinician during a tele-health session, which is a story for another post.

A lot of work went into analyzing that kitten picture. Luo’s been working on sentiment analysis for some time. His algorithm, called Sentribute, does exactly as the name suggests: it uses attributes within an image to judge its sentiment. I won’t go into the minutiae of the operations (mostly because I tried really hard, but I really don’t understand any of them, I’m so sorry) but you can read his paper on it here.
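Sentribute’s real pipeline is in the paper; what follows is only my crude stand-in for the idea of attribute-based image sentiment, scoring a picture on two attributes I picked myself (brightness and warm color) with Pillow and NumPy. The thresholds and the file name are invented.

```python
# Crude stand-in for attribute-based image sentiment (NOT Sentribute itself):
# score an image on overall brightness and warm-vs-cool color balance.
import numpy as np
from PIL import Image

def toy_image_sentiment(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float) / 255.0
    brightness = rgb.mean()                        # 0 (dark) .. 1 (bright)
    warmth = (rgb[..., 0] - rgb[..., 2]).mean()    # red channel minus blue channel
    score = brightness - 0.5 + warmth              # hand-tuned, illustrative only
    return "positive" if score > 0.1 else "negative" if score < -0.1 else "neutral"

print(toy_image_sentiment("kitten.jpg"))  # hypothetical file
```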

But Luo doesn’t stop there. Not only can you use sentiment analysis to determine the mood of an individual, you can use it to monitor the mood of the country!

I’m not being hyperbolic. Stay with me.

Luo took his fancy-pants algorithm to Flickr and went through images of 2008 presidential candidates. He wanted to see if social (multi)media could be used to predict elections. His theory was that people will post pictures of the candidates they like and will vote for, and therefore the candidate with the greatest online showing would be the victor.

In a sense, each time an image or video is uploaded or viewed, it constitutes an implicit vote for (or against) the subject of the image. This vote carries along with it a rich set of associated data including time and (often) location information. By aggregating such votes across millions of Internet users, we reveal the wisdom that is embedded in social multimedia sites for social science applications such as politics, economics, and marketing.

During this process, he had to correct for things like sarcasm and attack ads, which is where the sentiment analysis comes in. Not all pictures of Obama are counted as “votes” per se, but flattering pictures of him (smiling, head up, arms raised, otherwise stereotypically strong and powerful) would be.
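Here is a made-up sketch of how the “implicit vote” tally might work: only positive-sentiment (flattering) photos count as votes, and each candidate’s share is simply votes divided by total votes, the same share arithmetic as the TUPD numbers quoted below. The photo records are invented for illustration.

```python
# Made-up example of counting only positive-sentiment photos as "votes."
photos = [
    {"candidate": "Obama",  "sentiment": "positive"},
    {"candidate": "Obama",  "sentiment": "negative"},   # attack ad -- not a vote
    {"candidate": "Obama",  "sentiment": "positive"},
    {"candidate": "McCain", "sentiment": "positive"},
    {"candidate": "McCain", "sentiment": "neutral"},
]

votes = {}
for p in photos:
    if p["sentiment"] == "positive":                 # only flattering images count
        votes[p["candidate"]] = votes.get(p["candidate"], 0) + 1

total = sum(votes.values())
for name, v in votes.items():
    print(f"{name}: {v}/{total} = {v / total:.1%}")  # e.g. Obama: 2/3 = 66.7%
```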


The results? Flickr users correctly predicted the outcomes of both the primary and general elections of 2008, sometimes with an accuracy so spot-on that I feel it couldn’t be recreated if you tried.

An interesting finding: Obama won the presidential election on 11/4/2008 by capturing 52.9% of the popular vote [19]. The TUPD score for Obama and McCain on the same day is 642 (Obama) vs. 571 (McCain) [F], with Obama getting 642/(642 + 571) = 52.9%, exactly the same as the real voting result.

You may be asking yourself, Would this experiment have been as accurate if it didn’t feature Barack Obama, “The Social Media President?” But Luo’s research also determined that Flickr photos accurately predicted the Iowa caucuses, the New Hampshire Primary, and the Michigan primary wins for Huckabee, McCain, and Romney, respectively.

On this kind of macro-level, Luo thinks this research can lead to a greater understanding of the wisdom of crowds. He also thinks it can be used to understand economics and marketing in a new way. (His paper includes analysis of Apple product sales through Flickr shares.)

But on a smaller scale, Luo’s research can give us all kinds of insights about the pictures we choose to post and share. Maybe your decision to use a particular Instagram filter is a warning sign that you’re on the edge of a depressive episode, prompting you to get help before it becomes an issue. It could help us understand how we deal with contagious emotions within our personal networks. As Luo told me, the applications are endless!

Sentiment analysis is kind of blowing my mind right now, in part because recognizing emotions is so embedded in my understanding of the human experience. I’ll categorize this under “turning computers into sentient beings.”

Also, I got through this entire post without stating the obvious “A PICTURE IS WORTH A THOUSAND WORDS” joke (until just now).

V

The Role of Narrative

Zoomorphic Calligraphy

It happens when you least expect it. One minute, you’re asking an anti-terrorism expert about global Islamic extremism and Islamophobia in the media. The next, you’re discussing identity and empathy through storytelling, and the importance of individual and cultural narratives. I love my job.

Mark Concordia is a professor of criminal justice at Roberts Wesleyan College.

Total Front Bottoms Obsession

Forget regular old blogging today, I’m experimenting with different stuff because it’s summer and why the hell not?

Paying tribute to one of my favorite musical acts, I made a podcast about the Front Bottoms and how awesome they are.  It’s got some music in it, and me geeking out about a local band with a lot of spunk.  Check it out here.

More to come (maybe?)!

Veronica

TL;DR: Internet Use and “ADD”


Where’s my soapbox? Oh, there it is. Ok:

We all have some kind of ADD now. We all need to stop being mean to each other about it. And we all need to find ways to turn it around because it’s a total bummer and I don’t want my generation to turn into a bunch of over-medicated zombie people.

The DSM-5 lists the following criteria for the behavioral disorder known as Attention Deficit (Hyperactivity) Disorder:

  1. Often does not give close attention to details or makes careless mistakes in schoolwork, work, or other activities (e.g., overlooks or misses details, work is inaccurate)
  2. Often has difficulty sustaining attention in tasks or play activities (e.g., has difficulty remaining focused during lectures, conversations, or reading lengthy writings).
  3. Often does not seem to listen when spoken to directly (e.g., mind seems elsewhere, even in the absence of any obvious distraction)
  4. Often does not follow through on instructions and fails to finish schoolwork, chores, or duties in the workplace (e.g., starts tasks but quickly loses focus and is easily sidetracked; fails to finish schoolwork, household chores, or tasks in the workplace)
  5. Often has difficulty organizing tasks and activities (e.g., difficulty managing sequential tasks; difficulty keeping materials and belongings in order; messy, disorganized work; poor time management; tends to fail to meet deadlines)
  6. Often avoids, dislikes, or is reluctant to engage in tasks that require sustained mental effort (e.g., schoolwork or homework; for older adolescents and adults, preparing reports, completing forms, or reviewing lengthy papers).
  7. Often loses things needed for tasks and activities (e.g., school materials, pencils, books, tools, wallets, keys, paperwork, eyeglasses, or mobile telephones)
  8. Is often easily distracted by extraneous stimuli (for older adolescents and adults, may include unrelated thoughts).
  9. Is often forgetful in daily activities (e.g., chores, running errands; for older adolescents and adults, returning calls, paying bills, keeping appointments)

Of the nine criteria listed above, a patient need display six or more for at least six months to be diagnosed with ADD. The New York Times reports a few different statistics on the recent rise of ADD, including that 11% of school-age children and 1 in 5 high-school-age boys have been diagnosed.

There is a plethora of opinions about why this problem, perceived or real, is on the rise. Some think a reduced stigma around psychotherapy encourages more people to seek help, leading to an influx of diagnoses. Readers of Allen Frances’ Saving Normal might say that drug-pushing pharmaceutical companies are to blame. It is also not uncommon for those on medication to be accused of faking. And there are a few hippie weirdos out there who point to Food Inc. All of these theories have their own level of credibility (some more than others), but I’m going to ditch them all and make a totally non-credible undergraduate-level hypothesis based on surface-level investigation, Google, personal experience, and one book I read one time. I took Psych 101 in community college, so I’m going to make a sweeping diagnosis of everyone who will ever read this page. Stay tuned.

Personal Experience

I always thought the people I knew who were diagnosed with Adult ADD were gaming the system; thought they were trying to get one over on their doctors so they could have a leg-up during finals week or write a term paper in one weekend. Once my friends entered the work force, some continued taking ADD meds to set the world on fire, and my opinion shifted. I thought, Why should it be easier for them when it’s so hard for the rest of us? Because, in my own experience, it is hard for the rest of us. I could continue to blame people for exploiting a disorder to function more productively, or I could embrace that maybe we all have it, to varying degrees.

Science!

I’m not good with all the science-y stuff, but luckily I don’t have to be. Nicholas Carr wrote a little book called The Shallows that lays it all out and, besides being a little gratuitously repetitious, it’s a good read. Over-simplification time: The book makes the case that increased exposure to the Internet is fundamentally changing the way our brains work. He explains in detail the theory of plasticity in our neurons and brain patterns, and their susceptibility to change throughout our lives due to the habits we form. In the now-outdated print culture, experience with books and other print media created a population of people capable of more prolonged attention (generally speaking). Reading was good exercise for the brain, like little brainy push-ups, keeping it focused in a linear way on a specific task, and making prolonged attention to other tasks (work, homework, life stuff) easier.

Television came in and stirred the pot a little, but the Internet is what screwed it all up for everybody. In the same way reading trained the brain to pay attention, interacting with the Internet trains the brain to follow tangents. Hyperlinks, multiple tabs, and second screen interaction all contribute to this phenomenon in the same way. If I haven’t lost you yet, maybe you’re beginning to see the problem. Because our minds are reshaping to acclimate to the tools we use (computers, smart phones, etc.), we are suffering from attention problems and memory loss. You can’t hyperlink your way through real life if you want to have successful relationships and careers.

In his article for the Huffington Post, Dr. Mark Goulston says, “When your thinking is interrupted by your brain, you’ve got real ADD. When it’s interrupted by the world, you just have trouble saying ‘no.’” I say nay, Dr. Goulston. More specifically, I say that since we’ve trained our brains out of the ability to say “no,” your distinction is imperceptible. Using words like “pseudo-ADD” does not change the experience of people who live in a world that demands their attention while their minds simultaneously demand distraction.

Patronizing Each Other is Getting Us Nowhere

In researching this post I came across a multitude of articles, not unlike Mark Goulston’s, that take a decidedly condescending tone when talking about “ADD fakers” (although they fail to accurately define the difference between them and their non-faking counterparts). I was interested, though unsurprised, to see that the authors of such articles attributed attention problems to shortcomings on the part of the spacey person. Lori Day writes, rather facetiously, “I know, I know, it’s all to be blamed on our fast-paced, screen media-saturated, multi-tasking lifestyles. We can’t help it. There’s just too much going on.” Well, excuse me, Miss Day. I didn’t mean to bore you.

My guess is, this harsh attitude is the result of two completely separate viewpoints. Theory Number One: People are frustrated with their own attention deficits but don’t use a label like “ADD” to excuse their behavior, so they look down on those who they think do. Theory Number Two: Neuroplasticity may account for the ongoing change of our brains, but lots of habits (good and bad) are formed in childhood. These attention problems may be generational, with younger people facing more obstacles to their productivity because they did not develop more focused habits as kids (they were too busy with other stuff), while their older counterparts may be more disciplined and not so understanding.

Get over it.

Stop implying that people are faking having ADD, stop saying things like “pseudo-ADD,” and listen (look?) up. We need to drop this whole label, non-label nonsense and get to the crux of the issue. Unless we all throw down our digital devices in a show of solidarity (hahahaha), this is the new normal. We need to find ways to combat the attention problems that affect our careers and personal lives, instead of being condescending to each other or, worse, taking meds for it. This is my one and only disclaimer for this post: I can’t say that meds don’t help anybody, because maybe they do. But then again, I don’t think I know anybody with a form of ADD who wouldn’t benefit from some alternative treatment.

Concentration and extended focus are important for obvious reasons.  The ability to pay attention leads to productivity at school and in the workplace (read: more $$$).  But there are more important things at stake here.  As a society we are moving rapidly toward a highly digitized world, and shorter attention spans are only the tip of the iceberg.  Institutions founded on the values of print culture are suffering, becoming more obsolete and functioning poorly.  It takes a certain kind of mind to contemplate complex problems like a broken political structure, a failing education system, and unsustainable energy consumption across the globe.  We all need to strive for this certain kind of mind, before we lose ourselves to the distractions completely.

I have a point, I promise. I read an awesome article on Lifehacker about rebuilding your attention span. After a few clicks I found a bunch more articles that deal with the same issue. We need to combat the impact the Internet is having on our brains – making us inattentive, forgetful, anxious, unmotivated, and unproductive – in order to lead happier, healthier lives. We can do this. But forming healthier habits isn’t going to happen overnight, and although I think it’s possible, it would definitely require a lot of effort, and…

Oh, look, a video.

Everybody Hates Gatsby

Wanna-be high-brow media critics across the world wide web are lining up to put their stamp of disapproval on Luhrmann’s Gatsby adaptation.  It has officially become the “cool” thing to do.  But they’re being too harsh.


Allow me to start with this: it is almost impossible for a film like The Great Gatsby to live up to its literary counterpart.  Impossible.  Not that there has never been a successful film adaptation of any book ever, just that it couldn’t be this one. If you’re like me, you read F. Scott Fitzgerald’s novel at a pivotal point in your life (somewhere between 15 and 25, maybe?) and, in doing so, you run the risk of succumbing to nostalgia.

No fantastical cinematic achievement can live up to the emotion we may have felt at glimpsing into the tragic lives of these kindred spirits at a time when the hormones within us were raging out of control.  For me, at least, the idealism, the gluttony, the phoniness, and the heartache all found good company among the inner turmoil I was experiencing for myself as an oppressed, middle-class teenager.  (Sigh. Such hard times, they were.)

Here is what some people are saying:

  • “[Luhrmann’s movies] revel in surface, spectacle and sensory overload. They’re audaciously, passionately artificial and at the same time unabashedly romantic — post-modern pop medleys aimed at the heart, not the brain.” – Tom Charity, CNN.com
  • “The movie feels bloated, with a few too many scenes of speeding cars careening through the streets and pointless musical segues meant to reflect the carefree attitude of the time.” – Connie Ogle, The Miami Herald
  • “His colors are as bright as those in a detergent commercial; his musical choices as intrusive as the exit cues on an awards show. The camera ducks and swerves like O.J. Simpson on his way to a car rental, and the cast all share a slightly vibratory, methamphetamine sheen.” – Christopher Orr, The Atlantic

There, there, darling. They don’t mean it.

You get the idea.

It appears that the main problem these critics, and many, many others, have with the film is everything that makes it perfect for its medium.  The book is bound to be more introspective and intellectual by the sheer nature of its form.  A movie made in 2013 cannot be blamed for using the tools at its disposal to make the story as visually stimulating as possible.  Gatsby definitely becomes a spectacle, with swinging camera shots, dazzling colors, sensational parties, and fantastic wardrobes on beautiful people, all shot in 3D with an electrically-charged soundtrack to enhance it.  But instead of hating it for all that it is not, we should celebrate Gatsby for all that it is.

This is not to say that The Great Gatsby is without flaws.  But I forgive the movie these errors because of its loyalty to the original story and its beautiful delivery.  You should too.  Because, if at times the glitz and glamour all seem a little self-indulgent and ultimately empty, well, now you know how Gatsby must have felt.

“Bros” Before Other Webshows

See what I did there?

Hello, strangers! I have come out of hiding to tell you about this awesome YouTube video that you have to watch yesterday:


(I heard about this almost-web series from some staff at school, because Creator/Director Anthony DiMieri went to Fordham – Go Rams! – and we take care of our own.)

This little vid is done surprisingly well, with professional production quality and great comedic timing that you might not expect from a bunch of bros. My favorite part, though, has to be post-transformation (bro + hipster = broster), when they all get out at the Bedford stop on the L. LCD Soundsystem’s “North American Scum” plays over a slow-mo of close-ups on jorts, STOP KONY tees, suspenders, and Buddy Holly glasses. I died.

Although it’s meant to be a “Girls” parody (the first scene should be familiar to Dunham’s loyal fanbase), it’s also a parody of a few social stereotypes.  The “Bros” demonstrate blatant homophobia and misogyny, while their hip alter egos talk about basement gin distilleries and warehouse parties. It’s the kind of funny you find when nobody is safe from sarcastic mockery and some playful generalizing. Kudos.