Tuesday, November 17, 2015

ARGUMENTATIVE WRITING ASSIGNMENT: “SHUT DOWN YOUR SCREEN WEEK”

Grades 6-12, Prompt for Argument Writing
Common Core Standard W.CCR.1

A group of parents and teachers at your school has made a proposal to the school board suggesting that the school join a national movement called “Shut Down Your Screen Week.” The group believes that going without any electronic media for an entire week would be good for students for many reasons.

They have taken the proposal to a teachers’ meeting so that the teachers can discuss whether to ask their students to participate in “Shut Down Your Screen Week.” The teachers have decided they would like to hear from the students before making a decision.

This is not a simple issue, so you need to think very carefully about it. You have three texts to read on the issue: “Social Media as Community,” “Is Google Making Us Stupid?” and “Attached to Technology and Paying a Price.” As you read and re-read these texts, think about what they show you, what position you will take, and what evidence you will use to support your thinking.

Finally, write an essay, in the form of a letter to the teachers, explaining your thinking.

For the essay, your Focusing Question is:

Should your school participate in the national “Shut Down Your Screen Week”? Be sure to use evidence from the texts, as well as your own knowledge, to support and develop your thinking.

Remember, a strong and effective piece of argument writing:
·       Takes the audience into account
·       Has a clear introduction
·       States a focus/position statement clearly, precisely, and thoughtfully
·       Uses specific evidence from the text(s) to support and develop the position, and explains that evidence logically
·       Anticipates what readers who disagree might think and responds to those counterarguments
·       Concludes effectively
·       Uses precise language
·       Shows control over conventions

You will have three class periods to complete this reading/thinking/writing task. Because the essay will have a single draft, you may want to take some time to plan your writing before you begin. When you have finished, be sure to proofread.




Social Media as Community
By Keith Hampton
Keith Hampton is an associate professor in the School of Communication and Information at Rutgers, and a past chairman of the American Sociological Association’s section on Communication and Information Technologies. 
Updated June 18, 2012 | New York Times, Opinion Pages (excerpt)
Neither living alone nor using social media is socially isolating. In 2011, I was lead author of an article in Information, Communication & Society that found, based on a representative survey of 2,500 Americans, that regardless of whether the participants were married or single, those who used social media had more close confidants.
A recent follow-up study, “Social Networking Sites and Our Lives” (Pew Research Center), found that the average user of a social networking site had more close ties than the average American and was half as likely to be socially isolated. Additionally, in another article published in New Media & Society, my co-authors and I found not only that social media users knew people from a greater variety of backgrounds, but also that much of this diversity came from people who used these technologies while also spending an impressive amount of time socializing outside the house.
A number of studies, including my own and those of Matthew Brashears (a sociologist at Cornell), have found that Americans have fewer intimate relationships today than 20 years ago. However, a loss of close friends does not mean a loss of support. Because of cellphones and social media, those we depend on are more accessible today than at any point since we lived in small, village-like settlements.
Social media has made every relationship persistent and pervasive. We no longer lose social ties over our lives; we have Facebook friends forever. The constant feed of status updates and digital photos from our online social circles is the modern front porch. This is why, in “Social Networking Sites and Our Lives,” there was a clear trend for those who used these technologies to receive more social support than other people.
The data backs it up. There is little evidence that social media is responsible for a trend of isolation, or a loss of intimacy and social support.
Used by permission of The New York Times.

Is Google Making Us Stupid?

YES
Who doesn't love Google? In the blink of an eye, the search engine delivers useful information about pretty much any subject imaginable. I use it all the time, and I'm guessing you do too.
But I worry about what Google is doing to our brains. What really makes us intelligent isn't our ability to find lots of information quickly. It's our ability to think deeply about that information. And deep thinking, brain scientists have discovered, happens only when our minds are calm and attentive. The greater our concentration, the richer our thoughts.
If we're distracted, we understand less, remember less, and learn less.
That's the problem with Google—and with the Internet in general. When we use our computers and our cellphones all the time, we're always distracted.
The Net bombards us with messages and other bits of data, and every one of those interruptions breaks our train of thought. We end up scatterbrained. The fact is, you'll never think deeply if you're always Googling, texting, and surfing.
Google doesn't want us to slow down. The faster we zip across the Web, clicking links and skimming words and pictures, the more ads Google is able to show us and the more money it makes. So even as Google is giving us all that useful information, it's also encouraging us to think superficially. It's making us shallow.
If you're really interested in developing your mind, you should turn off your computer and your cellphone—and start thinking. Really thinking. You can Google all the facts you want, but you'll never Google your way to brilliance.
Nicholas Carr, Author
The Shallows: What the Internet Is Doing to Our Brains

NO
Any new information technology has both advocates and critics. More than 2,000 years ago, the classical Greek philosopher Socrates complained that the new technology of writing "will create forgetfulness in the learners' souls because they will not use their memories."
Today, Google is the new technology. The Internet contains the world's best writing, images, and ideas; Google lets us find the relevant pieces instantly.
Suppose I'm interested in the guidance computers on Apollo spacecraft in the 1960s. My local library has no books on that specific subject—just 18 books about the Apollo missions in general. I could hunt through those or turn to Google, which returns 45,000 pages, including a definitive encyclopedia article and instructions for building a unit.
Just as a car allows us to move faster and a telescope lets us see farther, access to the Internet's information lets us think better and faster. By considering a wide range of information, we can arrive at more creative and informed solutions. Internet users are more likely to be exposed to a diversity of ideas. In politics, for example, they are likely to see ideas from left and right, and see how news is reported in other countries.
There's no doubt the Internet can create distractions. But 81 percent of experts polled by the Pew Internet Research Project say the opportunities outweigh the distractions.
Socrates was wrong to fear the coming of the written word: Writing has improved our law, science, arts, culture, and our memory. When the history of our current age is written, it will say that Google has made us smarter—both individually and collectively—because we have ready and free access to information.
Peter Norvig, Director of Research
Google Inc.
Used by permission of The New York Times Upfront (Vol. 143, October 4, 2010).





Attached to Technology and Paying a Price
By Matt Richtel, New York Times, June 6, 2010
SAN FRANCISCO — When one of the most important e-mail messages of his life landed in his in-box a few years ago, Kord Campbell overlooked it.  Not just for a day or two, but 12 days. He finally saw it while sifting through old messages: a big company wanted to buy his Internet start-up.
The message had slipped by him amid an electronic flood: two computer screens alive with e-mail, instant messages, online chats, a Web browser and the computer code he was writing.  While he managed to salvage the $1.3 million deal after apologizing to his suitor, Mr. Campbell continues to struggle with the effects of the deluge of data. Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.
This is your brain on computers.
Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.  These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored.
The resulting distractions can have deadly consequences, as when cellphone-wielding drivers and train engineers cause wrecks. And for millions of people like Mr. Campbell, these urges can inflict nicks and cuts on creativity and deep thought, interrupting work and family life.
While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.  And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.
“The technology is rewiring our brains,” said Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists. She and other researchers compare the lure of digital stimulation less to that of drugs and alcohol than to food and sex, which are essential but counterproductive in excess.
Technology use can benefit the brain in some ways, researchers say. Imaging studies show the brains of Internet users become more efficient at finding information. And players of some video games develop better visual acuity.
More broadly, cellphones and computers have transformed life. They let people escape their cubicles and work anywhere. They shrink distances and handle countless mundane tasks, freeing up time for more exciting pursuits.
For better or worse, the consumption of media, as varied as e-mail and TV, has exploded. In 2008, people consumed three times as much information each day as they did in 1960. And they are constantly shifting their attention. Computer users at work change windows or check e-mail or other programs nearly 37 times an hour, new research shows.
The nonstop interactivity is one of the most significant shifts ever in the human environment, said Adam Gazzaley, a neuroscientist at the University of California, San Francisco.
“We are exposing our brains to an environment and asking them to do things we weren’t necessarily evolved to do,” he said. “We know already there are consequences.”
Mr. Campbell, 43, came of age with the personal computer, and he is a heavier user of technology than most. But researchers say the habits and struggles of Mr. Campbell and his family typify what many experience — and what many more will, if trends continue.  For him, the tensions feel increasingly acute, and the effects harder to shake.
Always On
Mr. Campbell, whose given name is Thomas, had an early start with technology in Oklahoma City. When he was in third grade, his parents bought him Pong, a video game. Then came a string of game consoles and PCs, which he learned to program.
Mr. Campbell loves the rush of modern life and keeping up with the latest information. “I want to be the first to hear when the aliens land,” he said, laughing. But other times, he fantasizes about living in pioneer days when things moved more slowly: “I can’t keep everything in my head.”
No wonder. As he came of age, so did a new era of data and communication.  At home, people consume 12 hours of media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts as two hours. That compares with five hours in 1960, say researchers at the University of California, San Diego. Computer users visit an average of 40 Web sites a day, according to research by RescueTime, which offers time-management tools.
As computers have changed, so has the understanding of the human brain. Until 15 years ago, scientists thought the brain stopped developing after childhood. Now they understand that its neural networks continue to develop, influenced by things like learning skills.
So not long after Eyal Ophir arrived at Stanford in 2004, he wondered whether heavy multitasking might be leading to changes in a characteristic of the brain long thought immutable: that humans can process only a single stream of information at a time. He was startled by what he discovered.


The Myth of Multitasking
The test subjects were divided into two groups: those classified as heavy multitaskers based on their answers to questions about how they used technology, and those who were not.
In a test created by Mr. Ophir and his colleagues, subjects at a computer were briefly shown an image of red rectangles. Then they saw a similar image and were asked whether any of the rectangles had moved. It was a simple task until the addition of a twist: blue rectangles were added, and the subjects were told to ignore them.
The multitaskers then did a significantly worse job than the non-multitaskers at recognizing whether red rectangles had changed position. In other words, they had trouble filtering out the blue ones — the irrelevant information.
So, too, the multitaskers took longer than non-multitaskers to switch among tasks, like differentiating vowels from consonants and then odd from even numbers. The multitaskers were shown to be less efficient at juggling problems. Other tests at Stanford, an important center for research in this fast-growing field, showed multitaskers tended to search for new information rather than accept a reward for putting older, more valuable information to work.
Researchers say these findings point to an interesting dynamic: multitaskers seem more sensitive than non-multitaskers to incoming information.
The results also illustrate an age-old conflict in the brain, one that technology may be intensifying. A portion of the brain acts as a control tower, helping a person focus and set priorities. More primitive parts of the brain, like those that process sight and sound, demand that it pay attention to new information, bombarding the control tower when they are stimulated.
Researchers say there is an evolutionary rationale for the pressure this barrage puts on the brain. The lower-brain functions alert humans to danger, like a nearby lion, overriding goals like building a hut. In the modern world, the chime of incoming e-mail can override the goal of writing a business plan or playing catch with the children.
“Throughout evolutionary history, a big surprise would get everyone’s brain thinking,” said Clifford Nass, a communications professor at Stanford. “But we’ve got a large and growing group of people who think the slightest hint that something interesting might be going on is like catnip. They can’t ignore it.”
Melina Uncapher, a neurobiologist on the Stanford team, said she and other researchers were unsure whether the muddied multitaskers were simply prone to distraction and would have had trouble focusing in any era. But she added that the idea that information overload causes distraction was supported by more and more research.
A study at the University of California, Irvine, found that people interrupted by e-mail reported significantly increased stress compared with those left to focus. Stress hormones have been shown to reduce short-term memory, said Gary Small, a psychiatrist at the University of California, Los Angeles.
Preliminary research shows some people can more easily juggle multiple information streams. These “supertaskers” represent less than 3 percent of the population, according to scientists at the University of Utah.
Other research shows computer use has neurological advantages. In imaging studies, Dr. Small observed that Internet users showed greater brain activity than nonusers, suggesting they were growing their neural circuitry.
At the University of Rochester, researchers found that players of some fast-paced video games can track the movement of a third more objects on a screen than nonplayers. They say the games can improve reaction and the ability to pick out details amid clutter.
“In a sense, those games have a very strong both rehabilitative and educational power,” said the lead researcher, Daphne Bavelier, who is working with others in the field to channel these changes into real-world benefits like safer driving.
There is a vibrant debate among scientists over whether technology’s influence on behavior and the brain is good or bad, and how significant it is.  Mr. Ophir is loath to call the cognitive changes bad or good, though the impact on analysis and creativity worries him.
The Toll on Children
The Campbells, father and son, sit in armchairs. Controllers in hand, they engage in a fierce video game battle, displayed on the nearby flat-panel TV, as Connor’s younger sister, Lily, watches.
They are playing Super Smash Bros. Brawl, a cartoonish animated fight between characters that battle using anvils, explosives and other weapons.
“Kill him, Dad,” Lily screams. To no avail. Connor regularly beats his father, prompting expletives and, once, a thrown pillow. But there is bonding and mutual respect.
Screens big and small are central to the Campbell family’s leisure time. Connor and his mother relax while watching TV shows like “Heroes.” Lily has an iPod Touch, a portable DVD player and her own laptop, which she uses to watch videos, listen to music and play games.
Lily, a second-grader, is allowed only an hour a day of unstructured time, which she often spends with her devices. The laptop can consume her.
“When she’s on it, you can holler her name all day and she won’t hear,” Mrs. Campbell said.
Researchers worry that constant digital stimulation like this creates attention problems for children with brains that are still developing, who already struggle to set priorities and resist impulses.
Connor’s troubles started late last year. He could not focus on homework. No wonder, perhaps. On his bedroom desk sit two monitors, one with his music collection, one with Facebook and Reddit, a social site with news links that he and his father love. His iPhone enabled relentless texting with his girlfriend.
When he studied, “a little voice would be saying, ‘Look up’ at the computer, and I’d look up,” Connor said. “Normally, I’d say I want to only read for a few minutes, but I’d search every corner of Reddit and then check Facebook.”
His Web browsing informs him. “He’s a fact hound,” Mr. Campbell brags. “Connor is, other than programming, extremely technical. He’s 100 percent Internet savvy.”
No Vacations
For spring break, the family rented a cottage in Carmel, Calif. Mrs. Campbell hoped everyone would unplug. But the day before they left, the iPad from Apple came out, and Mr. Campbell snapped one up. The next night, their first on vacation, “We didn’t go out to dinner,” Mrs. Campbell mourned. “We just sat there on our devices.”
She rallied the troops the next day to the aquarium. Her husband joined them for a bit but then begged off to do e-mail on his phone. Later she found him playing video games.
On Thursday, their fourth day in Carmel, Mr. Campbell spent the day at the beach with his family. They flew a kite and played whiffle ball.  Connor unplugged too. “It changes the mood of everything when everybody is present,” Mrs. Campbell said.  The next day, the family drove home, and Mr. Campbell disappeared into his office.
Mr. Nass at Stanford thinks the ultimate risk of heavy technology use is that it diminishes empathy by limiting how much people engage with one another, even in the same room.
“The way we become more human is by paying attention to each other,” he said. “It shows how much you care.”
That empathy, Mr. Nass said, is essential to the human condition. “We are at an inflection point,” he said. “A significant fraction of people’s experiences are now fragmented.”

Used by permission of The New York Times.