
Tuesday, August 9, 2011

Facebook Study: Bad Students Chat, Good Ones RSVP [EXCLUSIVE]

In the debate over whether social media has a positive or negative effect on education, a new study to be published in Computers & Education has made a refreshing suggestion: Neither. It depends how you use it.
The survey of 2,368 university students looked at how 14 specific Facebook behaviors, such as commenting on content, playing games, and posting photos, correlated with student engagement on campus and with time spent studying. It found that these specific behaviors were stronger predictors of academic outcomes than blunter measurements like total time spent on Facebook.
Playing games on Facebook, for instance, correlated with low scores on a 19-question version of the National Survey of Student Engagement, which measures both participation in campus activities and in the classroom. Creating or RSVPing to Facebook events, on the other hand, correlated with high scores on the same assessment. And although the study found no relationship between time spent on Facebook in general and time spent studying, it did find a negative correlation between Facebook chatting and time spent studying.
This doesn’t necessarily mean that banning Facebook chat will solve a student’s studying challenges.
“We can’t tell the direction of that correlation,” explains Reynol Junco, the author of the study. “Do [some Facebook activities] cause more involvement or does the involvement cause more Facebook?”
More clear is that how Facebook is used, rather than how much, is important in understanding its relationship to education.
“There are still a lot of faculty who feel students using Facebook is bad,” Junco says. “And there are clearly data that show that yes, there are some ways in which it is not good…[But faculty] should be thinking about ways, if not using Facebook in our classes, of helping students use Facebook in some ways that may be more beneficial to their academic outcomes.”

Friday, July 29, 2011

iPad To Dominate Tablet Market Until 2015 [STUDY]

The iPad will continue to dominate the growing tablet market through 2015, after which Android tablets will take over, according to forecasts from Informa Telecoms & Media.
The study estimates that Apple currently owns about 75% of the tablet market, which is expected to expand from less than 20 million units in 2010 to more than 230 million in 2015. By then, Apple’s share of the market will drop to just 38%, due largely to the proliferation of cheaper and more advanced Android tablets. Still, it will take until 2016 for Android tablets to outsell Apple’s.
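For a sense of what those shares mean in units, here is the back-of-envelope math (the shares and market totals are the article's rough figures; the arithmetic is ours):

```python
# Rough unit math behind the Informa forecast quoted above.
units_2010_m = 20        # "less than 20 million units in 2010"
units_2015_m = 230       # "more than 230 million in 2015"
apple_share_now = 0.75   # ~75% of the market today
apple_share_2015 = 0.38  # forecast share in 2015

apple_units_now_m = units_2010_m * apple_share_now    # ~15M units
apple_units_2015_m = units_2015_m * apple_share_2015  # ~87M units

# Apple's share halves, yet its unit volume grows roughly sixfold,
# because the overall market expands more than tenfold.
print(apple_units_now_m, apple_units_2015_m)
```

In other words, a shrinking share of a rapidly expanding market can still mean far more iPads sold.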
The study is less optimistic about RIM’s prospects, though sales of the PlayBook are expected to improve once it supports Android apps and 4G connectivity (HSPA+ and LTE versions are expected before the end of 2011).
Other analysts are more bullish on Apple’s market dominance. A study released by Gartner in April predicted the iPad will maintain a 47% share of the tablet market by 2015, with Android hovering just above 38%.
Who will win the tablet wars, and how long will it take? Let us know your thoughts in the comments.

New Study: Cellphones Don’t Cause Cancer

In the battle of the cellphone cancer studies, now we can chalk one up for the scientists who say there is no relationship between cellphones and cancer in children.
A 1,000-participant study published in the Journal of the National Cancer Institute examined the effects of cellphones on children and adolescents, comparing the cellphone usage of a group diagnosed with brain tumors against that of a healthy, cellphone-using control group.
The result? “The absence of an exposure-response relationship either in terms of the amount of mobile phone use or by localization of the brain tumor argues against a causal association,” concludes the study abstract.
This is the latest in a long line of extensive research aimed at finding the truth about whether cellphones cause cancer or not. The three most recent studies:
  • The World Health Organization’s International Agency for Research on Cancer (IARC) announced the results of its cellphones/cancer study in May of this year, calling cellphones “possibly carcinogenic to humans.”
  • A study published in February found that cellphone use caused increased activity in certain parts of the brain, but couldn’t determine if those effects were causing any harm, or even if they were beneficial.
  • Last year, a less-credible study that was partially funded by the wireless industry found no evidence of increased risk of brain tumors associated with mobile phones. But the scientists behind that study acknowledged that the results weren’t definitive.
What’s a cellphone user to do? If you’re still worried about an unknown/unseen mechanism at work here, just join the teens of the world and text a lot more than you talk on your cellphone.

Tuesday, July 19, 2011

Are social media creating the laziest generation?

(CNN) -- We are the reality-show generation. Instead of doing, we watch: We watch people sing, dance with B-level stars, fist pump, pawn stuff, pick a husband/wife, get extreme makeovers to their homes and faces, be "real" housewives, keep up, lose weight, go to rehab, get fired, survive.

And the voyeuristic nature has spilled into everyday life. Thanks to Facebook, Twitter, Google+, YouTube and a host of other social media platforms, all the world's now truly a stage, and we are all players in the reality show of life -- either as the "stars" or as the self-appointed pundits.

Observing events and then commenting about them on social media has become our national religion. We anxiously wait for the next celebrity to screw up, another politician to be caught in a sex scandal, the verdict in an even higher profile murder trial or simply a friend to do something stupid so that we can quickly begin worshipping at the altar of the social media platform of our choice to offer our (or read others') opinions, jokes, jibes and the occasional insightful thought.

In the past, people would recount where they were when an historic event occurred such as the Kennedy assassination, the space shuttle exploding or the 9/11 attacks. In the future, we will instead recall what we tweeted, posted or read on social media platforms about such events.

Yes, it's fun, and I'm guilty of doing the same thing. But here is my growing concern: Are we becoming the laziest generation?

Is social media becoming our opiate of the masses, seducing us into slacktivism and into believing that simply because we make a cyber comment, we are somehow actually affecting our world? Will our generation leave a lasting legacy or just millions of snarky tweets?

Look at the preceding generations: In the 1940s and '50s, there was the "Greatest Generation," a generation of doers, not watchers, who through their dedication, work ethic and sacrifice, built our nation into an economic superpower.

They were followed in the '60s and early '70s by a generation that took to the streets to oppose the Vietnam War and press for civil rights, causing American policy to change on both the foreign policy and domestic fronts.

In contrast to their activism, many of us are only engaged in slacktivism: clicking "Like" on Facebook, digitally signing an online petition or retweeting someone else's thoughts on Twitter.

That is a good start, but it will take more than that to cause meaningful change. We need to look no further than the recent democratic revolutions in the Arab world for guidance. Protesters there utilized social media, but they didn't just post comments on people's Facebook walls and sit back; they then took to the streets and risked life and limb to effectuate change. All the tweets in the world would not have driven the presidents of Egypt or Tunisia from their offices.

What is more likely to get our generation into the streets to protest: a political issue or Facebook imposing a fee to use it?

I know that in today's increasingly complex and challenging world it seems that one individual can't have an impact on the issues facing our country or our planet, but you can, and if you choose to, you will.

As Robert F. Kennedy's inspirational yet realistic statement tells us: "Few will have the greatness to bend history itself; but each of us can work to change a small portion of events, and in the total of all those acts will be written the history of this generation."

I'm not in any way advocating that we stop using social media -- in fact, please follow me on Twitter or add me on Facebook and Google+ -- but if there is an issue you really want to make a difference on, it will take more than a tweet of 140 characters or updating your Facebook status to do it.

Monday, July 18, 2011

Internet's memory effects quantified in computer study

By Jason Palmer

Computers and the internet are changing the nature of our memory, research in the journal Science suggests.
Psychology experiments showed that people presented with difficult questions began to think of computers.
When participants knew that facts would be available on a computer later, they had poor recall of answers but enhanced recall of where they were stored.
The researchers say the internet acts as a "transactive memory" that we depend upon to remember for us.
Lead author Betsy Sparrow of Columbia University said that transactive memory "is an idea that there are external memory sources - really storage places that exist in other people".
"There are people who are experts in certain things and we allow them to be, [to] make them responsible for certain kinds of information," she explained to BBC News.
Co-author of the paper Daniel Wegner, now at Harvard University, first proposed the transactive memory concept in a book chapter titled Cognitive Interdependence in Close Relationships, finding that long-term couples relied on each other to act as one another's memory banks.
"I really think the internet has become a form of this transactive memory, and I wanted to test it," said Dr Sparrow.
Where, not what
The first part of the team's research was to test whether subjects were "primed" to think about computers and the internet when presented with difficult questions. To do that, the team used what is known as a modified Stroop test.
The standard Stroop test measures how long it takes a participant to read a colour word when the word itself is a different colour - for example, the word "green" written in blue.

Reaction times increase when, instead of colour words, participants are asked to read words about topics they may already be thinking about.
In this way the team showed that, after presenting subjects with tough true/false questions, reaction times to internet-related terms were markedly longer, suggesting that when participants did not know the answer, they were already considering the idea of obtaining it using a computer.
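At its core, the modified Stroop analysis compares mean reaction times across word categories. A toy sketch with invented reaction times (not the study's data):

```python
# Toy illustration of Stroop-style interference: slower reactions to
# internet-related words after hard questions suggest participants
# were already thinking about computers. Numbers are made up.
from statistics import mean

# Hypothetical per-word reading reaction times, in milliseconds.
neutral_words_ms = [610, 595, 630, 605, 620, 600]
internet_words_ms = [680, 655, 700, 665, 690, 672]

interference_ms = mean(internet_words_ms) - mean(neutral_words_ms)
print(f"Mean slowdown on internet terms: {interference_ms:.1f} ms")
```

A markedly positive difference is the "longer reaction times" the researchers report; the real study would of course also test whether the gap is statistically significant.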
A more telling experiment provided a stream of facts to participants, with half told to file them away in a number of "folders" on a computer, and half told that the facts would be erased.
When asked to remember the facts, those who knew the information would not be available later performed significantly better than those who filed the information away.
But those who expected the information would be available were remarkably good at remembering in which folder they had stored the information.
“This suggests that for the things we can find online, we tend to keep it online as far as memory is concerned - we keep it externally stored,” Dr Sparrow said.
She explained that the propensity of participants to remember the location of the information, rather than the information itself, is a sign that people are not becoming less able to remember things, but simply organising vast amounts of available information in a more accessible way.
"I don't think Google is making us stupid - we're just changing the way that we're remembering things... If you can find stuff online even while you're walking down the street these days, then the skill to have, the thing to remember, is where to go to find the information. It's just like it would be with people - the skill to have is to remember who to go see about [particular topics]."

Monday, July 11, 2011

More U.S. Adults Own a Smartphone Than Have a Degree

More Americans own smartphones than hold a bachelor’s degree or speak another language in their homes, according to a Pew Internet Project report released Monday.
In a telephone survey, 83% of respondents said that they owned a cellphone of some kind and 35% of the 2,277 U.S. adults questioned in English or Spanish said that they owned a smartphone.
Not surprisingly, wealthy, well-educated and young respondents all had high levels of smartphone ownership. More interestingly, African-Americans and Latinos in the survey were also more likely to own smartphones than whites. But just about everyone who owned a smartphone was likely to use that phone to access the Internet.
Nearly nine in 10 smartphone owners (87%) use their phones as Internet portals, and about 78% of those do so every day. Nearly a third of smartphone owners use their device as their primary Internet connection.
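As a back-of-envelope check on what those nested percentages mean for the population as a whole (the percentages are Pew's; the arithmetic is ours):

```python
# Chaining the Pew percentages reported above.
respondents = 2277
smartphone_share = 0.35  # 35% of adults surveyed own a smartphone
portal_share = 0.87      # 87% of owners go online on their phone
daily_share = 0.78       # 78% of those do so every day

smartphone_owners = round(respondents * smartphone_share)  # ~797 people

# Share of ALL adults who use a phone as an internet portal, and daily:
adults_online_via_phone = smartphone_share * portal_share  # ~30%
adults_daily = adults_online_via_phone * daily_share       # ~24%

print(smartphone_owners)
print(f"{adults_online_via_phone:.0%} of adults, {adults_daily:.0%} daily")
```

So roughly three in ten American adults go online by phone at all, and about one in four does so every day.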
With so many people relying on their phones for both verbal and digital communication, it’s no wonder the word cloud the researchers compiled to show respondents’ feelings toward their cellphones includes words like “necessary,” “convenient” and even, perhaps somewhat disturbingly, “love.”



Smartphone word cloud