By Quentin Fottrell
Published: Nov. 12, 2019, 10:31 a.m. EST
https://apple.news/AZ7WBqco0Rsq0g3eINpWCXw
"New research appears to validate what many parents and educators have long suspected
Instead of updating your status, why not improve your exam results?
Students whose grades are below average could boost their results if they devoted less time to Facebook and other social networking sites, according to research published Tuesday. The study, led by James Wakefield, a senior lecturer at the University of Technology Sydney, examined the time first-year university students spent on Facebook (FB) and how it impacted their grades.
Such students are likely already struggling with their ability to focus. “Time spent on social networking platforms puts lower academic achievers at higher risk of failing their course,” Wakefield said. “We found that if they used Facebook for three hours a day — not substantially higher than the average of just under two hours — the difference was around six marks in a 60-mark exam, or 10%.”
The research, published in the latest edition of Computers & Education, a peer-reviewed journal, and co-authored by Jessica Frawley, a lecturer at the University of Sydney, looked at university students studying STEM — science, technology, engineering and mathematics — and business degrees, but it is likely also relevant to high school students who use social media.
More than 500 students enrolled in a first-year class, “Introductory Accounting,” at an Australian university took part in the study; they had an average age of 19. The researchers controlled for other factors that might influence their achievement, including whether they were planning to major in accounting, as well as their age and gender.
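To make the design concrete, here is a minimal sketch, with entirely invented data and variable names, of how one might estimate the effect of daily Facebook hours on exam marks while controlling for age, gender, and intended major (an ordinary-least-squares setup typical of such studies; this is not the authors' code):

```python
# Hypothetical illustration of the study's design: regress exam marks on
# daily Facebook hours, controlling for age, gender, and intended major.
# The data, variable names, and effect size below are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500  # roughly the study's sample size
df = pd.DataFrame({
    "fb_hours": rng.gamma(2.0, 1.0, n),    # averages ~2 hours/day
    "age": rng.normal(19, 1.5, n),         # average age of 19
    "gender": rng.choice(["F", "M"], n),
    "accounting_major": rng.choice([0, 1], n),
})
# Assume each extra daily hour costs about 6 marks on a 60-mark exam,
# the effect size reported for below-average students, plus noise.
df["exam_marks"] = (42 - 6 * (df["fb_hours"] - 2)
                    + rng.normal(0, 5, n)).clip(0, 60)

model = smf.ols(
    "exam_marks ~ fb_hours + age + C(gender) + accounting_major", data=df
).fit()
print(model.params["fb_hours"])  # recovers roughly -6 marks per hour
```

With that assumed slope, the sketch reproduces the reported gap: a student at three hours a day scores about six marks, or 10% of a 60-mark exam, below one at the two-hour average.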
Facebook CEO Mark Zuckerberg faced a grilling on Capitol Hill last month from members of the House Financial Services Committee over his proposed cryptocurrency project Libra and the platform’s role in the 2016 U.S. presidential election. The #DeleteFacebook hashtag was trending on Twitter (TWTR) in the wake of that controversy.
Many consumers vowed to deactivate their accounts in 2018 after revelations that U.K.-based campaign strategy firm Cambridge Analytica used millions of Facebook users’ personal data without their permission. Some 44% of users between the ages of 18 and 29 deleted the Facebook app from their phone in the wake of the scandal, according to a survey by Pew Research Center.
In the aftermath of that scandal, all Facebook users received a message on Facebook called “Protecting Your Information,” laying out which third-party apps have access to your individual Facebook profile. (Zuckerberg also issued a mea culpa, and pledged to be more careful when vetting third-party apps, but said fixing the problem could take years.)
Last month, #DeleteFacebook was trending once more after a report from Politico that Zuckerberg held private meetings with conservative journalists and commentators over the summer. Despite these recent controversies, Facebook reported a 1.6% increase in active users in the third quarter from the previous quarter, bringing the global total to 2.45 billion monthly active users.
Healthcare & Big Data (pdf)
The Challenger space shuttle exploded shortly after liftoff, killing seven people. The disaster took place decades ago and was not caused by a virus. Nonetheless, it holds lessons for how we can make the best possible choices as we open and close during the pandemic.
Analogies are always imperfect (they limp along, as the saying goes), but they can help us know what to consider before making big decisions. As you read the following summary of the Challenger tragedy, please consider the too-risky choices that led to the deaths of those seven astronauts and the lessons those flawed judgments offer. Most vital and urgent, apply them to your future decisions about COVID-19.
Countless Decisions Everywhere, Good Ones Required
The October 2019 Global Health Security Index ranked the United States and the United Kingdom the two countries best prepared for a pandemic among all 195 assessed. By June 2020, those countries were again #1 and #2, but this time the criterion was “excess deaths” compared with the number expected in the absence of a pandemic.
Terrible decisions are causing these horrifying results. As the pandemic continues to challenge us, we need as many people as possible to make as many good choices, and as few bad ones, as possible.
With countless choices to make, most of us will make some good decisions and some bad ones. Making those decisions thoughtfully, as opposed to instinctively, is key to making as few mistakes as possible.
The Challenger Disaster
The essential lessons for decision-making from the Challenger disaster are psychological and behavioral. At Morton Thiokol, the most relevant contractor, some engineers and managers had worried for years about O-ring defects. Early on, engineers discovered evidence of flaws in the rubber gaskets that sealed the joints of the rocket's boosters. Still, the problem didn’t seem too severe to decision-makers, who deemed the risk acceptable.
Eventually, as O-ring problems appeared in seven of the nine 1985 shuttle launches, engineers and some managers came to believe strongly that O-ring failures could prove deadly. Further evidence convinced them that the most severe danger came in the coldest temperatures. Evidence kept mounting, and concerns kept growing. Engineers began a process of redesign. But the engineers’ warnings weren’t heeded by NASA officials.
On the day of the Challenger launch, the temperature was below freezing, far lower than at any previous launch. Ice all over the launching pad caused serious concern. An ice team worked all night to remove it, and the mission manager in Houston postponed the launch by an hour. The added time would allow the temperature to rise a bit, and the ice team could inspect Challenger again and clear it for launch.
The Morton Thiokol engineers were terrified, and their managers recommended that NASA—the customer—not launch. But NASA managers were under time pressure. They argued against cancellation, violated several mission rules, and decided to launch. Seventy-three seconds after liftoff, the Challenger exploded.
Lessons for Our Pandemic Choices
Every analogy is imperfect, but the Challenger offers vital lessons for making pandemic-related decisions:
Recognize your all-too-human biases. When confronting risk and uncertainty, we rarely make decisions that are as good as we think. For instance, the danger repeatedly described by Thiokol engineers and their managers seemed to be a more acceptable risk to NASA decision-makers than failure to meet the schedule.
I must acknowledge the advantage of 20/20 hindsight. The point is not about the NASA decision-makers so much as tendencies we all fall prey to. We often care more about speed than quality, which leads us to make fast decisions rather than good ones. A prime example with COVID-19 was reopening nonessential businesses prematurely.
Rely not on instinct but good thinking. We make many decisions based on automatic, thoughtless, fast-but-biased System 1 information processing. In contrast, System 2 processing takes more time and effort, is deliberative and thoughtful, and is likely to result in more rational choices, offering the greatest net benefit or least harm.
Instinct can work, but mainly if you have deep expertise and familiarity with the problem you face. More often than not, relying on instinct without slowing down to seek information and think produces less-than-optimal outcomes—in the case of COVID-19, unnecessary illness and avoidable deaths.
Consider the consequences of your instinctive choice and its opposite. Rationality requires thinking beyond your natural preference—for instance, attending a fun event—and considering at least one alternative (not going). It also requires thinking through the pros and cons of both options.
Crucially, some costs and benefits are immediate while others are long-term, and the differences can be immense. All else equal, immediate consequences influence us more than what might come later, presenting a conflict between what we want to do (now) versus what we should do (for the eventual best). We give in to temptation (overeating, overdrinking, skipping a workout or a class) because we enjoy that activity at the moment, but we regret the consequences later. “Want” versus “should” helps explain many unsafe pandemic choices.
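As a toy illustration of that conflict, consider a back-of-the-envelope calculation with invented numbers (a minimal sketch, not a model from this essay): an outing offers an immediate benefit, while its possible health cost is uncertain and delayed. Mentally discounting the delayed cost can flip the decision.

```python
# "Want" versus "should" with invented numbers: an immediate benefit
# weighed against an uncertain, delayed cost. All values are arbitrary units.
enjoy_now = 10.0        # immediate pleasure of attending the event (assumed)
p_harm = 0.05           # assumed probability the outing leads to illness
delayed_cost = 400.0    # assumed cost of that illness, felt only later

def value_of_attending(future_weight):
    """Net value of going when delayed costs are weighted by future_weight."""
    return enjoy_now - future_weight * p_harm * delayed_cost

print(value_of_attending(1.0))   # weighing the future fully: -10.0, stay home
print(value_of_attending(0.25))  # discounting the future: +5.0, we go anyway
```

The numbers are arbitrary, but the structure is the point: when the benefit is immediate and the cost is delayed, even modest discounting of the future makes “want” beat “should.”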
Consider ethical implications. Assessing benefits and risks can be a purely selfish calculus but does not need to be limited to concern for oneself. Ethics is a personal choice about whether and how to weigh the effects of your actions on others.
In addition to the short-term pleasure of choosing “want” over “should,” a person’s values can explain their willingness to take seemingly ill-advised risks. Defying safety recommendations sometimes stems from valuing individual freedom and not wanting others to tell us what to do.
We all value our freedoms; we all value our lives. What makes decisions with ethical implications difficult is that they require prioritizing one value over others. With pandemic decisions, free choice takes priority for some people, while a desire not to harm family, friends, and even strangers takes precedence for others.
Beat the false choice with third options. Even when feeling caught between a rock and a hard place, you usually can find more than two options. In the heat of the moment, the Challenger options seemed to be to launch or to cancel. But a viable and safe third option—possibly not even considered after the first delay—was to wait for a warmer day.
Sadly, some officials view the pandemic as a string of zero-sum, economy-versus-health decisions. But deciding when to go back to work—for instance, when your boss demands your return, and you need the money but are wary—might include additional options, such as allying with coworkers to initiate safer practices and negotiating best paths forward with the boss.
Framing a false choice of health versus economy already has caused big mistakes and will cause more; the two options are not separate and are not either/or. Helping the economy was an intended benefit of reopening businesses, but the early reopening hurt public health, which then made people wary of returning to stores and restaurants. We must solve the pandemic to bring back long-term prosperity; opening too soon, hiding pandemic facts, fudging the data, and conveying false optimism ignore the real dangers of the pandemic and short-circuit the economic recovery.
Know when to change course. NASA had put a lot of resources—sunk costs—into the launch, making it difficult psychologically to cancel. Public officials who reopened too early made mistakes, but some heeded the worsening virus data and reversed course sooner rather than later.
Beware overconfidence. More often than not, confidence is beneficial. But we all know that overconfidence and cockiness can cause bad decisions—not every time, but enough to suggest that you stop and reconsider before taking a dangerous plunge.
Extreme overconfidence is a feature of narcissism. Most decision-makers are not narcissists, but all should be wary of taking advice from one.
Don’t let hope block action. The Challenger decision-makers hoped the O-ring problem would go away. They hoped the engineers were wrong, and the redesign efforts would work. They hoped the ambient temperature might be high enough. We all have analogous hopes about the virus—including some public officials’ fantasies, stated and repeated with apparent confidence that some listeners buy.
Hope is not a plan, a decision, an action, or a solution. Effective action is the ultimate superpower.
Weigh data properly. In the Challenger story, the decision-makers gave heavy weight to the recent string of accident-free launches and to the morning’s temperature increase thanks to the postponement. The warmer temperature and previous launch successes were top-of-mind: memorable, salient, and useful for driving and defending the launch decision.
Put another way, we cherry-pick data that helps us rationalize our decisions, propping them up with inadequate, often purely psychological, defenses.
Avoid the Not Invented Here (NIH) bias. We value our own opinions and ideas more than those of other people. The NASA decision-makers did not accept urgent, informed advice from Thiokol engineers and managers. NIH is an acronym worth remembering, so you don't reject the best ideas simply because they weren’t your own or don’t fit with your instinctive biases.
That’s not a personal insult. NIH is a common human bias, like the others above. With knowledge and effort, we can override such preferences, make better choices, and obtain better outcomes.
While making progress, guard against backsliding. We made progress against COVID-19 and then backslid. We stopped being vigilant and, when we didn't immediately get sick ourselves, took more risks. As the axiom goes, two steps forward, one step back.
Moral licensing occurs when we give ourselves credit for doing the right thing and then slacken. It will hurt us badly in our battle against the virus. Without vigilance and a dose of grit, progress begets backsliding.
Conclusions
The Challenger decision-makers did not adequately weigh the dangers. Making a choice that doesn’t work out, even after thinking long and hard in risky, uncertain times, is understandable. So is making a mistake on a close call without helpful guidance from public officials. But not taking a pandemic seriously is inexcusable.
Read The Challenger Launch Decision by Diane Vaughan for a more thorough and authoritative account of the errors and biases that led to that disaster. For updates on the pandemic and the situation where you live, read and listen to the best informed and most thoughtful government executives, health experts, and community leaders.
Biases affect decision-makers and citizens everywhere. Make the most rational, safest choices for you, your friends, and family members, and people you don’t even know. Don’t forget the Challenger. We are all decision-makers.