READING PASSAGE Full Test
One consequence of the shifting expectations of work has been called the Great
Resignation: a widespread trend of workers leaving their jobs. A 2021 survey by Microsoft
of IT workers and others around the world found that forty-one percent of employees were
thinking of quitting in the coming year. During the pandemic, Americans quit their jobs in
extraordinary numbers: a record four million resigned in April 2021 alone. In Germany,
Europe’s biggest economy, more than a third of companies reported difficulty filling
vacancies. Furthermore, in the US, a record-setting 10.9 million jobs were still open the
following July. These statistics may have influenced the Microsoft study, which concluded
that “leaders are out of touch with employees and need a wakeup call.”
Employers in some industries needed that wake-up call more than others, as resignations
have not been evenly distributed. Restaurants, for example, were hit particularly hard
during the pandemic, with one million fewer workers in the field in 2021 than in 2020. US
fast-food places also struggled to find workers, despite some businesses offering up to
$19 an hour (over twice the US minimum wage) as well as bonuses. This suggests that
money is not the only thing motivating workers. After twenty-six years in the
industry, Jeremy Golembiewski quit his position as a general manager at a chain
restaurant in California. When COVID-19 hit, he was furloughed, giving him time to spend
with his family. This was something he greatly missed when restrictions were eventually
lifted, and he had to return to an understaffed restaurant that pushed him to work nearly
a sixty-hour workweek. Jeremy decided his family took precedence and that he should
seek a less demanding job in a different field—even if that job came at a lower position in
the company hierarchy.
Another industry that has seen a mass exodus of employees is the technology industry.
One survey found that a stunning forty percent of US tech workers have quit or intend to
quit by 2022. The COVID lockdowns demonstrated that many jobs could be done
remotely, allowing flexibility for workers. As a result, many workers became reluctant to
return to the office. Microsoft’s survey determined that more than half of employees
felt overworked and that one in five felt their bosses didn’t care about their work-life
balance. Several American companies have displayed some adaptability to this situation:
Twitter, Google, and Facebook, for example, all announced that pandemic-related work-
from-home arrangements for many employees will now be permanent.
As for the economy, the Great Resignation could benefit employees because wages tend
to rise when companies compete for talent. Yet there’s no reason to assume this will last
since the Great Resignation has been significantly enabled by temporary government
programs instituted because of the pandemic: extra unemployment benefits, cash
assistance programs, moratoriums on evictions and student loans, and more. It remains
to be seen whether the pace of resignations will sustain itself once COVID-related
government help is no longer available and job-seekers are forced to become less picky.
The changes in attitudes toward work spurred by the pandemic may prove long-
lasting, both in the US and globally. For many, the Great Resignation was about rejecting
“workism”—the idea that your job is the core of your identity and life’s purpose. Victoria
Short, CEO of a British job recruitment agency, told the Guardian in 2021 that from now
on, people are going to be less likely to accept excessive workloads or stay in jobs they
hate. “The pandemic has changed how some people think about life, work, and what they
want out of both. It’s made people step back and rethink their lives. COVID has reminded
them that life is too short.”
8. In paragraph 3, why does the author use the phrase “a wake-up call”?
A. To suggest that some employees have lazy habits.
B. To say that workers need to accept their workload.
C. To imply that companies should listen to worker demands.
D. To show that the pandemic made workers aware of problems.
9. In line 42, the word this refers to _________________.
A. The rise in wages
B. A growing economy
C. Improved job benefits
D. Companies’ desire for talent
10. What can be inferred from paragraph 3?
A. Americans are now less interested in eating out.
B. Jeremy’s restaurant chain went out of business.
C. Sixty-hour workweeks are now considered normal.
D. Jeremy is willing to accept a reduction in his income.
READING PASSAGE 2
STANDARDIZED TESTS FOR A NON-STANDARDIZED WORLD?
The earliest exams we might recognize as such took place at universities in the
eleventh century. They were oral and involved small groups of elite candidates who were
expected to debate with lecturers on topics from philosophy to the natural sciences. It
was a system fraught with variables and inequities. It wasn’t until the nineteenth century
that written exams became common. Since then, many would claim, we have gradually
amended examinations to be fairer, more effective, and more relevant—but are the exams
students take today really fit for purpose?
Today, exams such as the Scholastic Assessment Test (SAT)—which secure entry to
universities—are nationally standardized. This standardization means that, unlike those
early exams, all students take identical tests to make a “fair” comparison of many
students. However, according to critics, tests such as the SAT inherently favor the
wealthy. The COVID-19 pandemic exacerbated this issue. With schools closed,
economically disadvantaged students were less likely to have access to crucial alternatives
such as individual tutoring or homeschooling. Many also found that their tests were repeatedly
rescheduled or canceled at the last minute; others could not take the exams at all. Such
was the prevalence of the problem that 1,600 US universities, including every Ivy League
college, signed up to the “test-optional” movement. This meant that in 2021 at least,
colleges would not necessarily take SAT scores into account when considering
applications. Initially, the test-optional movement was an emergency response to the
pandemic. Still, like many such measures, we may find that its flexibility makes it a fairer
option in the future.
Even if we ignore the socio-economic issues behind standardized exams, the question
remains: are they really an effective way to find out how much a student knows? Some
students are emotionally and psychologically better suited than others to the pressure of
providing evidence—on one given day—of several years of education. This became
clear when the COVID-19 pandemic shone a spotlight on the mental toll of testing. As
schools closed and college application deadlines
approached, stress among students skyrocketed. Many feared that the lost months would
affect the rest of their lives. In the UK, a 2020 study by Rahim Hirji for Quizlet found that
of 1,400 teenagers, eighty-five percent were stressed or anxious. In response, the UK
and other governments instructed teachers to rely on their detailed first-hand knowledge
of students’ long-term performance to decide grades.
Even under normal conditions, some argue that the heavy reliance on examinations leads
to ineffective education, rote learning, and “teaching to the test.” Critics like Dr. Robin
Harwick argue that current students are not given a holistic education but are simply
taught to score highly. “Standardized tests are only useful for measuring standardized
minds,” says Dr. Harwick. “However, humans are not standardized, nor do we want them
to be.” Another argument against standardized tests is that the prevalence of multiple-
choice exams—which computers can quickly grade—pushes students to learn how to
deduce logical answers rather than use independent critical thinking. In addition, schools
seem to benefit most from teaching students to succeed at quantifiable tests since school
ranking tables are now widely published and referred to by parents. Their prominence
leads schools and teachers to feel pressure to focus on test results, perhaps at the
expense of other, less tangible aspects of educational development.
Of course, supporters of standardized tests would point to the transferable skills required
in studying for them, including organization, preparation, and self-discipline. In fact, the
very stress that some cite as a problem could also be seen as a useful experience. We
will all one day be required to retrieve and apply information under pressure in our adult
lives. Thus, exams serve a clear purpose as one part of an educational strategy, and even
critics wouldn’t call for their total elimination. However, many would argue that too much
weight is placed on them and that it would be fairer and less stressful to think of exams
as just one component in a wider grading system of long-term coursework, which includes
research, presentations, and essays.
Early adoption is a bad investment, to put it bluntly. First, the earliest versions of products
are not only expensive, they are the most expensive that those devices will ever be.
Companies presumably charge more to recover the cost of development and production
as fast as possible, and they know that there are serious tech-lovers who will pay a great
deal to be early adopters. Once the revenues from early adopters’ purchases are safely
in their hands, they can cut the price and shift to the next marketing phase: selling the
product to everyone else. This tactic is why the cost of the original iPhone dropped about
$200 only eight months after its release. Prices of gadgets most often fall shortly after
release, and they are likely to continue falling. For instance, many new TV models drop
significantly in price as little as ten days after hitting the market. Furthermore, electronics
rapidly depreciate because they become obsolete so quickly; the resale price of a used
cell phone or laptop can drop by fifty percent within just a few months.
Those who are first to leap into a new technology not only risk wasting money, they might
also lose time on something that will never catch on with the general public. In 2006, two
competing options for high-definition video entered the market: HD DVDs
and Blu-ray discs. Both seemed promising, and both required special devices called
players, costing hundreds of US dollars. Cautious consumers decided to stay neutral,
realizing that one or the other would probably end up dominating, and refrained from
buying either product. But a few eager consumers took a gamble, and those who
regrettably bought an HD DVD player quickly found themselves stuck with a virtually
worthless machine. In the struggle for high-definition video dominance, Blu-ray was much
more technologically advanced than HD DVD and could store up to seven times the
amount of information. Sales dropped steadily for HD DVD players, and by early 2008,
support for the product was discontinued entirely. Many new products face a similar fate;
early adopters are then stuck with pricey gadgets that do nothing but sit on their shelves
collecting dust.
Even worse, your new device might have functions that you don’t know about and would
likely not approve of if you did. In 2013, Amazon Echo introduced the world to a digital
assistant named Alexa, who is supposed to become active only when you say “her” name.
However, voice-recognition technology is still imperfect. These devices often activate
without users’ permission and record what they hear (though this fact is not acknowledged
in the packaging or marketing). The privacy implications remain unclear but are causing
tension between developers and consumers. As tech reporter Adam Estes told the
Guardian in 2019 in a discussion of digital assistants, “I hate to be dramatic, but I don’t
think we’re ever going to feel safe from their data-collection practices.”
Early adopters do something most others are reluctant to do: buy overpriced technology
before it has matured, in exchange for the dubious rewards of being first and enjoying a
short-term increase in status. These trailblazers help the rest of us through their willingness to spend
the extra money and work out the problems with a new product. So if you know any early
adopters, thank them, and then congratulate yourself on not being one of them.