Saturday, January 25, 2014

The final exam


Might be a good idea. The only exam I had for undergraduate work was an English proficiency exam.

"The New College Exam: A Test to Graduate"

Schools are increasingly making students sit for one last exam before they get that diploma

by Jon Marcus, Time, January 25, 2014

On weekend mornings all this winter, anxious high school juniors and seniors will file into school cafeterias to sweat through the SAT, ACT, and similar college entrance examinations, stern-looking proctors hovering over them. Such tests are among the long-established requirements for getting into college. But something new is afoot: Increasingly, students have to take a test to get out.

The advent of the college exit test is being driven largely by parents, lawmakers and others intent on making sure they’re getting their money’s worth from colleges and universities—and by employers who complain that graduates arrive surprisingly ill-prepared.

“There is a groundswell from the public about whether a college degree is worth what people are paying for it,” said Stephanie Davidson, vice chancellor for academic affairs at the University System of Ohio. “People are asking for tangible demonstrations of what students know.”

Ohio this year started testing candidates for education degrees before they graduate. The Wisconsin Technical College System requires its graduating students to take tests, or to submit portfolios, research papers or other proof of what they know. And all undergraduates at the University of Central Missouri have to pass a test before they are allowed to graduate. Such activity is up “significantly,” according to a new report from the National Institute for Learning Outcomes Assessment.

The trend unmasks a startling reality: those expensive university degrees may not actually prove that a graduate is sufficiently educated to compete in the workforce. And it advances the seemingly obvious proposition that students should be made to show they are educated before they get one.

“Isn’t it amazing that the newest and most brilliant idea out there is that students should achieve particular skills and prove it?” Marsha Watson, president of the Association for the Assessment of Learning in Higher Education, asked wryly. “Wow.”

Faculty grades fail to do this, advocates for testing say.

Forty-three percent of grades given out by college faculty are A’s, according to research published by Teachers College at Columbia University. Yet one-half of students about to graduate from four-year colleges and 75 percent at two-year schools fall below the “proficient” level of literacy, according to a survey by the American Institutes for Research. That means they’re unable to complete such real-world tasks as comparing credit-card offers with different interest rates, or summarizing the two sides of an argument.

“It’s really bad news, and it’s gotten worse,” said Margaret Miller, a professor at the University of Virginia’s Center for the Study of Higher Education and an expert on assessing learning.

A separate survey of employers by an association of universities found that more than 40 percent don’t think colleges are teaching students what they need to know to succeed. One-third say graduates aren’t qualified for even entry-level work.

“I had a syllabus, and I had what the outcomes of the course would be, and those included critical thinking,” said Julie Carnahan, who taught public policy and organizational management before she took her current job working on assessment at the State Higher Education Executive Officers Association. “In retrospect, now that I know so much more, I didn’t ever test students to determine if they could actually demonstrate critical thinking.”

So most students pass their courses, said Watson, the former director of assessment at the University of Kentucky, and “are given degrees because they accumulate credits. Each credit hour is assumed to be a metric of learning, and it’s not. The only thing it’s a metric of is that your butt was in a seat.”

Many of the exit tests now being tried are similar to the kinds of practical licensing exams that candidates for nursing degrees have to take, which require them to prove in the real world that they can apply what they’ve learned in a classroom. “Nobody wants a nurse who’s only taken written tests,” Watson said. “What you want is a nurse who has some experience before they jab a needle into your arm.”

In Ohio, for example, candidates for education degrees have to write a lesson plan and make videos of themselves teaching.

But introducing new ways of measuring what students learn is time-consuming, complicated and expensive—not to mention resisted by universities fearful the results will be used to compare them with competitors. And the ones that have the most at stake are universities and colleges already assumed to be among the best.

“They hate it. They hate it,” Miller said. “They already have the reputation for educating students well, so they can only lose.”

The more selective an institution’s admission standards, the National Institute for Learning Outcomes Assessment report found, the less likely it is to test what its students know.

“They have everything to lose and nothing to gain,” Watson said.

That’s one reason the move to establish exit tests is starting slowly, and the standards remain comparatively low. The cutoff score on the University of Central Missouri exit exam is below the lowest level of proficiency, and exemptions are made for students with learning disabilities or whose native language is something other than English. No one there in the last three years, or ever in Wisconsin, has been blocked from graduating because of a poor exit-test score.

In other places, students are being tested not to determine whether or not they should be allowed to graduate, but to check for strengths and weaknesses within specific majors or campuses. Some colleges and states, including Ohio, let students and their families see the results. Others don’t, or make them hard to find. In all, only about a third of colleges and universities make assessment results public, according to the National Institute for Learning Outcomes Assessment report.

More and more states, including Missouri, Pennsylvania, and South Carolina, have approved using student exit-test results as one of the measures on which institutional funding is based. Almost 50 colleges and universities in nine states are trying to develop a way to test students, before they graduate, in written communication and quantitative literacy. So far these metrics are used only to evaluate programs, not to judge individual students or decide whether they’ve earned degrees.

“We want to be very careful,” said Carnahan, who is coordinating the project. “We don’t want this process to end up where states are being ranked. What we hope to do in the short term is to only look at the data by sector across states and not identify institutions. That’s really critical until we can be sure that this paradigm we’re looking at is valid.”

Rather than wait for that to happen, some students and employers are taking things into their own hands.

“We can see that in the portfolios that are coming, where it’s not just, ‘Here’s my GPA,’ but, ‘Here’s my work as well, and what I’ve learned from my internships and classes,’” said Angela Taylor, director of quality enhancement at Texas Christian University. And some employers are now testing job applicants themselves to see if they know what their college degrees say they do.

All of the 42,000 students at Western Governors University, an online university founded by the governors of 19 states, have to prove what they know, by getting at least a B on an assessment test, not only before receiving a degree, but before completing any course.

“We want to be sure that they leave with more than they started with,” said Joan Mitchell, the university’s spokeswoman.

“At some point in our world,” she said, “we’re going to have to look at, ‘Do you know it? Have you mastered it?’ It’s a whole cultural shift.”
