Credit where credit is due
When someone mentions to a prospective employer (or anyone they’re trying to impress) that they have a college degree, they are usually trying to make the point that they have a certain level of competence in their chosen field. Most likely, they aren’t trying to brag about the inordinate amount of money and time they’ve spent to reach that level. So if the whole money-and-time thing were toned down just a little, nobody would mind…right?
The academic community has been doing a lot of soul-searching lately regarding the credit-hour system that has long been the norm at most traditional colleges, and it’s about time. Learning methods are branching out in all directions (it’s been commonplace for years now to take whole courses online without ever setting foot in a classroom), so judging how well a student has learned the material by how long they’ve spent learning it seems more incongruous than ever. The more time you spend learning, the more you pay for it, and the more likely it is you’ll still be around when the next tuition hike (inevitably) arrives. If competence weren’t measured against a fixed timeline, it would open a lot of doors for those with the most motivation, as opposed to those with the most money.
If you can cram your head with sixteen credits’ worth of knowledge in less than nine weeks, why should the system artificially hold you back? It reminds me of high school, when the curriculum was geared toward making sure that the more…uh…methodical learners weren’t overwhelmed by it all. After all, you wouldn’t want them to drag down the average on the standardized competency testing that has become so integral in determining how much money a school gets in the era of No Child Left Behind. Plenty of schools can look forward to being left behind when it comes to funding, but at least no child will be left behind in terms of learning speed. But that’s another story.
So everything continues to revolve around the credit-hour. It determines the pay rates for the faculty and the amount of time in which a student can earn a degree. It also affects how much state funding a school receives. Currently, if you want to make it through school without this kind of chronological constraint (say that five times fast), there are only a few colleges in the country that offer alternative programs. When credits are awarded based on teacher assessments and testing that doesn’t follow the standard midterm-and-final timetable, students have much more control over the structure of their own education. And why shouldn’t they? They’re paying for it.
Personally, I’m not speaking from the elevated vantage point of one of those overachieving über-students who could take advantage of such an overhaul of the system; it’s taken me a senior’s length of time to earn a junior’s worth of credits thanks to, ahem, other priorities getting in the way. But if someone does have the drive and the ability, then a college’s credit-earning structure shouldn’t hold them back. That would be equivalent to having to take a ton of University Studies courses not relevant to your major in order to graduate. But once again, that’s another story.
Of course, it’s not practical to overhaul such a deeply ingrained system overnight. And many colleges would probably like to avoid the stigma of sounding like one of those flagrantly sketchy “earn your degree in two weeks” infomercials that lurk on late-night TV. But in order to compete with the flexibility of fully online universities, fly-by-night or not, they’re going to have to adapt sooner or later to stay in the game. Then maybe we’ll see a new level of independence when it comes to planning the timeline of our education.