AWP Disputes Methodology of MFA Rankings

December 1, 2009

AWP disputes the results of a recent survey published in Poets & Writers. The article, titled “The Top 50 MFA Programs,” asked 500 applicants to MFA programs to judge 150 residential MFA programs in the United States. AWP responded that the sample used in the survey was biased, unrepresentative, and unqualified to assess the educational quality of MFA programs. The survey did not include low-residency programs or international programs.

In a letter to Poets & Writers, Acting Executive Director Matt Burriesci wrote, “The tutelage of an artist is a complex and serious business, and it cannot be reduced to a single spreadsheet column sorted in descending order.”

AWP’s full response:

AWP's Response to a Recent Article Ranking MFA Programs

Mary Gannon
Editorial Director
Poets & Writers
90 Broad St, Suite 2100
New York, NY 10004

cc:  Elliot Figman, Executive Director

October 26, 2009

 

Dear Ms. Gannon:

I have the greatest respect for Poets & Writers, which is why I was so disappointed to see Seth Abramson’s opinion survey of MFA programs in your November/December issue.  The tutelage of an artist is a complex and serious business, and it cannot be reduced to a single spreadsheet column sorted in descending order.  Abramson himself seems to concede this point before proceeding.  But even if one could squeeze this universe into a single question, Abramson’s methodology would still be flawed from a statistical standpoint.

The cover headline referring to the article proclaims, “The Top 50 MFA Programs,” which itself is incorrect.  Abramson’s piece is not a ranking or comparison of all MFA programs, but of residential programs, and only those residential programs in the United States.  Missing in this analysis altogether are the dozens of low-residency and international programs.

The sample audience Abramson used in his survey consists of 500 self-identified applicants to MFA programs.  Applicants are only one of the key stakeholder groups in the success of MFA programs.  Other stakeholders include faculty, administrators, current students, the professional association that represents them (AWP), and alumni.  Abramson is correct to point out the problems with previous ranking efforts, but he falls victim to the same sins of omission and reduction committed in those attempts.  This particular sampled audience, while interesting, does have its own set of biases in assessing MFA programs.  Some may prefer to be admitted to a high-profile program, while others may be unaware of their full range of options.

It’s also an unrepresentative sample. There are more than 13,000 applicants to MFA programs each year.  If we ask fewer than 4% of them to tell us their opinions about MFA programs, we will arrive at what Abramson produces: the opinions of that 4% of applicants about 75% of MFA programs.  Drilling down into the data raises more questions.  No demographic information appears to have been collected.  We don’t know, for example, whether there is an appropriate geographical, gender, ethnic, and age variety in the sample.  These factors do make a significant difference in the preferences of applicants.

Perhaps most importantly, the premise of the survey itself must be questioned, because the audience sampled is not qualified to make a comprehensive, qualitative judgment about what it is being asked to rank.  The perceptions of applicants may be part of a larger decision matrix, but those isolated perceptions cannot be the dominant criteria employed in a qualitative assessment.  Presumably, most of the applicants Abramson surveyed are in their early to mid-twenties, and most lack a graduate degree.  We don’t know, because no demographic information seems to have been collected, but this is a reasonable assumption.  This narrow demographic cannot definitively assess the educational quality of graduate writing programs in the United States.  The sampled audience includes no graduates, no faculty, and no program directors.  It does not include pedagogical experts, professional writers, or individuals with accrediting experience.  Nor does it reflect the judgment of the professional association that represents these programs.  In short, the survey doesn’t provide the full picture, only a fragment of it.

Unmeasured in the opinion survey are the following criteria, which AWP feels are salient in one’s personal choice of MFA programs:

  1. Faculty;
  2. The number of matriculated students who go on to publish significant work;
  3. The complementary assets and infrastructure, such as access to editorial experience on magazines or university presses; and
  4. Program philosophies.

Choosing an MFA program is indeed a very big decision. For me, it meant leaving a promising career and my hometown of Chicago.  I based my decision largely on one factor: who was teaching where.  It just so happened that the residential program at George Mason University was right for me.  AWP’s presence at George Mason and the university’s proximity to Washington, DC, also weighed significantly in my decision.  But these are not necessarily the factors others use in making their decisions.

Many prospective students may prefer the structure of the low-residency model.  Cost, duration, program size, teaching opportunities, financial aid, and the balance of workshop and literary scholarship are also important.  Many may choose a program based on its particular philosophy.  Seattle Pacific University, for example, offers an excellent low-residency program designed around a Judeo-Christian philosophy.  AWP recently performed an assessment of this program, and in interviews with the students we discovered that many of them turned down slots at other schools because of this program’s unique philosophy.

I regularly refer applicants to AWP’s Guide to Writing Programs, which is free to all and was painstakingly assembled over several years.  The Guide to Writing Programs (guide.awpwriter.org) contains most of the very information that Abramson claims is “not forthcoming” from MFA programs.  The statistical information culled from a major, comprehensive survey of these programs, including admissions statistics, program size, tuition information, and many other important facts and figures, is available to AWP’s members.  I’d also refer applicants to AWP’s Hallmarks for a Successful MFA Program in Creative Writing, which were assembled over years with the input of all the stakeholders in MFA programs:  faculty, students, administrators, alumni, and the professional organization that represents them.  As a result, the hallmarks are indeed complex, but so is assessing the quality of a higher-education program that provides its students with an advanced degree.

AWP serves 34,000 writers around the world, and our constituency includes thousands of people who actually have earned their MFA degree, teach in MFA programs, or study in MFA programs.  I believe they can constructively contribute to any intelligent and informed discussion about MFA programs.

 

Sincerely,

Matt Burriesci
Acting Executive Director
The Association of Writers & Writing Programs (AWP)
