System Conflicts Between Game Development and Traditional Academia

As someone who has been through one undergraduate and two master’s programs focused on game development, I have had the opportunity to observe diverse curricula. These observations have brought to light challenges facing games education, especially around research standards and grading procedures, and how both can accommodate a fail-fast methodology.

To be clear, this is not a post about educational games, or about games used in classrooms as teaching tools.

Games education faces the difficult (if not impossible) task of standardizing innovation. Moreover, it asks students to create “new” experiences before they have fully grasped what the current standards are. This is not necessarily negative; without a firm grasp of current game design and development, innovation can become a byproduct of ignorance, since lateral thinking becomes a natural approach to the design problems that arise. The real problem appears when you attempt to put a time constraint and a grade on such efforts.

It has been argued that grades represent payment and that deadlines exist out in the real game dev world, but stagnation exists out there too. Academia should take advantage of its freedom from those restraints instead of pushing to “prepare” students for real-world environments. The studios that break from the pack of sameness that pervades the industry are those that create outside of these constraints. Thatgamecompany went far beyond its deadline and budget when making Journey; there is the classic example of Blizzard’s “it will be released when it’s ready”; and beyond that, countless indie developers with tight budgets have the passion and bravery to prioritize the game. This is where the academic model of “real game development” falters critically.

When a school project goes past its deadline, it is simply cut off. You could argue that a student passionate enough could continue the project outside of class. However, if the grade is supposed to represent payment, then it would make far more sense to say the student doesn’t get a grade until they declare the project finished, whether that is a month or a year later. Current academic models are nowhere near flexible enough to allow this, but they need to be. I would further argue that the long-term incentives could be built from class prerequisites: take as long as you like on a project, but you won’t get into the next class in the series until you’ve submitted the assignment. By doing so you would create a system based on an academic “risk tolerance” similar to the financial “risk tolerance” that often accompanies small teams in the industry.

Risk tolerance here means the amount of time you can operate without a stream of revenue. Because students are paying for their time, their academic risk tolerance would be the amount of time they can spend in school before receiving a grade (which is effectively a small portion of their degree). A student who wanted to complete the program quickly would have a low risk tolerance and would need to make games quickly, which may result in lower grades (to keep with the analogy, less money). Students who could afford to spend more time on each project would be able to take the time to make the games they want.

Alternatively, in the real world, if you realize that a project is not going to turn out well, you cancel it. In academia it is often better to just finish what you have and take the B- than to restart partway through the term. This is perhaps the largest missed opportunity academic settings have to adapt their structures.

To take the analogy further: grades could even accumulate (like money) instead of averaging, and scholarships could be treated like contracts under which more specific deadlines are negotiated. Perhaps your grade decays over time (representing the cost of operating), and/or projects require a buy-in. Other assignments and learning opportunities would be treated as odd jobs, moonlighting, or freelance work.
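To make the economics of that concrete, below is a minimal sketch of how such a grade ledger might behave, covering both the low and high risk-tolerance students described above. Everything in it is hypothetical: the GradeLedger name, the decay rate, the buy-in, and the payout values are invented for illustration, not proposed policy.

```python
from dataclasses import dataclass, field

@dataclass
class GradeLedger:
    """Toy model of an accumulating grade 'economy' (hypothetical parameters).

    Grades accumulate like money instead of averaging, decay over time
    (an 'operating cost'), and each project requires a buy-in before it
    can pay out on submission.
    """
    balance: float = 0.0       # accumulated grade points so far
    weekly_decay: float = 0.5  # points lost per week (cost of operating)
    history: list = field(default_factory=list)

    def tick(self, weeks: int = 1) -> None:
        """Advance time; the balance decays to represent ongoing cost."""
        self.balance -= self.weekly_decay * weeks
        self.history.append(("decay", weeks, self.balance))

    def start_project(self, buy_in: float) -> None:
        """Pay the buy-in up front, like a studio funding development."""
        self.balance -= buy_in
        self.history.append(("buy_in", buy_in, self.balance))

    def submit_project(self, payout: float) -> None:
        """Collect the payout whenever the student declares the game done."""
        self.balance += payout
        self.history.append(("payout", payout, self.balance))


# A low-risk-tolerance student ships quickly for a modest payout...
fast = GradeLedger()
fast.start_project(buy_in=5)
fast.tick(weeks=10)            # one term of decay
fast.submit_project(payout=30)

# ...while a high-risk-tolerance student absorbs more decay for a bigger game.
slow = GradeLedger()
slow.start_project(buy_in=5)
slow.tick(weeks=40)            # a full extra year before submitting
slow.submit_project(payout=60)

print(f"fast: {fast.balance:+.1f}  slow: {slow.balance:+.1f}")
# fast: +20.0  slow: +35.0
```

The numbers only exist to show the trade-off: a longer runway costs more decay, and it pays off only when the finished game’s payout outpaces that ongoing cost.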

The ETC (Carnegie Mellon’s Entertainment Technology Center) has perhaps struck the best balance between these competing pressures, allowing students to shape the measures of success they will be graded on and, to a lesser degree, the measures of success for their team project. It is not without flaws, though.

With the overwhelming number of game programs now coming into academic maturity, I look forward to seeing the new and innovative ways universities adapt to foster game development.

3 Comments

  1. Abhishek Singh

    This was a pretty fascinating write-up, and definitely something unique to this field. This is my first program in a creative field, so I do not have as much experience or basis for reference as you do, and I was also unsure whether you were critiquing a particular school or the overall academic structure. For undergraduate and master’s programs, I think the biggest challenge is the logistical requirement: there needs to be clear progress and a visible scale for both the student and the institution. Your suggestion of extending the time frames (long term) would mean a project changes into something like a research project, a doctorate maybe? I cannot wrap my head around how that would work. I do agree that extremely strict deadlines diminish the creativity and/or quality of a project.

    The ETC is definitely in an interesting space. The ‘on-the-job’ style of learning tries to mirror the work environment, and the setup of the teams also brings it close to the actual experience. I still feel that exposure to activities like pitches, interviews, and overall networking with the industry is the program’s strong point, but this changes its personality into more of a finishing school. The nature of the projects themselves leads to an emphasis on the product, and personal learning takes a back seat; it is somewhat difficult to pursue a topic you are specifically interested in and dive deep into it. It wouldn’t be amiss for a game program to retain some of its ‘school’ aspect and provide a learning ground rather than constantly recreate the future work environment.

  2. Ben Gansky

    Interesting thought experiment. A couple thoughts in response:

    1. You seem to be thinking about the ‘real-world’ environment for games as mostly market-driven. Which it is! Mostly. But there are also other contexts in which games are starting to take hold, like the fine art world, education, and social engagement. Those contexts have different conditions and imperatives than the world of commercial gaming. It might be interesting to think about how to map those kinds of conditions onto the academic game experience.

    2. The fact that curricula are created by people who belong to an older generation cannot be forgotten. Sometimes these people are really fucking smart. (Hi, Jesse!) Sometimes they’re less so. But even the most prescient among them are constrained by the fact that they grew up in a different time, with a much different context of games. I had a fascinating conversation with a writer for EA who told me that it’s well understood in his studio and beyond that the junior members of the team are, by and large, more talented than their seniors, because for them modern console games have been a fact of life since they were pre-adolescent. We’ve been raised on these games; they are native to us, and that can be said for very few, if any, of our ‘seniors’, the people designing the curricula by which we are trained. I think it’s important to remember that. Now, there are certain assumptions that young people might make that older people might not; sometimes avoiding those assumptions is an advantage for the older generation, and sometimes the gap itself is an issue. All this is to say that in addition to the structural factors of the academic context, there is also the context of a generation gap.

  3. Wow, you are one of the few who has touched this topic, and you have done a good job covering many of the points I feel strongly about. I have been through an undergraduate and a graduate program unrelated to the entertainment industry, so I may not have strong insight into these problems, but I can easily relate to a few from my own experience. I like your take on standardizing innovation; it is compelling to think about the possibilities of fixing this underlying problem that everyone has decided to ignore. Super elated to see Blizzard mentioned here! Coming back to the article, I have to agree with you on the failures of the academic model we all face. I strongly believe there has to be a difference between the education model and the industry approach, firstly because when you are learning there have to be barriers, but ones that guide you in the right direction with minimal casualties in terms of restrictions on progress. I feel there is a lot that has not been explored in balancing academia with game development. Overall, a really well-written, to-the-point write-up touching on some really important topics.
