Curricular analytics: one project and many questions

What are curricular analytics and what can they add to higher education?

Yesterday the Future Trends Forum met with Gregory L. Heileman, associate vice provost for academic administration and professor of electrical and computer engineering at the University of Arizona. He’s also project lead for the Curricular Analytics effort, and that’s what we explored for an hour.

CA analyzes the different routes undergraduate students can take through university curricula, measuring relative complexity in order to better shape advising, majors, and course catalogs. It generates visualizations of student pathways along with associated metrics.
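The metrics behind those visualizations capture, among other things, how many downstream courses a single course blocks and how long its longest prerequisite chain is. As a rough, unofficial sketch of the idea (the course names, edges, and function names below are invented for illustration, and the project's actual definitions may differ in detail):

```python
# Sketch of two Curricular Analytics-style metrics on a prerequisite graph.
# The toy curriculum below is invented for illustration; the real project
# computes these (and more) from uploaded curricular data.

# Forward map: course -> courses it directly unlocks (i.e., courses that
# list it as a prerequisite).
UNLOCKS = {
    "Calc I":     ["Calc II"],
    "Calc II":    ["Diff Eq", "Physics II"],
    "Physics I":  ["Physics II"],
    "Physics II": ["Circuits"],
    "Diff Eq":    ["Circuits"],
    "Circuits":   [],
}

# Reverse map: course -> its direct prerequisites.
PREREQS = {c: [] for c in UNLOCKS}
for course, unlocked in UNLOCKS.items():
    for later in unlocked:
        PREREQS[later].append(course)

def blocking_factor(course):
    """How many later courses are unreachable until `course` is passed:
    the size of the set reachable from `course` in the prerequisite DAG."""
    seen, stack = set(), list(UNLOCKS[course])
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(UNLOCKS[c])
    return len(seen)

def _longest(course, graph):
    # Longest chain (counted in courses) starting at `course` and
    # following edges in `graph`.
    return 1 + max((_longest(n, graph) for n in graph[course]), default=0)

def delay_factor(course):
    """Length of the longest prerequisite chain passing through `course`."""
    return _longest(course, PREREQS) + _longest(course, UNLOCKS) - 1

# One curriculum-level score: the sum of per-course blocking and delay
# factors across the whole prerequisite graph.
structural_complexity = sum(
    blocking_factor(c) + delay_factor(c) for c in UNLOCKS
)
```

On this toy six-course curriculum, "Calc I" blocks four later courses, and every course on the longest chain (Calc I, Calc II, Diff Eq, Circuits) carries a delay factor of four, so streamlining that chain would lower the curriculum's overall score the most.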

[Image: sample Curricular Analytics visualization on the right; Dr. Heileman at bottom left]

You can find out more on the project site, where you can also upload your own curricular data for processing and visualization. There is at least one scholarly paper breaking down the approach and data, and a GitHub site with plenty of files to download.

Naturally we recorded the whole session:


I would like to say a few things about the session, which went differently than many Forum Thursdays. There was a good amount of presentation, as Greg needed to show people the various visualizations in order to explain what CA does. That's a break from tradition, as the Future Trends Forum is normally focused entirely on face-to-face discussion, but it worked well here. Participants had plenty of perceptive questions and comments, which Dr. Heileman handled very well. Maria Anderson, founding CEO of the curricular analytics service Coursetune and an excellent past Forum guest, weighed in. Overall it felt like a rich workshop or tutorial, and might point the way towards more Forum events along these lines.

I mentioned plenty of questions and comments and meant it. Yesterday was one of those sessions where we ran out of time before getting to all of them, so I’ll follow my usual practice of copying them here, lightly edited for anonymity and typos, and in the chronological order they appeared during the hour-long session:

Students often fail to see the larger point to their educational efforts. How can we leverage your project to encourage them to contextualize their education? What about opportunities for interdisciplinarity?

I might be jumping the gun, but how does bringing these insights forward allow institutions to make substantive curricular or policy shifts?

Students who go to elite schools have already learned how to succeed in academia no matter what you do to them. It’s not really a fair comparison to say that elite schools give students more freedom.

Has your work unearthed common reasons for curricular complexity? Governance? Faculty culture? Etc.?

[Slide: quality vs. complexity]

Do you think the downward trend in high school graduates will push institutions to streamline curricula?

How might we move from a culture of “weeding out” students at the intro level and instead open the door to more opportunities for students and more investment in completing a degree/getting an education?

Have career outcomes or paths of these graduates been mapped to the curricular complexity of their alma mater? And is there any relationship?

Did the restructuring and pathway of using an “Engineering 101” course also benefit students who might have had previous experience in math or engineering? Diversity and reinforcement sound promising.

How is Instructional Complexity calculated? How is it different from Structural Complexity?

I hope participants can use comments on this post, or comments elsewhere, to continue the discussion.

One more link to share: in response to questions about campuses running into restrictions imposed by disciplinary associations, professor Heileman referenced a paper he co-authored with the very perky title “ABET Won’t Let Us Do That!”

This is the first time the Forum has focused so clearly on data analytics. It won’t be the last.

Happy to hear your reactions in comments below!

This entry was posted in education and technology, Future Trends Forum.

5 Responses to Curricular analytics: one project and many questions

  1. Glen McGhee, FHEAP says:

    I wanted to pick up from “ABET Won’t Let Us Do That!” the article by two peer reviewers for the engineering accreditor, ABET.

Buried in their article is an important admission: “institutional accreditation” (NOT program accreditation) “is required for eligibility to distribute federal financial aid” and student loan availability under Title IV. Follow the money and it brings students, parents, teachers, and colleges, everyone, in short, to Title IV accreditation. ABET is not a Title IV (34 CFR 602) accreditor.

But “many academic colleges or programs also seek specialized accreditation in their discipline. ABET provides specialized accreditation in the area of engineering. Unlike institutional accreditation, specialized accreditation generally does not impact financial aid; however, it signals to potential employers that the graduates of a program are of a certain quality, and it may be used as a precondition for …”

    This is a dispositive distinction. Everything else is, well, professional buffery and even, some would say, puffery. For engineers and their schools.

    “ABET Won’t Let Us Do That!” never explains the political ecology of the ABET [some of its history is covered in Randall Collins, The Credential Society (1979/2019)], and so we never get a sense of what’s *really* going on here (in the article by two — shall we call them — disgruntled peer reviewers?). Grievances recounted in the article easily apply to accreditation in general — whose sole purpose is to maintain the status quo.

As we’ve seen with the ACICS (and even the ABA in 2011), sectors that are unable to maintain equilibrium find themselves at risk. Impression management, loose coupling, and self-regulation by hand-picked reviewers: clearly, it’s an insiders’ game.

Consequently, articles like “ABET Won’t Let Us Do That!” run the risk of blacklisting the authors (yes, reviewers can be, and are, blacklisted!); or, if the slight is intentional, mark the beginning of a new venture and the ending of an old one, perhaps in reverse order.

In the end, we are met with a simple truth: engineering education is what engineers say it is. Exceeding 120 credit hours? Fine, offer a master’s degree, then.
    And there’s no mention of what the mathematics requirement is. One year of calculus (without differential equations?) doesn’t sound right. That won’t even get you to Maxwell’s Equations, right?

    But this just brings us back to where we started, doesn’t it? Two engineering peer reviewers sitting on a park bench ….

    • Glen McGhee FHEAP says:

There was an interesting study in 2017 by Jessica M. Sanders that pointed out problems with the peer review process itself, the backbone of accreditation integrity and reliability. The subtitle is “Opinions from the unheard voice of volunteer peer evaluators.”

Surprisingly, the top two issues encountered by peer reviewers were personal bias (38.3%, n = 360) and conflicts of interest (33.8%, n = 327). The results combined the “Very Negative” and “Negative” survey responses in regard to their accreditation review experiences. “These two factors identified as being most negative by participants align with research on peer review in other fields recognizing that peer review is exposed to bias and conflicts of interest.”

      Other negative factors experienced by peer reviewers, in descending order, behind personal bias and conflicts of interest, were: pressure regarding impact of evaluation, number of days allotted for visit, and scope of review/complexity of visit.

      “63% of participants (n = 583) agreed that additional peer evaluator training/mentoring would improve the effectiveness of the peer review process. This opinion of peer evaluators reinforced the literature that contends peer review training programs may be lacking quality or non-existent.”

      Only 81% of those surveyed agreed that the results of peer review are a reliable indicator of the quality of an institution, and only “62% agree that volunteer peer evaluators are more appropriate than professionally trained evaluators to make an objective judgment about an institution’s compliance with accreditation standards”. Yikes!

      For reasons like this in other settings, it looks like peer review is on its way out. Not too long ago, CPAs lost the ability to self-regulate their audits of publicly traded companies.

      • Bryan Alexander says:

        That’s quite a critique of accreditation by peer review. Can I raise this during our next Forum session with an accreditor?

        • Glen McGhee, FHEAP says:

          Yes, Jessica Sanders’ breakthrough study is an eye-opener — one of a kind! No one else would dare to look behind the curtain of peer-review. Not even someone like Paul Gaston.

          Future Trends Forum has already featured an accreditor that put a happy face on Covid-education quality mark-downs allowed by US DOE, especially in regard to online. He promised a Jan 2021 roll-back of across-the-board waivers, but that hasn’t happened yet.

          For more than 15 years I’ve asked accreditors questions. Accreditors are masters of impression management, highly adept at brushing questions aside. That’s why they earn such big fat salaries.

    • Bryan Alexander says:

      Excellent dive into that article, Glen.
      Important to distinguish between the professional and federal requirements.
