Major Updates

  • Distance Education, Online Student Reporting, State Authorization and Other Topics Announced for Early 2024 Negotiated Rulemaking
    The US Department of Education has announced negotiated rulemaking sessions covering topics important to the UPCEA community, including distance education, reporting for fully online students, and state authorization, among others. The sessions will take place virtually early next year, on January 8-11, February 5-8, and March 4-7. The Department is seeking nominations for the various stakeholder positions by December 13 of this year.

    The topics to be discussed in these negotiated rulemaking sessions include:

    • Recognition of accrediting agencies and related issues
    • Institutional eligibility, including state authorization
    • The definition of distance education as it pertains to clock hour programs and reporting for students who enroll primarily online
    • Return of Title IV funds under the Higher Education Act of 1965
    • Cash management to address disbursement of student funds
    • Eligibility of TRIO Programs (which will also be included as part of subcommittee work on these issues)

Notably absent is discussion of Third-Party Servicers, which the Department has said may be included in future negotiated rulemaking sessions. The Department indicated it will issue new guidance on this topic in early 2024.

Read the Department of Education’s Press Release.
See full details in the Federal Register.

 

  • Will the Feds Strip Colleges’ Funds Over Anti-Jewish, Muslim Bias? (Inside Higher Ed)
    “Amid the protests and incidents that have rocked college campuses since the start of the Israel-Hamas war, a cry has gone up from conservative politicians and groups for the federal government to pull federal funding from colleges and universities if they fail to quell antisemitism and protect their Jewish students.

    Republican presidential candidates and members of Congress have been especially vocal in calling for such punishment. ‘We’re not in the business of using taxpayer dollars to provide and nourish hate,’ said Utah Representative Burgess Owens, a Republican who chairs the House subcommittee on higher education, at the end of a hearing last week on campus antisemitism. ‘That is not the American way.’

    But stripping colleges of their access to federal funds would be an unprecedented step for the Education Department to take. It’s possible under the law, legal experts say, but it would only happen after a long and complex investigative process. However, Biden administration officials have said that they agree more action is needed to counter the recent rise in both antisemitism and Islamophobia on college campuses. And they have moved quickly to respond.” Read more.

  • Overhaul of Financial Aid Formula Will Boost Pell Grant Eligibility (Inside Higher Ed)
    “Nearly 220,000 students will gain eligibility for the Pell Grant, a key tool for helping low-income students access college, when the federal government finalizes revisions to the system for applying for financial aid later this year, according to a new report from the State Higher Education Executive Officers Association.

    The increase in Pell-eligible students could mean more than $617 million in additional federal aid going to students and colleges. About $29.8 billion was available for students in Pell Grant funding in 2023–24, according to federal budget documents.” Read more.

 

Other News

 

Generative AI (GenAI) has emerged and is developing far more rapidly than expected. How should universities prepare for the impact and changes that may not even be anticipated?

ChatGPT launched just one year ago, on November 30, 2022. Since that date, with the infusion of tens of billions of dollars, tech companies around the world have launched multiple alternative GenAI bots and applications built on a variety of Large Language Models. The applications are many and the user base is surging daily. Clara Shih, CEO of Salesforce AI, says, “In my career, I’ve never seen a technology get adopted this fast. Now, for AI to truly transform how people live and work, organizations must develop AI that is rooted in trust, and easily accessible for everyone to do more enjoyable, productive work.” In September, Salesforce released the results of a snapshot poll, the Generative AI Snapshot:

According to the research, generative AI users are a young, engaged, and confident group of “super-users,” meaning they use the technology frequently and believe they are well on their way to mastering it.

    • 65% of generative AI users are Millennials or Gen Z, and 72% are employed.
    • Nearly 6 in 10 users believe they are on their way to mastering the technology.
    • 70% of Gen Z report using the technology and 52% of them trust the technology to help them make informed decisions.

And, users aren’t slowing down anytime soon — 52% say their usage of generative AI is increasing compared to when they first started.

This is not merely casual, personal use of GenAI; business and industry have discovered the efficiency and profitability of the technology. A Harvard-led study released in September found that using generative AI helped hundreds of consultants working for the respected Boston Consulting Group (BCG) complete a range of tasks more often, more quickly, and at a higher quality than those who did not use AI. Further, it showed that the lowest performers among the group had the biggest gains when using generative AI. The study, conducted by data scientists and researchers from Harvard, UPenn’s Wharton School, and MIT, is the first significant study of real usage of generative AI in an enterprise since the enormous success of ChatGPT’s public release in November 2022 — which launched a rush among major enterprise companies to figure out the best ways to utilize it.

Lance Eaton, a writer, educator, instructional designer, and educational and social media consultant in Providence, Rhode Island, created a site where faculty and administrators at more than 100 universities have submitted examples of AI tool syllabus policies they have used in their classes. The technology has already become too important for universities to respond merely in an ad hoc way to new developments from a world filled with AI-enhanced program developers. We must not only react to developments; we must also prepare proactively for those that are on the horizon.

Universities must create strategic, methodical, predictive processes that account for emerging capabilities and produce reasoned responses for adoption and support at our institutions. In this process, we have to engage a range of constituencies, from experts and developers in the field to students and employers. Collectively, we can share priorities that will enhance and streamline the development and adoption of the most relevant, efficient, and effective applications as they become available.

Identify and appoint a leader to coordinate and provide vision for the AI initiative

On October 15, Western University of Ontario took a leadership role among academic institutions by appointing a Chief Artificial Intelligence Officer, Mark Daley, an AI researcher and leader in neural computation. He is charged with developing and implementing a university-wide AI strategy that supports Western’s academic mission and research objectives. Daley says of AI, “There’s a moral imperative for us to not just be aware of it, but to engage and lead. We’re at a very important moment in time where we need to have challenging conversations about everything from regulation versus open source to freedom of expression, as well as the moral, ethical and societal implications of this technology.” Identifying a coordinator and leader, as Western University has done, is the first step in creating a framework for AI at an institution. It will be important to create such a C-level leadership position at each university to coordinate policy-making, research, and application, and to lead the internal and external teams of experts who will chart the course of AI initiatives.

Identify and appoint a committee of internal and external experts and representative stakeholders

It is this group, drawn from both inside and outside the university, that will identify emerging opportunities and threats in the field of artificial intelligence. The committee must meet often to maintain momentum and to ensure that all emerging applications and practices are carefully examined for their relevance and importance to the institution. The group should include:

  • Faculty and staff from related or interested departments and offices. These internal experts will be able to assess applicability of emerging technologies, practices and applications in the context of the university culture.
  • Top leadership and Human Resources leaders from businesses, agencies, and other entities that regularly employ graduates and certificate completers from the university. These external representatives should be in a position to share current and anticipated priorities for future hires.
  • It is anticipated that the committee will begin by creating a list of guiding principles and practices that will serve as a framework for decision-making about AI.
    • More than 130 higher education organizations, administrators, researchers and faculty members from 44 countries have collaborated to produce a set of core principles to guide the development of artificial intelligence policies and practices at colleges and universities. The statement of principles was released Oct. 9, 2023, at the 18th annual United Nations Internet Governance Forum in Kyoto, Japan. Universities are asked to consider, revise and adopt such a framework:
      • People, not technology, must be at the center of our work
      • We should promote digital inclusion within and beyond our institutions
      • Digital and information literacy is an essential part of a core education
      • AI tools should enhance teaching and learning
      • Learning about technologies is an experiential, lifelong process
      • AI research and development must be done responsibly
  • A librarian who will curate the materials brought to and reviewed by the committee, ensuring that what will quickly become a large repository of knowledge, information, and perspectives can be reviewed and applied effectively and efficiently by internal and external groups and organizations. The librarian will also oversee the process of keeping faculty, staff, students, and interested or affected external persons and organizations regularly and fully informed of the committee’s work.
  • Linkages must also be created to IT support units, public information offices, academic governance entities, and the offices of the provost/VPAA, president/chancellor, and governing board.

Developing such an administrative structure cannot wait. New technologies and applications are released daily. Universities must identify, support, and teach the appropriate, effective use of those technologies and applications so that students are prepared to use them when they complete degrees or certificates, and so that the university can operate as efficiently and effectively as possible. Who is leading this important initiative at your institution? Are you as prepared as you believe you should be for the AI age that is upon us?

 

This article was originally published in Inside Higher Ed’s Transforming Teaching & Learning blog. 

If you are trying to pick a new registration system for your Continuing and Professional Education unit, I promise this article will be worth your time. 

I’ve led a number of organizations through this process (three as a decisionmaker and dozens of others in advisory roles) and I can assure you it’s not an overstatement to say that it’s a seismic decision. It requires alignment between leadership, IT, registrar, admissions, programmers, marketing, and your front-line, student-facing team. If done poorly, it has the potential to be disruptive to day-to-day operations during the implementation process and beyond. And, on the flip side, I would say it has the biggest potential to unlock efficiencies for your CE unit and simplify your student journey.

My passion is using my personal experience to give schools all the relevant information they need to make the best possible decision. So, without further ado, here are the things I wish I knew when purchasing registration software for continuing education for the first time. In other words, watch out for these registration red flags.

🚩 1. DON’T be caught off guard by hidden costs…

Right before I left my last Dean position, we were in a full-blown RFP search for a continuing education registration system. At all my past institutions, I’ve had them, tried them, and experienced firsthand the gap between what is promised at the demo and what you get after the contract has been signed.

During the demo, it all sounds great. You’ll pay X monthly, for Y number of students, with Z for implementation costs. But soon after the ink is dry, fees start adding up. Need a report you thought was standard? That’s going to cost you. You couldn’t handle something they told you was easy, like data migration? That’ll be another line item.

This happens all the time. ALL THE TIME.

And it’s actually kind of easy for companies to get away with it because the software buying journey is so complex that it often feels like you, the new client, made a mistake, accidentally delayed a process, or just forgot something when drawing up the contract. So most people just shrug their shoulders and pay, even when that’s not fair to them. Don’t believe me? Just ask anyone who has recently undergone an implementation process if they had to pay more than they expected in year one (and we aren’t even talking about subsequent years yet). If you can find someone who paid what they expected, find my contact information at the end of this article and I will Venmo you for a coffee.

🚩 2. …But DO go on the offensive against hidden costs

One way to avoid feeling like you were misled and overcharged starts with your business requirements document (BRD). For the uninitiated, a BRD is just what it sounds like – a list of non-negotiable features you expect out of the software. If it sounds like a pain, that’s because it is – you need to spend a lot of time listing all the features you need within a single document and, ideally, connect a user and a process to each feature.

Be as detailed as possible in describing each requirement so that when the company comes to demo, they can actually show you the functionality, and so that it then becomes part of the final contract. (More on this later.)

When it comes time to sign on the dotted line, all functionality should be listed *in the contract* with associated support hours you’re getting as part of the package. That way, you can be confident that 1) the requirement is actually available and 2) they will be there to support you in the learning and use of each requirement, without incurring additional costs. It also helps to speak to a few other clients of the company to learn about what extra costs they may have incurred to make sure you protect your school from similar ones.

🚩 3. Tighten up your requirements list

I know we are backtracking a little bit, but this topic needs its own additional section because of how important it is. A requirements list is the most critical piece of documentation you will need to make the best decision for your organization.

During my time in Continuing Education, I’ve seen hundreds of requirements lists. To me, they are like reading an article from The New Yorker – lengthy, detailed, maybe a bit boring in parts, but overall enjoyable, because you know that by the end of it, you are going to be excited once everything you’ve read finally comes together in a concrete way.

The most important piece of advice I can give you about your list is to include all the requirements you need right now – but don’t forget to add the “blue sky” features you hope for in the future. For any idea you have that would make a product more helpful to you (think room setup, program success validation, etc.), list it as an item and tell the story of why you want it.

Then once you have your list, each feature or requirement needs an “owner” or subject-matter expert to accurately describe what it does and why it is needed. If you want extra credit, a workflow would be extremely helpful so any company can visualize the need and figure out why you are asking for that specific feature.

Don’t worry about the length of the document – trust me, they’ve seen longer – because ultimately it pays to be very thorough about the things you are looking for in your next registration system. 

🚩4. Demand to be shown, not told

I already mentioned it, but you want to see each item on your list in action and demo’d in front of you. Yes, the demo can get long, but the more you see, the better. If they say they can do it but don’t show you? It most likely means they don’t have it. 

And if they say that it is on the roadmap, proceed with caution. As we all know, a roadmap is simply a document listing ideas the company wishes to put into its product someday. Take time to ask: How long has it been on the roadmap? How often do they develop features from the roadmap? What percentage come from clients? Those answers should give you a good idea of the roadmap’s direction.

It’s even better if a company can let you “try-before-you-buy” to ensure the very expensive software you’re purchasing is intuitive and enjoyable to use. This is very common in software-as-a-service (SaaS) outside of Continuing Education, so why should other folks get to have all the fun?

🚩 5. Think tomorrow, not today

Yes, you need to take into account your needs today, but you also need to recognize that your student registration system should grow with you. You wouldn’t take your bike out on the highway, so why would you put your organization on a software platform that can’t grow at your same pace?

It’s the difference between surviving on a flip phone versus harnessing the power of a smartphone. Or Microsoft Clippy versus ChatGPT. It’s a registration platform that can be easily customized (rather than racking up customization fees) and can be continuously iterated on to support your everyday needs and fast changes in the market. 

If you’re about to get on board with a new business, it’s your right (and I would argue your responsibility) to know if you’re hopping on a bullet train or a coal-powered clunker that has years and years of technical debt, kicked down the road with no end in sight. Throw in an acquisition or two, and you have a codebase that’s patched together, fragile and complex. If you want to know why this makes things complicated, think about doing a 500-piece puzzle. Someone tells you they want to add one more piece, but the only way to make it work is to take apart a large portion of the puzzle to make the new piece fit. Rinse, repeat.

Again, I get that some schools like working with more traditional companies that have been in business for decades, and every organization needs to do what is right for them. But know the trade-off you’re making, and get out in front of it and be proactive when asking questions about what features you’ll require and what your expectations for the future should be.

🚩6. And lastly, refuse to settle

Getting into a new CE registration product means entering a new long-term relationship where you’re heavily courted before it officially starts. Year one is a whirlwind, and the subsequent years can feel like a poorly conceived arranged marriage, filled with what-ifs, regrets and dissatisfaction.  

Don’t fall victim to it. Ask for a current client list (and a list of institutions who recently decided to move away from their software) and add those people on LinkedIn to start a conversation. Browse G2, TrustPilot and other software review sites, where you’ll find reviews from real users. And don’t be shy at CE conferences or in online members-only forums.

Take your time. Have your information ready. Ask a lot of questions. And talk to everyone. If I can help, please reach out. At the very least, you might get a free coffee out of it! 

 

Dr. Meni Sarris is a 25-year veteran and former Dean of Continuing and Professional Education who decided to step away from the Dean’s desk to make a bigger impact on the education landscape overall. A believer in education beyond degrees (also the name of his podcast!), Meni firmly believes the education landscape will continue to evolve because students want more flexibility, accessibility and personalization in how they get an education or professional development. Currently, Meni lives in Chicago, where he runs the Spur Education Group as the company’s principal, leading a team of CPE, EdTech, and marketing experts. He is also the founder of Redge, a newly launched registration software company offered as part of the Spur Education Group.

GenAI is quickly becoming a daily fixture in the lives of administrators and faculty. It enhances productivity, creativity and perspectives.

In writing this article, I sought the advice of Google Bard, Perplexity and Claude 2. In all of my research using GenAI, I use at least three of the established apps. This enables me to spot any responses that seem too far out of line or not credible. By spreading my research across multiple Large Language Models, I can better ensure that I am not being led astray. Over time, this may not be necessary, but while the apps are still being fine-tuned, I feel most comfortable being able to compare results.

Bard uses the PaLM 2 LLM. ChatGPT+ and Perplexity Copilot use versions of GPT-4. Claude 2 is powered by Anthropic’s proprietary LLM. Using multiple chatbots with different underlying Large Language Models helps to provide a diverse set of perspectives and responses to the same prompt. It only takes a minute to get a full response (even in the case of Google Bard, which by default gives three draft responses to each prompt). Then, using follow-up prompts, you can drill down for clarifications and citations.
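For readers who would rather script this routine than paste the same prompt into three browser tabs, here is a minimal sketch of the fan-out-and-compare workflow described above. The `ask_*` functions are hypothetical placeholders, not real SDK calls; you would swap in the actual API or SDK call for whichever services you use.

```python
# A sketch of sending one prompt to several GenAI apps and comparing the answers.
# The ask_* functions below are hypothetical placeholders -- replace each body
# with the real API/SDK call for the service you actually use.

def ask_bard(prompt: str) -> str:
    return f"[Bard placeholder response to: {prompt}]"

def ask_claude(prompt: str) -> str:
    return f"[Claude 2 placeholder response to: {prompt}]"

def ask_perplexity(prompt: str) -> str:
    return f"[Perplexity placeholder response to: {prompt}]"

def compare_responses(prompt: str) -> dict[str, str]:
    """Send the same prompt to each app so outlier answers are easy to spot."""
    apps = {"Bard": ask_bard, "Claude 2": ask_claude, "Perplexity": ask_perplexity}
    return {name: ask(prompt) for name, ask in apps.items()}

if __name__ == "__main__":
    prompt = "Give examples of how university deans and faculty could use GenAI."
    for app, answer in compare_responses(prompt).items():
        print(f"--- {app} ---\n{answer}\n")
```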

There are many ways in which we can make effective use of AI every day as higher education faculty and administrators. AI can give us help even with some of the most challenging problems we face. For example, the Undergraduate Dean at John Jay College, Dara N. Byrne, writes that when she started in her position the senior graduation rate was 54%. She used predictive AI to identify students who were most likely to drop out. In a deep data dive, she teamed with DataKind, a nonprofit using data science and AI, to select students for proactive interventions. This resulted in increasing the senior graduation rate to 86%. 

The prompt I submitted to each of the three apps was: “I am writing an article about productive uses of generative AI. It is for an audience of university deans and faculty. This will be published in Inside Higher Education. My goal is to give those who have less experience with GenAI a sense of how it might be useful in their everyday work. Give me examples how these higher education workers could use GenAI.”

Following are sample excerpts from the results of each of the three apps I consulted.

From Claude 2 on October 29:

Research and writing – Generative AI could help faculty and researchers quickly synthesize literature reviews, generate hypotheses, outline papers and proposals, and draft sections of manuscripts. It could save time and enhance productivity. However, human creativity and critical thinking are still essential.

Administrative tasks – For deans and university leaders, generative AI could help draft communications, generate data insights from reports, compile meeting agendas and minutes, and more. It could save time on logistical tasks to focus leadership efforts elsewhere.

Personalized learning – Generative AI tutors could provide customized explanations, practice, and feedback to students, adapting to their strengths and weaknesses. This could promote self-paced mastery and free up faculty time.

The key for higher education is using generative AI thoughtfully and ethically to augment human efforts. With proper oversight and care, it has much potential to enhance teaching, research, creativity, and productivity. But the human touch remains irreplaceable for true learning, discovery, and decision-making.

From Perplexity on October 29:

Automating Tasks

Automated Grading: GenAI can be used to grade assignments, essays, and exams, freeing up faculty time to focus on other tasks. This can be especially useful for large classes where grading can be time-consuming and tedious.

Chatbots: GenAI-powered chatbots can be used to answer student questions, provide support, and even offer personalized recommendations. This can help reduce the workload of faculty and staff, while also improving the student experience.

Improving Student Outcomes

Personalized Learning: GenAI can be used to create personalized learning experiences for students, based on their individual needs and learning styles. This can help improve student outcomes and engagement.

Early Warning Systems: GenAI can be used to identify students who are at risk of dropping out or failing a course, allowing faculty to intervene early and provide support.

Enhancing Research

Data Analysis: GenAI can be used to analyze large datasets, such as those generated by scientific research. This can help researchers identify patterns and insights that might be missed by human analysis.

Natural Language Processing: GenAI can be used to analyze and understand large amounts of text, such as research papers, articles, and books. This can help researchers identify trends and insights that might be missed by human analysis.

As you can see, the emphases of the results vary considerably. Of course, different prompts will result in different responses, and different apps will emphasize different interpretations of your prompt. Even an identical prompt run again in the same app a few minutes later will produce different results.

The primary key to success in using GenAI is to develop the prompt carefully. The second key is to use more than one GenAI app. A third is to use generative AI often, daily if possible. Through repetition, you will refine your prompt-writing and learn which app is likely to give you the response you seek for the questions and tasks you have.

This article was originally published in Inside Higher Ed’s Transforming Teaching & Learning blog. 

A recent survey of students who stopped out of college suggests that giving them the opportunity to earn microcredentials and academic credit for prior learning could lure them back to higher ed.

The survey of roughly 1,100 former students ages 18 to 64 was conducted in June by StraighterLine, an online course provider, and UPCEA, an association that focuses on professional, online and continuing education. The resulting study, released this week, explored when and why students left college and what factors could prompt them to return.

The study comes at a time when the number of American adults with some college credits but no credential has grown to a whopping 40.4 million, according to 2021 data from the latest National Student Clearinghouse Research Center report. 

Read the full article.

A new survey of adults who’ve dropped out of college finds that 57% have completed about half or more of the credits needed to complete a degree program. Of this group, the majority indicated they wanted to return to college to finish their degrees.

Those are two of the main findings from a survey conducted by UPCEA and StraighterLine, a provider of online college courses. The study, entitled “Disengaged Learners & Return Paths to Higher Education,” was conducted in June 2023 using an internet panel of about 1,100 adults between the ages of 18 and 64 who had been enrolled in college at one time but stopped out before completing a degree. Read more.