To preface this course: this is a thought-provoking piece, not a complete course but a conversation about AI and instructional design. The sample is not meant to teach; rather, it shows the tools available in this new digital landscape and opens a dialogue.
As AI continues its expansion into every sector of life, the role this technology will play in the field of instructional design has yet to be fully realized. Some see AI as a support tool, while others see it as a replacement for education and instruction. This course explores what it looks like when we replace the human with a machine, and what the implications are for instructional design.
- Responsibilities: AI prompt writer
- Target Audience: Human beings learning about budgeting
- Tools: Gemini, ChatGPT, Colossyan
- Budget: Low
- Year: 2025
Background
In corporate settings, countless training hours are spent exploring and adopting new AI tools. Senior leaders push for its adoption at every level, promising that with adoption, bottom lines will grow. But many of us in human-centered fields see the pitfalls of this technology clear as day.
​
As an instructional designer, I have wondered: by adopting these tools, am I pushing myself out of existence, or am I creating AI slop and calling it effective instruction? This thought led me to a personal exploration of AI and its impact on the field of instructional design.
​
​
Exploring Prompting
To begin, I chose a topic: a simple budgeting course focused on putting money into four buckets, one for essentials, one for lifestyle, one for short-term savings, and one for long-term savings.
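To make the four-bucket idea concrete, here is a minimal sketch of the split in code. The percentages are hypothetical placeholders of my own, not figures from the course content:

```python
# A minimal sketch of the four-bucket budgeting split.
# The fractions below are hypothetical examples, not the
# percentages taught in the actual course.

def split_budget(income: float, fractions: dict[str, float]) -> dict[str, float]:
    """Allocate a monthly income across named buckets by fraction."""
    if abs(sum(fractions.values()) - 1.0) > 1e-9:
        raise ValueError("Bucket fractions must sum to 1.0")
    return {bucket: round(income * f, 2) for bucket, f in fractions.items()}

buckets = split_budget(3000.00, {
    "essentials": 0.50,           # rent, groceries, utilities
    "lifestyle": 0.30,            # dining out, hobbies, subscriptions
    "short_term_savings": 0.10,   # emergency fund, upcoming expenses
    "long_term_savings": 0.10,    # retirement, investments
})
print(buckets)
```

Whatever fractions a learner chooses, the point of the model is that every dollar of income lands in exactly one of the four buckets.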
​
I began by using Gemini to create a project outline based on the following prompt.
​
Prompt: You are a seasoned instructional designer tasked with creating a human centered course on budgeting and personal finance. I would like an outline for a course that focuses on the four buckets of budgeting (essentials, short term, long term, and spending money). The course should be no longer than 15 minutes in length. Assume your audience is adults over the age of 18 with general understanding of financial terms. Do not hold back or worry about hurting my feelings. ​
​
***LLM insight: LLMs like Gemini are designed to agree and to validate feelings. I wanted a direct and honest answer, so I ended the prompt with "Do not hold back or worry about hurting my feelings."***
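The prompt above follows a repeatable pattern: persona, task, constraints, audience, and a bluntness instruction. Here is a small sketch of that pattern as a reusable template. The function name and fields are my own framing for illustration, not a Gemini API requirement; the assembled string is simply what you would paste into or send to the model:

```python
# A sketch of the prompt pattern used in this project.
# Structure and field names are illustrative, not an API contract.

def build_prompt(persona: str, task: str, audience: str,
                 constraints: list[str], blunt: bool = True) -> str:
    """Assemble a persona-task-audience prompt as a single string."""
    lines = [f"You are {persona} tasked with {task}."]
    lines.extend(constraints)
    lines.append(f"Assume your audience is {audience}.")
    if blunt:
        # Counteracts the model's tendency to agree and validate.
        lines.append("Do not hold back or worry about hurting my feelings.")
    return " ".join(lines)

prompt = build_prompt(
    persona="a seasoned instructional designer",
    task="creating a human-centered course on budgeting and personal finance",
    audience="adults over the age of 18 with a general understanding of financial terms",
    constraints=["The course should be no longer than 15 minutes in length."],
)
print(prompt)
```

Keeping the template explicit made it easier for me to vary one element at a time (for example, the bluntness line) and compare how the model's answers changed.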
​
Gemini produced the following outline.
[Gemini's generated course outline]
Designing the Module
Since I was focused on what AI would do, I continued prompting to build out each of the slides in the course. The words used in the module are all AI-generated.
​
Personal Takeaways for Prompting: I purposely chose the topic of budgeting for its duality. On one side it is a logical practice; on the other, it is an emotional one in which people make monetary decisions based on feelings. Not all financial decisions are purely logical, and not all are purely emotional. As a seasoned instructional designer and human being, I know this to be true from my own experience and the experience of other human beings. AI has not experienced the emotionality of money or the way emotions shape financial decisions and stability. It looks at money as only a logical problem. Nowhere in the outline does it talk about what to do if you fail, nowhere does it address the disappointment of not meeting goals and how to handle it, and nowhere does it mention the emotions and interpersonal friction financial decisions can cause.
At first I chalked this up to a prompting error, but the more I prompted for an emotional response, the more Gemini's answers felt unrealistic, robotic, or just plain off. I rewrote prompts over and over to elicit a response that felt more human and real. However, the time I took to do this was far greater than if I had simply written the slides myself.
My overall takeaway is that AI prompting can be a valuable tool for straightforward information, but the human touch of experience, relatability, and emotional depth is one of the most valuable tools an instructional designer brings to their work.
The more I learn about AI, AI prompting, and AGI, the more I would like to see how instructional designers use their talents to bridge the human experience with these new tools.
During my time at Optum, I was privileged to explore emerging technologies, one of which was Colossyan.
Colossyan is an AI video platform that allows users to create training and communication videos using virtual avatars. It can generate text-to-speech voiceover, translate training content into multiple languages, and add interactive elements to video content. It is a very powerful tool that was used by some departments in my previous organization.
​
I wanted to explore Colossyan for this project because in the past I have designed and worked on projects that used a human being filmed against a green screen. I later took those recordings and created background images and visual supports. The process can be tedious, and I looked for ways to make it easier.
Using this emerging technology, I thought of making a digital clone of one of our in-person corporate trainers. I could then use that clone to present information to learners so they would have continuity between in-person and virtual training experiences. While I never got that far, I did use premade avatars from Colossyan's library to see my idea through. If I liked these avatars, I could champion having our in-person trainers lend their name, image, and likeness to the program. Many companies already have scans of their C-suite members and use this program to send video messages to employees enterprise-wide. It is not far-fetched to explore this idea at a smaller scale. However, this technology is not without a downside.

Real Human on Green Screen

AI Generated Human in Colossyan
In the images above we can see the difference in visual quality. While the AI image stands out for its clarity and crisp coloring, on playback the avatar is noticeably not human. Its forced, repetitive movements and lack of emotional physical response are dead giveaways that this is AI (this is rectified to a certain extent in the paid versions, but it is not foolproof). While this approach is cost-effective, it may come across to learners as inauthentic and may cause them to disengage from the video altogether.
​
There are also ethical dilemmas in making digital clones: who owns the clone, the enterprise or the person who has been digitally cloned?
Personal Takeaway for Developing: There is high potential for this software to gain popularity and rapid adoption in the future. I could see C-suite executives, spokespeople, and in-person trainers being scanned into the program, and instructional designers using those digital clones to create training materials. With busy schedules, it may be easier to scan a person once and then have training and development teams create messaging and learning materials using the digital clone. While that can be great in the short term, it can also have drastic consequences we may not be thinking about in the present moment.
I could see a lot of misuse with programs like this as the technology gets better and better. If this program grows and becomes good enough to go undetected as AI, how can employees tell the difference between the real words of an executive and those of an AI digital clone? Could these digital clones be hacked? What if a disgruntled employee with access to these digital clones sends a terrible message to an entire enterprise; how will an organization handle that? How will those who have been scanned be protected against misuse or infringement of their identity? How does a company address name, image, and likeness concerns? What is the protocol for handling this? What if your digital clone says something you do not personally agree with? What is the ethical and moral standing of such a technology?
​
This type of technology unearths a lot of questions, and I don't know that we have thought through all the answers.
While it provides a lot of opportunity, there is an ethical undercurrent that needs to be properly addressed.
​
​
Takeaways
AI is already a part of the instructional design landscape and will be for the foreseeable future, but it is important to note that it is a tool, not a substitute. Organizations that view AI as a substitute for instructional design are missing the point. People don't take time out of their busy days to attend trainings to be wooed by AI creations; they attend trainings to gain information that helps them become better, whether in their job function, their interpersonal relationships, or their pursuit of new knowledge.
​
Learning at its core is a very personal thing. It is identifying a personal deficit, having the courage to explore ways to fill that deficit, setting aside ego, and allowing others to pour their knowledge into you. Learning is a personal environment that encourages growth and failure simultaneously. It is a very powerful thing unto itself.
Learning is also emotional. It can make the learner feel frustration, anxiety, depression, joy, pride, and even euphoria. Learning is not only acquiring information; it is a deeply emotional experience. Learning, for all intents and purposes, is a core human experience.
​
When it is all done by machines, we lose the human element and experience of learning. We perceive it as off or uncanny. We know it is part right and part wrong.
​
For instructional designers, our goal is to use AI as a tool so that we can focus our efforts on the experience of learning and the impact of that learning on our learners.
​
Until AI is fully human, instructional designers are here to stay.
​
​
