Scire responsum non significat intelligere quaestionem.
Knowing the answer doesn’t mean understanding the question.
I put that inquiry to AI, using a specific language model for learning.
Why is this relevant to creating a use case and then sending it over to the reader, the person tasked with responding?
They see the use case. They supposedly know their system. That is highly questionable: product dev doesn’t always tell them what is in the pipeline to roll out, or the responder (often the salesperson) doesn’t know the system inside and out and thus relies on someone else, say a solutions consultant, to do the job for them, without any Q/A (Quality Assurance) to validate it.
What you end up with is a use case seen by someone who ‘knows’ the answer yet, unbeknownst to you, doesn’t really understand the question.
Your use case is the context, but what if someone new to the industry, and new to learning systems, is presented with this use case?
Will they really understand the question?
A high probability says no.
What, then, is missing from the use case, so that the person who is presented with the question can, in return, present the answers accordingly?
Nobody likes writing use cases. They can be an experience of dread, even before procrastination takes hold.
Your Subject Matter Experts present feedback that is usually jargon and statements only they know the answer to, again, without understanding the question.
Then you have other departments, or worse, a committee.
Each individual states what they need, knowing full well (or at least they should) that this is not a buffet where everything they want goes into the use case.
The problem, though, is that the person writing the use case may shove all those items into the case, which then becomes a new book written by James Joyce (good luck with that read).
Off goes your use case.
Into the hands of a person who reads it and would never admit they lack understanding of what you need. Or, in a real-life scenario, some vendors have ready-made responses they plug in and send back (common with those RFPs you send before or after your use case).
Yes, a use case has to be specific, but it also needs to reflect an understanding of the question, even if you, the writer, know the answer.
The answer isn’t the system itself – that is why the use case probe is being sent – rather, the answer is the use case you wrote based on what you know, or assume to know, and it therefore should be clear to anyone.
Can AI then assist in a manner that eliminates the lengthiness and presents the findings in a clear and concise format, one that recognizes today’s speed-read approach?
And in return, can the person who bases their conversation with you on the use case itself respond with questions that really add depth?
Let’s be real, there are people who ask AI to create their use case for them based on a set of criteria, or the output becomes a template that the person overseeing the use case plugs in, saving time and keeping the format.
The vendor, in turn, may take that extended use case and stick it into any of the free AI tools, request a summary, and tada – fast and efficient.
That ignores the potential for mistakes, hence the need to do a re-check.
While there are plenty of sites and people providing AI prompts for anyone to use, and even teaching people how to use prompts – format, structure, etc. – none of these will really help you with your use case.
All this goes back to the Latin statement – Scire responsum non significat intelligere quaestionem.
You do not need to create an agent or use an agent (you may have heard that term) to generate various responses that can be sent in place of your use case.
Ask more, and see more.
View the Entire Booklet – The Prompts I used, the results and more – Knowing (available for download)
Your Use Case
Recognizing that it is all about speed and limited focus these days, I wrote a prompt with a series of words. This was after I uploaded my use case to a multimodal model (basically, it accepts files, etc.), which in turn outputs a response.
Summarize use case, including specific requirements by user role, key challenges, explicit and implicit goals, expected outcomes and essential questions to address with potential LMS vendors.
The focus here is to get right down to the nitty-gritty.
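The prompt above was run in an AI chat tool after uploading the use case. If you would rather script the same step, here is a minimal sketch, assuming the OpenAI Python SDK, a plain-text export of the use case saved as use_case.txt, and a placeholder model name (the file name and model are my assumptions, not part of the original workflow):

```python
# Minimal sketch: send a plain-text use case plus the summary prompt to a chat model.
# Assumptions: OpenAI Python SDK installed, OPENAI_API_KEY set, use_case.txt exists,
# and the model name is a placeholder you would swap for whatever you actually use.
from openai import OpenAI

client = OpenAI()

with open("use_case.txt", "r", encoding="utf-8") as f:
    use_case_text = f.read()

prompt = (
    "Summarize use case, including specific requirements by user role, "
    "key challenges, explicit and implicit goals, expected outcomes and "
    "essential questions to address with potential LMS vendors."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are helping summarize an LMS use case for a vendor."},
        {"role": "user", "content": f"{prompt}\n\nUse case:\n{use_case_text}"},
    ],
)

print(response.choices[0].message.content)
```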
Based on my prompt, the response starts with a very high-level and short executive summary (remember, this is going to the vendor).
Prompt: Summarize the use case in a digestible format, identifying the challenges, risks, outcomes, and impact of learning for the employees, customers, and the company itself.
Present results in a bullet format with headers for each section, reading time no longer than 15 minutes. The reader will have limited information about our company.
For these results you will see information that can be sent to the vendor, information for your internal processes and purposes, and data points for you to track.
What types of items do I place into my prompt or prompts?
Here is a short list that I used with my prompts, drilling down where needed (a sketch of stitching these into a single prompt follows the list):
- What is the overall goal – outcome?
- What are the challenges? – From IT to Implementation to Context of learning?
- What are the key takeaways?
- What are the security and infrastructure issues?
- What do you want the new system to solve?
- How can a vendor understand what you are presenting?
- How can they respond in a way that ensures it is them responding, and not AI in any manner, including in their follow-ups?
- Will their answers spur you to think of additional questions to address those inquiries?
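As referenced above, here is a small sketch of how that checklist could be stitched into a single prompt. The item wording, variable names, and structure are illustrative only, not a fixed template:

```python
# Sketch: assemble the checklist items into one prompt to send alongside the use case.
# The item wording and section labels are illustrative, not a required format.
checklist = [
    "The overall goal and expected outcome",
    "Challenges, from IT to implementation to the context of learning",
    "Key takeaways",
    "Security and infrastructure issues",
    "What the new system must solve",
    "How a vendor can understand what is being presented",
    "How the vendor can respond themselves, without leaning on AI for follow-ups",
    "Questions their answers should spur us to ask next",
]

prompt = (
    "Review the attached use case and address each of the following, "
    "drilling down where needed:\n"
    + "\n".join(f"- {item}" for item in checklist)
)

print(prompt)
```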
Do I or Do I not?
There is no doubt there will be people who wonder whether they should send their use case over to the vendor anyway, even with these specifics.
My response is yes. Think of it as a way for them to get additional context.
The goal of the use case prompts is to present specific statements, summaries, questions and other information they may not be able to extract.
Or if they do, to ensure everything is crystal clear.
If you have specific implementations, state that in your use case.
“We use Workday, ADP and WidgetSystem, which will need to be integrated into the LMS (or any other type of learning system).”
The Use Case Analysis
My prompt
Present use case in short bytes for a quick overview for the reader, design in a micro-learning approach, with objectives followed by preferred goals, stakeholders involved, challenges overall and specific job roles, and outcome required
Then I was able to break it down further, including changing the format of specifics, say outcomes vs. challenges, into a compare-and-contrast table using icons where applicable.
You can really do quite a bit, as long as you have the initial information – which you do, it’s that use case – and then tap into the ideal prompts to get you to the next stage.
Salesperson Questions
Remember, AI today cannot provide deep thinking; thus, to ensure that a team of folks at the company (the vendor) will need to work together to respond accordingly, you can set up a prompt to develop the questions (always review before accepting) and the expectations.
I added some details, including the salesperson’s level of product knowledge and the total amount of time to read.
Before going on to the demo, knowing the answers to these questions would be extremely important. Make it a requirement they must respond by X date (typically two to three weeks) for consideration.
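As a sketch of how that setup could be parameterized (the variable names, default values, and header wording are my own illustration, not a prescribed format):

```python
# Sketch: a parameterized prompt for generating vendor/salesperson questions.
# product_knowledge, reading_time_minutes, and response_deadline are illustrative placeholders.
product_knowledge = "moderate product knowledge"  # salesperson's assumed level
reading_time_minutes = 15                         # total time to read the question set
response_deadline = "two to three weeks"          # the window noted above

prompt = (
    f"Based on the attached use case, develop questions for a salesperson with "
    f"{product_knowledge}. The full set of questions should take no more than "
    f"{reading_time_minutes} minutes to read. Group the questions under headers "
    f"such as integration, implementation, scalability, user adoption, competitive "
    f"differentiation, and risk. Note that responses are required within "
    f"{response_deadline} for consideration."
)

print(prompt)
```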
The question headers (specific questions under each header):
- Integration and technical architecture
- Implementation and Reality
- Scalability and Performance
- User Adoption and Change Management
- Competitive Differentiation
- Risk and Failure Mode
Let’s look at one question under the user adoption section:
Manager Dashboard Abandonment Pattern
Based on your customer data, what percentage of managers log into their dashboards more than once per month after the initial 90-day launch period?
What distinguishes high-engagement manager populations from those who never return—and what intervention strategies have proven effective to combat dashboard abandonment?
The use case presents this as an issue that needs to be resolved, but while specific, it leaves room for the vendor to respond in generalities.
This question goes further in, again requiring the vendor to respond from a real-world environment, not with a general “everybody does this” retort.
Let’s say though you want a different approach and style of questions to be sent over to your salesperson.
My Prompt
Present a series of questions, based on the response that requires the reader to provide in-depth answers, validating that they understand the information presented to them.
The questions should be clear and concise. Show each question in a bullet point format with spacing between each question and a header.
The findings, i.e. the questions, were, I felt, better than my initial sales questions, though that depends on your variance and choices.
Example of another type of sales question (think a different approach):
Output Example
Question – Explain how the absence of HR system integration creates a cascading effect of administrative challenges throughout the LMS. In your answer, identify at least three specific downstream impacts beyond just manual user creation.
This goes to the salesperson; they receive the information, i.e. the questions, and thus need to respond.
Another example – once again, from your use case
Employee Impact Trade-Offs
Question – While the LMS provides numerous benefits to employees, identify two potential negative impacts on employee experience that are not explicitly mentioned in the document. Explain how these could undermine adoption despite the positive features.
Additionally, you can dive further in and extract the information of use to you and to the vendor, sending the use case along with questions for the vendor to respond to.
Impact of Learning
Prompt: With my use case, what will be the impact of learning on the business?
Pretty specific, right to the point, and it uses the recommended approach of Impact of Learning rather than the usual ROI, which doesn’t apply well enough for showing the pluses (which you always want to do).
I followed up by asking ‘Summarize the text, specific bullet points for each category – Takeaways, no more than a five minute read – highlight the big wins.’
As you can see, I asked for specific information that I wanted extracted and a maximum time to read it.
Big Wins
- Transforms training into a revenue generator and strategic differentiator
- Drives measurable improvements in efficiency, compliance, retention and client value
- Enables rapid scaling and innovation without proportional increases in headcount
- Supports data-driven talent and business decisions
Now ask yourself: what would you rather have a senior exec see – a general ROI with no specific outcomes, or “Big Wins” tied to the business itself?
What else did the prompt provide me?
Quite a bit, but I am only going to highlight one area any business leader would buy into.
Talent Development and Employee Retention
- Personalized Learning Paths – Course recommendations based on roles and skills reduce turnover by 10% to 15% ($500K to $1M saved/year)
- Skills Dashboards – Employees see their growth, increasing internal promotions and reducing recruiting costs by 20% to 30%
- Just-in-time Learning – Faster onboarding (from 90 to 60 days) and immediate access to sales/product training
- Knowledge Retention – Spaced repetition and manager follow-up double retention rates and lift sales quota attainment by 15% to 20%
- Peer Learning and Gamification – Community features and competitive elements boost engagement and participation by up to 60 percent
Again, all of this is tied to the impact of learning, using specific prompts – all from your use case.
These prompts are not the usual “hey, do this” or a list of generalities; rather, it is about knowing your subject and inquiring to see actionable results and possibilities.
Can I do this with customer training/client training?
Absolutely. This isn’t limited only to employees, or a business that has internal and external (as my use case example does).
If you plan to charge X price for the content, state that, and follow up with the impact of learning prompt.
I’d even ask something like: to generate $250,000 for the year, what should my mark-up be, i.e. the increase between the actual cost to you for the content (think build) and what you charge?
On a personal note, I can state that you must at least break even. Thus, if the vendor is charging you $25 per learner (per year), then you can charge $39.99 and generate revenue.
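As a back-of-the-envelope sketch of that math, using the $25 vendor cost, $39.99 price, and $250,000 target from the text (the learner counts are plain arithmetic, not a forecast):

```python
# Back-of-the-envelope pricing math using the figures from the text.
vendor_cost_per_learner = 25.00   # what the vendor charges you, per learner per year
price_per_learner = 39.99         # what you charge, per learner per year
revenue_target = 250_000          # the yearly target used in the prompt example

margin_per_learner = price_per_learner - vendor_cost_per_learner   # 14.99

learners_to_gross_target = revenue_target / price_per_learner      # roughly 6,252 learners
learners_to_net_target = revenue_target / margin_per_learner       # roughly 16,678 learners

print(f"Margin per learner: ${margin_per_learner:.2f}")
print(f"Learners needed to gross ${revenue_target:,}: {learners_to_gross_target:,.0f}")
print(f"Learners needed to net ${revenue_target:,} after vendor fees: {learners_to_net_target:,.0f}")
```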
The problem folks have is that they are either greedy or unaware and thus assume a higher price point will produce better numbers.
Sure, but you want to build mass here – use the Blue Ocean Strategy. I did, and generated nearly $1M in nine months, of pure profit.
What did I charge per course? $49.
Remember when Apple launched their music store? What did they charge per song? 99 cents.
What strategy did they use? Blue Ocean.
What if we have requirements with our use case, or just a set of requirements in an Excel file?
Not a problem.
Based on the prompt you can generate results and export as an Excel file.
Then use Copilot, if you have it, or create it in a visual style and format. Or have your Gen AI or Agentic AI create it as a PDF file, or retain the table for download.
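If the tool only leaves you with a table in the chat window, here is a minimal sketch for producing the Excel file yourself, assuming you save the table out as a CSV first and have pandas and openpyxl installed (the file names are placeholders):

```python
# Sketch: convert an AI-generated requirements table (saved as CSV) into an Excel file.
# File names are placeholders; requires pandas and openpyxl.
import pandas as pd

requirements = pd.read_csv("lms_requirements.csv")          # table copied out of the AI tool
requirements.to_excel("lms_requirements.xlsx", index=False)  # ready to share or refine further
```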
In today’s world, more and more reorganizations are impacting L&D and Training departments. The system ends up going to HRIS (often) or another department.
The person or people who will oversee the system lack any knowledge around L&D and Training.
You still have the use case, or, in this example, an Excel or Google Sheets requirements file.
What can you do?
Prompt: Shorten the response and design it to be read by someone working in HRIS with zero experience in using or buying an LMS. Be clear and concise. Focus only on relevant information in a summary that includes bullet points. Each section should be no more than 10 words. Write it in English.
Let’s say though you want the response to be in Latin American Spanish.
Simply replace “Write it in English” with “Write it in Latin American Spanish”.
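If you find yourself swapping the audience and language often, here is a tiny sketch of the same prompt with those pieces as parameters (the variable names are mine, purely illustrative):

```python
# Sketch: the same prompt with the audience and output language as parameters.
audience = "someone working in HRIS with zero experience in using or buying an LMS"
language = "Latin American Spanish"  # swap back to "English" as needed

prompt = (
    f"Shorten the response and design it to be read by {audience}. "
    "Be clear and concise. Focus only on relevant information in a summary that "
    "includes bullet points. Each section should be no more than 10 words. "
    f"Write it in {language}."
)

print(prompt)
```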
Proposals
Understanding and comparing proposal results can in itself be a tiresome and confusing process. Even savvy folks who have purchased systems before can easily be fooled into thinking one way, forgetting that the real cost (what you pay) is something completely different.
Then there are people who are new to the whole learning system space (learning tech too – after all, the prompts can be for any use case or specs) and are required to receive these proposals.
Which one do they pick if they lack the knowledge of how a system works, what the benefits directly to them are, and the costs?
I used three actual proposals that I received. The proposals are from 2016, so do not rely on the actual results, i.e. the costs and the recommendations based on the findings. Again, this is for reading/example purposes only.
Prompt: Summarize with recommendations, no more than a one-minute read (high level)
Prompt: Based on the EXAMPLEXXXS use case requirements and three vendor proposals, provide a comprehensive analysis comparing costs, capabilities, and alignment with our business needs.
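Here is a sketch of feeding several proposal files plus the use case into one comparison request. The file names and model name are placeholders, and the API call mirrors the earlier summary sketch:

```python
# Sketch: combine the use case and several proposal files into one comparison prompt.
# File names are placeholders; the API call mirrors the earlier summary sketch.
from pathlib import Path

from openai import OpenAI

client = OpenAI()

files = ["use_case.txt", "proposal_vendor_a.txt", "proposal_vendor_b.txt", "proposal_vendor_c.txt"]
documents = "\n\n".join(
    f"=== {name} ===\n{Path(name).read_text(encoding='utf-8')}" for name in files
)

prompt = (
    "Based on the use case requirements and the three vendor proposals below, provide a "
    "comprehensive analysis comparing costs, capabilities, and alignment with our business needs.\n\n"
    + documents
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```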
Bottom Line
Using general prompts you can find on the internet only works in our industry if you know your use case, present it, and seek specific insight which will be sent to the vendor.
Always remember: time is essential these days, speed is a necessity, and mistakes can happen – which impacts your response.
Make sure to review your results prior to sending to verify that the information presented is accurate.
Despite what you hear, no Gen AI or Agentic AI is 100% accurate all the time. They may make mistakes.
The impact on you, if you ignore that, is what happens when an employee is taught or provided with a list of prompts: a real possibility that the output isn’t real.
The prompts, yes.
The answers, no.
E-Learning 24/7







