
Name of person/project that you are providing feedback for: _________

1. Briefly describe the training and evaluation 
2. Comment generally on what you like about your colleague's training module and if you think their evaluation can capture efficacy in the training. (If not, what could strengthen their evaluation/evaluation process?)
3. Share at least 2 aspects of the training that you find most effective and tell us why. 
4. If you have a critique/critiques, please share those and accompany your critique with suggestions for an iteration.
5. Finally, share 2 specific suggestions on how to make the best aspects of the training even better.
6. OPTIONAL: If you have any additional comments, thoughts, or constructive suggestions, please share them.

I have picked two people for you to reply to and have placed their posts below:



I posted the second one as a PowerPoint.

____________________________________________________________________________________

Below is an example of what it should look like:

Name of person/project that you are providing feedback for: Eddie Almada De La Vega

1. Briefly describe the training and evaluation

This training module, “Data Fluency for Decision Makers,” is tailored for non-technical business users at Kueski who need to develop foundational SQL skills to support faster, data-driven decision-making. The course addresses a key organizational bottleneck: the overburdening of the Analytics Engineering team with ad hoc requests from teams that lack data fluency. Through three interactive activities—writing SQL queries to solve real business problems, peer review and query redesign, and dashboard creation—the training scaffolds learning in a hands-on, outcome-focused manner. Participants use actual company data within Databricks, enabling immediate application to real-world scenarios. Evaluation occurs via Google Forms and includes activity-based assessments and a final training evaluation form. These tools capture learning outcomes, comprehension, and learner feedback. The training aligns with Kueski’s strategic mission to become a data-first fintech company by empowering decision-makers with the autonomy to generate insights, enhance agility, and reduce dependency on centralized analytics support.

2. Comment generally on what you like about your colleague’s training module and if you think their evaluation can capture efficacy in the training

Eduardo’s training module is thoughtfully crafted and aligns well with organizational goals and participant learning objectives. I particularly appreciate the real-world relevance of the content—participants don’t just learn abstract SQL concepts but apply them immediately to business challenges like tracking GMV or fraud rejections. This practical, problem-based learning model ensures that participants see the immediate value of their skills. The evaluation approach—via activity-based Google Forms—helps measure comprehension and engagement during each phase. However, while the assessments capture skill acquisition, they may not fully capture long-term behavior change or decision-making impact. To strengthen the evaluation, I recommend introducing a pre- and post-training quiz to measure knowledge gains and confidence levels. In addition, integrating follow-up manager feedback or observational check-ins (e.g., 30 days later) could help assess the degree to which learners are applying skills on the job. These enhancements would help demonstrate both efficacy and return on training investment.
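
To ground that point, the kind of query participants practice might look something like this rough sketch (the table and column names are invented for illustration; they are not Kueski's actual schema):

    -- Weekly GMV and fraud rejections over the last quarter (hypothetical schema)
    SELECT
      date_trunc('week', transaction_date) AS week,
      SUM(amount)                          AS gmv,
      COUNT_IF(status = 'fraud_rejected')  AS fraud_rejections
    FROM transactions
    WHERE transaction_date >= current_date() - INTERVAL 3 MONTHS
    GROUP BY 1
    ORDER BY 1;

A pre- and post-training quiz could ask learners to write or interpret a query of roughly this shape, giving a concrete before-and-after measure of skill.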

 

3. Share at least 2 aspects of the training that you find most effective and tell us why

Two elements of this training stand out as especially effective: (1) the use of real data from Kueski’s actual Databricks environment and (2) the peer review and redesign activity. By requiring learners to interact with real business data—rather than hypothetical examples—the training immediately boosts relevancy and confidence. Participants gain fluency in querying the exact tables and fields they’ll encounter on the job, which lowers barriers to adoption and accelerates practical usage. The peer review and redesign task is equally powerful, fostering a culture of collaboration and shared learning. This not only sharpens SQL logic through exposure to alternative solutions but also encourages critical thinking around readability, accuracy, and performance. Learners are encouraged to see query writing not just as a technical task, but as a strategic, iterative process. Both components foster applied learning and cross-functional understanding, crucial to building a truly data-driven decision-making culture across the organization.

 

4. If you have a critique/critiques, please share those and accompany your critique with suggestions for an iteration

One potential critique of the module is that while the technical exercises are excellent, the training assumes a baseline familiarity with tools like Databricks and SQL editors, which may not be true for all participants. For those completely new to querying or data environments, the pace and complexity could feel overwhelming. To make the learning more inclusive, I suggest adding a short optional pre-module or self-paced onboarding video introducing Databricks basics, SQL syntax, and navigation tips. This would level the playing field and help ensure all learners start with a shared foundation. Additionally, while the activities focus well on what to do, there could be greater emphasis on why certain SQL strategies are more efficient or insightful in a given context. Including a “trainer’s insights” section or a post-activity debrief after each task would provide additional context and depth, reinforcing strategic thinking alongside technical execution.

 

5. Finally, share 2 specific suggestions on how to make the best aspects of the training even better

To enhance the best aspects of this training—particularly the hands-on data querying and peer collaboration—I suggest the following two improvements. First, include a curated library of example queries or “SQL patterns” participants can use as references post-training. These patterns could map to common business questions and promote knowledge retention while accelerating real-world application. Second, build a peer support structure post-training, such as an internal SQL Slack channel or weekly office hours where learners can share dashboards, ask questions, and receive mentorship. This would extend learning beyond the classroom, strengthen cross-functional data fluency, and build a supportive culture of continuous improvement. Together, these suggestions would make the training’s strongest aspects even better, supporting lasting skill development and organizational impact.
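
For instance, one entry in such a pattern library might look like this (a hypothetical sketch with illustrative table names, not actual Kueski data):

    -- Pattern: daily total with a 7-day moving average
    -- Business question: is the metric trending up or down once daily noise is smoothed out?
    SELECT
      event_date,
      SUM(amount) AS daily_total,
      AVG(SUM(amount)) OVER (
        ORDER BY event_date
        ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
      ) AS moving_avg_7d
    FROM payments
    GROUP BY event_date
    ORDER BY event_date;

Each pattern could pair the query with the business question it answers, so learners adapt a working template rather than start from a blank editor.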

2nd example

Name of person/project that you are providing feedback for: Lavonzell Nicholson

1. Briefly describe the training and evaluation

Lavonzell’s session kicks off a three-part series designed to help staff feel more confident using AI, especially when it comes to writing better prompts. Module 1 lays the foundation by introducing a simple but powerful structure for building prompts: six key elements (Task, Context, Role, Instructions, Expectations, and Example). The training is super practical; it’s not just theory. Participants actually get to break down weak prompts, improve them, and apply what they’ve learned to their own work. There’s no formal test, but the learning is evaluated through hands-on practice and real-time feedback, which aligns really well with how adults learn best.
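
To make the framework concrete for readers who haven't seen the module, an invented prompt built from the six blocks might read something like this (my own illustration, not an example from the training):

    Task: Summarize the attached interview transcript.
    Context: The transcript is from a study of district leaders discussing tutoring programs.
    Role: Act as a qualitative research assistant.
    Instructions: Identify the three most common themes and one notable dissenting view.
    Expectations: A neutral, plain-language summary under 200 words, as a short bulleted list.
    Example: "Theme 1: Staffing shortages limit tutoring capacity ..."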

2. Comment generally on what you like about your colleague’s training module and if you think their evaluation can capture efficacy in the training. (If not, what could strengthen their evaluation/evaluation process?)

What I really liked was how clearly the training connects to both a real need (low confidence with AI) and a bigger goal (producing faster, sharper research). The structure is clean, the examples are relevant, and the activities are useful right away. Using prompt analysis and rewriting as a way to evaluate learning is a smart move; it’s low-pressure but still shows whether people are getting it. A quick self-check or peer review could help reinforce the learning, and even a few reflection prompts like “Can I spot all six components in a prompt?” would go a long way in helping people track their growth.

3. Share at least 2 aspects of the training that you find most effective and tell us why.

1. The six-block framework: It breaks down something that can feel abstract (writing good AI prompts) into clear, repeatable steps. It’s easy to remember, immediately useful, and exactly what adult learners need.

2. Ties to the bigger picture: The training doesn’t feel random or inconsistent. It’s clearly part of a larger effort to help CRPE work smarter and faster. That’s the kind of alignment that helps people see the value and stay engaged. It’s not just “learn this tool,” it’s “here’s how this helps us do better work.”

4. If you have a critique/critiques, please share those and accompany your critique with suggestions for an iteration.

The training could benefit from a bit more interaction. The exercises are a great addition, but adding some peer discussion or group critique (even virtually) would expand the learning. Tools like shared docs, polls, or breakout rooms could make it more dynamic.  Also, it might help to show the full roadmap of the three-part series early on. A simple “You are here” slide with a quick overview of what’s coming next would help learners see how this module fits into the bigger journey. And a downloadable cheat sheet or job aid would be a great resource for folks to reference later.

5. Finally, share 2 specific suggestions on how to make the best aspects of the training even better.

1. Make prompt comparisons into a mini-game: Turn the strong vs. weak prompt examples into a quick, interactive game. Show a few anonymous prompts and have people vote on which ones check all six blocks. It’s fun, low-stakes, and reinforces the framework.

2. Create a “Prompt Planner” worksheet: Give learners a simple template where they can fill in each of the six blocks before writing a prompt (a rough sketch follows below). It’s a great way to build muscle memory and helps them apply the framework consistently, especially as they move into the more advanced modules.
