Amanda Lester

Amanda Lester is the Associate Director for Member Engagement and Support at AACTE. She has extensive experience as an educator and in leading programs and improvement science initiatives.

Location: United States

Activity

  • Thank you for participating in the course, Sheelan! We are happy to learn that it was of help with your professional goals! :)

  • Thanks for this comment, Pippa. Often, it has been argued that qualitative data may have less reliability than quantitative data because of the possibility that it can be influenced by opinion or subjectivity. With this in mind, are there ways that the influence of opinion on the analysis of quantitative data can be reduced or mitigated, and are there...

  • Thanks for the great examples of what has worked, what has caused problems/challenges and tools and processes that you have questions about! I also benefitted from some of the suggestions with a key take-away that any tool used needs to be clearly portrayed, easy to navigate and clearly tied to advancing the action necessary to apply findings directly as a...

  • Thanks for sharing this resource, Miriam!

  • There are wonderful comments in this feed that help to set the stage for the next lessons in Week 3. One point that particularly jumped out at me was a discussion about what constitutes "good" findings. Because that concerns the phrasing used to introduce this section, I think it would be helpful to reflect on the use of the term, "good" in this context....

  • These are excellent comments! The concept of "good" can also imply usefulness of data as opposed to positive outcomes. In week 1, the influence of error was a recurrent and valuable discussion point in terms of thinking about how it can serve as a reflective tool in guiding the data collection and analysis process. In this sense, findings too can become...

  • You raise a very interesting question. It would also be interesting to delve into other possible effects of earlier testing results that could have influenced this shift, including modifications to the standardized test over time, as well as the effect of outside preparation resources that may be more readily available to families whose children may attend...

  • Thank you for your point which highlights the importance of having a clear and stable context for analysis as a prerequisite for reliably interpreting data. This point was raised in a few other comments as well. Without the presence of a reliable and consistent frame for analyzing data, it is a challenge to both build a hypothesis that can be applied...

  • Thanks for your comment, Pippa. I think you have raised a good point about the need for presenters to provide a thorough context for recipients to gain meaning from the data they present without the need for significant additional explanation. I would assert that expecting that is not picky at all, because it is the responsibility of the presenter to ensure...

  • A key point made through the comments in this feed is that the relevance of data to a recipient is dependent upon ensuring that they have a clear and appropriate understanding of the context in which the data was collected, is being presented, and is situated in comparison to the goals for which it was collected and analyzed. Within this feed there are...

  • After reading through all of the comments to date, while sometimes expressed through different examples, the key take-away is that data should not be shared or displayed in a way that compromises or imposes personal suffering or risk for the individual represented by the data. Some examples about what happens or can happen when this consideration is not...

  • Amanda Lester made a comment

    Thank you all for sharing what you have found. It is very interesting to see the wide range of development and availability of data privacy laws between locations. How might these variations impact the successful use of previous data uses that have been explored when data privacy laws/regulation is either very strict or very loosely defined?

  • Thanks for your question. Conversations in teacher preparation about the best way to measure dispositional change, and even whether it can truly be measured, are ongoing. There are many examples of approaches being used to evaluate and measure changes in teacher candidate dispositions that can be found online, as well as a wealth of scholarly articles on the topic....

  • The challenge you identify is one that has been expressed in many ways throughout the course by other participants. Sometimes external demands and time constraints can end up influencing or controlling the depth to which certain types of data can be analyzed. If possible, in addition to grades, adding other types of intentionally developed opportunities for...

  • Thanks for this comment. Acknowledging the existence and/or potential for unexpected results is important to ensuring that the use of the data stays true to the goals, objectives and metrics identified at the outset of the process.

  • Hi Kelvin, Thanks for raising this question.

    For this example, the key ways in which the Baldrige model was applied, as shown through the results cited in the PDF, are reflected in how they reported their results in comparison to similar, external benchmarks (ex. "In 2010, half of MCPS graduates received a college-ready score of 3 or higher on at least...

  • Hi Adam- thank you for raising this point. Successfully enacting new processes for improvement often relies upon the culture and context in which they occur, and informed and intentional leadership of such efforts is critical to set the stage for participants to feel empowered to participate.

    Does anyone have an example they can share about how they...

  • Thanks very much for this important comment. I agree that addressing the variation in the problem and then determining when and where identified "solutions" can be applied is often a skipped step, particularly in the rush to implement a new and exciting approach or practice that seems to take into account some of the most prevalent problems of practice. How...

  • Hi Rebecca- thanks very much for your comment. How do you imagine the outcome of your analysis might be affected if the analysis was done by a group of educators with similar goals for improvement? Do you think it would be more helpful to determine how decisions about using data to inform practice are made or do you think that reviewing your data in a group...

  • How could bringing different perspectives and interpretations of data into an analysis affect the overall quality and breadth of the decisions made about how to apply what was learned to improve practice?

  • Taking on this task alone is very challenging. Perhaps starting with a single or a few SMART goal(s) and then finding a colleague (even from a different educational discipline) who is familiar with the process to discuss your findings with might be of help to begin to determine how what you learn from your analysis can be applied for improvement or would...

  • Lynn- thank you very much for your comment. The issues you raise related to using standardized testing data to inform changes in instructional practice raise an interesting point about the selection of data points when developing SMART goals and objectives. While often, teachers are expected to use this data to improve student learning, within the context of...

  • How has a more lax implementation of the last two steps affected the usefulness of the data collected earlier in the cycle?

  • Thank you for your question. Analyzing and discussing results and using data for improvement occur as part of the broader implementation plan to move through each phase of the improvement cycle, with the ultimate intention of analyzing the data collected for the purpose of determining how to apply the results of that analysis to achieve the goals and...

  • As more comments have been added to this feed, the challenge of balancing professional time demands with the time intensive nature of some data collection and analysis processes continues to be a strong theme. How might this be better addressed during the planning stages of an assessment cycle?

  • Thanks, Valerie! The example you provided demonstrates how data collection through formative assessment can be designed to measure incremental progress in a way that allows for continuous improvement, in addition to more comprehensive or summative assessments meant to measure growth over longer periods of time. Combining larger and smaller loops of data...

  • Thanks so much, Kofi. I am appreciative of your feedback and very glad that the course is helpful in achieving your learning goals. :)

  • This is an excellent point and question. What types of data should drive decisions about improvement and for what reasons? While stakeholder feedback is critical to ensuring that those most affected by change decisions are most benefitted and not negatively impacted, what role does critical feedback play as a driver for change, particularly when stakeholder...

  • Thanks, Ricardo! :)

  • No apologies needed, Ricardo :) Welcome, and I am very happy that you have been able to access the course and threads! I also hope that all is better with your cell phone and WiFi dilemmas.

  • Two new questions/themes have emerged as I have read back through the comments added since my last general post on this topic--
    1. Could 'closing the loop' have different meanings or 'look' different based upon the expectations, contexts, and institutional beliefs that are tied into the purpose for selecting, collecting, and analyzing the data?
    2. Is it...

  • Thanks, Sean, for your insightful post that broadens the concept of data to include the more fluid interactions and sharing of knowledge that happens on a daily basis to influence more informal decision-making, that nonetheless, can sometimes have equal or greater influence on the outcomes of improvement efforts. Without leaving Romid's initial question, I...

  • Hi Gretchen, Thanks for this comment- it raises some important questions about how to best protect data privacy when dealing with small data sets.

  • Hi Romid, I agree that in some contexts, delayed implementation could affect the efficacy of decisions made about improvement strategies, particularly if there are significant internal and/or external policy shifts that later negate the path of improvement that is indicated by trends in the data. With that said, the improvement science approach attempts to...

  • Thanks for your question, Romid. What do others think? There have been comments in different threads about the possibility that there are data points within problems of practice that may not be able to be measured to inform changes in educational programs and policy. Is this an area that can be addressed or changed using data and what would that look like?

  • This is an interesting question, and it sits within the larger consideration of how other policy changes can continue to impact the content and evolution of data privacy laws.

  • Thanks very much for sharing this link!

  • Thanks very much for sharing this link!

  • Thanks very much for sharing this link!

  • Thanks very much for sharing this link!

  • Thanks very much for sharing this link!

  • Amanda Lester replied to [Learner left FutureLearn]

    Thank you for sharing these links in your post!

  • This is an excellent post about needs and concerns related to data safety and data source privacy. Similar issues/concerns came up in comments posted last week and will be explored in some of the next steps in this week's portion of the course.

  • Amanda Lester replied to [Learner left FutureLearn]

    This is a great example of the challenges that result when trying to analyze data from a new or yet-to-be-proven system or program.

  • Thanks, Priscilla! We're excited to learn that the course has already been helpful in your work! :)

  • Thank you very much for sharing a brief overview of the context in which you are working, your hypotheses and plans for investigations, and some of the challenges you have already identified as you undertake this work. I look forward to continuing to learn from and with you this week and to expanding the discussion as we explore more detail about collecting...

  • This is a tricky challenge- without consistency within the comparative context, it is difficult to identify reliable trends. Using data for improvement when policies driven by larger systems that influence your choice of data points are also working to improve their efficacy can cause unintended disequilibrium within the processes you are implementing.

  • What types of data could/should be collected and analyzed to provide insight into ways that educators could improve learning experiences to help students gain greater proficiency as innovative and creative problem-solvers and learners?

    How can data for improvement be focused toward the assessment of educational processes that help students to gain capacity...

  • Hi Athene, Thank you very much for your feedback, and I am very happy that you have learned new ideas through engaging in this course. What are you hoping to learn as the course progresses and what topics would help to make the course more meaningful to your purpose for engaging around this topic?

  • Myung and Kewar (2014) define a high-leverage problem of practice as: "...an issue that, if addressed, can disrupt status quo practices in an organization and render improvements throughout the system. This is a compelling problem area that, if solved, will propel the organization toward achieving its core mission."...

  • Excellent point about the often overlooked or dismissed value of qualitative data.

  • This is a very valuable point- data needs to be measured, represented and applied in ways that can lead to meaningful interpretation and application for action for all stakeholders.

  • Thanks, Gretchen. Your comment is a great example of how engaging students in the collection of data about their experiences as learners can not only help the instructor to improve their practice, but can also help the students to be self-reflective about their needs as learners. This connects very nicely to earlier comments made by participants at the...

  • These are some great examples for engaging stakeholders in ways that also acknowledge their needs and interests. The learning walk, for example, enables colleagues to learn from each other by investigating aspects of each other's practice that are aligned to interest or mutual benefit. Your note about the non-threatening nature of this type of communication is...

  • Your observation that employees at a Fortune 500 company may have different reasons (motivating factors) for staying in their position than an employee in the MCPS district may have raises an important question-- when are comparisons appropriate and what commonalities need to exist between programs in order for the comparative data points to be realistic and...

  • I agree with your comment about the importance of engaging stakeholders from a mutually beneficial vantage point. As noted in other comments in this feed, when stakeholders do not feel committed to the process, they may not provide the type of high-quality or consistent input necessary or may have a different vision which could then derail what was perceived...

  • Hi Bret- you raise a very interesting point. What happens to the idea of using data for improvement when stakeholders may have different visions or intentions about how and what data should be used, as well as what "improvement" means? How would that impact schools and universities seeking to review or redesign programs and curriculum, or to reform/renew local...

  • You raise excellent points that are central to using data for improvement. Ensuring validity and relevance in data collection and analysis is critical to avoid the influence of subjective analysis on decision-making. Deciding what should/can be measured, and then determining which data points are relevant and when, are critical aspects of effectively drawing...

  • This is an excellent point. Data privacy is a big issue/concern and a key point in policy discussions about developing educational data systems. I look forward to learning more about what you think about this issue as the course moves forward.

  • Thanks, Clinton. This is an important question and one that has been raised by a few other participants. As you read through other participants' comments, I hope you will jump into discussions about how data collection and analysis can be used to help students to learn more about their own academic behaviors in order to grow and become more effective as...

  • Thanks for raising this point, Daniela! Improvement science is a "learn by doing" process, and NIC participants often express having developed a deeper understanding of the problem not only in general, but also within the specific local context that they need to address, and also share that they have professionally benefitted in advancing their own thinking...

  • Thanks, Ruth- I agree that sharing examples does help to better link the theory to practice. The following link provides access to video from keynote speeches given at the 2017 Carnegie Summit on Improvement in Education, which includes some specific examples of these Principles of Improvement applied in practice:...

  • Agreed- when the climate for improvement is not open to learning from failure, it makes it challenging to move beyond a focus on measurement outcomes as indicators that guide action as discretely focused responses rather than opportunities for more meaningful improvements to be pursued.

  • Thanks so much, Pippa- I totally agree with your analysis and can also relate to what you describe in terms of the sidebar benefit that accepting error as a learning tool would offer to personal well-being. I would argue that these are important considerations when considering pathways to improvement, for all involved, because the process of improvement is...

  • Thanks, Hassan!

  • I agree-- in your experience though, what makes improvement processes more engaging and vision-oriented?

  • This is a critical point. Knowing what is 'right' to measure is a challenge, which is why leaving room for failure as a way of learning more about which aspects of the problem can actually serve as the greatest levers for change is a critical part of the improvement science process. This is particularly true when individuals within a NIC bring their own local...

  • Amanda Lester replied to [Learner left FutureLearn]

    This is a great point! Assumptions about the origin of shared goals can be a big challenge to successfully engaging a networked approach to improvement. While a challenge though, bringing these different views forward in a structured process can also help to work through the range of possible pathways to achieving a shared goal by highlighting common...

  • Totally agree! Please see my response to Pippa S's comment. We know that a lot can be learned about what to do by learning about what not to do, but this is often a step that is overlooked or avoided in the rush to identify and employ improved practice. As a result, many reforms and improvements are built on assumptions that aren't tested for failure until...

  • Your question about acceptance of failure is a good one. This is often the most difficult of the six principles for NIC participants to reconcile, because the external pressures faced regularly by practitioners are often so 'solutions-driven' that the need to learn from 'failure' is not readily accepted as part of the process of learning how to identify...

  • Thanks for also noting that factors outside of the data being analyzed can be an influential presence. The influence that such outside factors have on the validity of decisions made when developing plans for action or policy can be a sticking point when weighing the role of evidence. How can this challenge be addressed in practice? What can make this...

  • Jan- thank you very much for sharing more about your course and for your suggestions.

  • This is a great question, particularly given the many comments in the previous step about how time can influence authentic analysis of data in favor of meeting time-based goals. The six steps of the improvement science process might be helpful in considering how to coordinate feedback and analysis of what the data means.

  • This is a fantastic discussion thread! Below is a recap of some key points that jumped out as I read through the feed. Please also share your impressions of comments that resonated with you.
    • 'Closing the loop' can take on different meanings/implications based on the structure and culture of the organizations in which individuals work, as well as how those...

  • Thanks, Jay. This is an aspect that will be explored further through the discussion on improvement science and networked communities. Transferability in this sense refers to the ability to examine and analyze data from multiple sources and across a range of local settings/contexts to identify trends or theories of change that can be adopted in ways that allow...

  • Thank you to everyone for providing an example of a visual representation of an improvement process and for sharing your reasons for selecting that example. Many different types of models were shared, and the collection of your responses provides a valuable repository of approaches to using data to improve practice.

    The array of examples touch upon a...

  • Fishbone diagrams are often used by Networked Improvement Communities, especially during the early stages of initiating the improvement science process. The basic aspects of improvement science and the NIC process will be discussed in future steps during this week's phase of the course.

  • I agree that the term "act" can seem ambiguous in this model without further explanation of how and where action should occur at this step. If the diagram is used alone to describe a process of using data to inform improvement, what term might provide a clearer meaning in this context, or would amendments/additions to the model be a more approachable way to...

  • Hi Daniela, These are excellent questions! Others have expressed similar questions or touched on these topics through their early posts, and as the course progresses, I look forward to digging deeper into these questions as a learning community.

  • Hi Rose, thanks for commenting on the topic of data analysis and error. The segment on improvement science will examine your point more deeply by delving further into the role that error plays in addressing, implementing, and communicating solutions related to problems of practice.

  • Thanks for raising this point, Margaret. The topic of data collection and identifying critical data has also been raised through other comments in this feed. How can someone determine whether or not they are collecting the right data, and what might cause an error in selection?

  • Thank you for reintroducing yourself and for connecting with others who are new to the course!

  • Thanks, Fatima! Glad it was of help.

  • Thanks, Judith and Fatima- I agree as well. Also, when there is a profusion of data, being able to focus on and identify critical levers for change becomes murky. The material about Improvement Science will address that dilemma further.

  • Welcome and thank you for taking a moment to share a bit about yourself and connect with your fellow learners! I am Amanda Lester from AACTE in Washington, DC, USA, and I will be serving as your mentor throughout the next three weeks of this course.
    As I read your posts, I was excited to see the range of participants from around the world that will be...

  • This is a fantastic question. It will be interesting to learn if others in the course work in schools/contexts that use data in the way you describe, or if they themselves have experience using data to identify students who are at-risk.

  • Thank you very much for joining the course, as well as for your posts and sharing your first thoughts about how data could be used to inform and communicate improvements for the future.

    Many of you noted that data analysis is critical for determining how to proceed with a decision or process, and that data collection can be a source for identifying trends...

  • This is a very interesting discussion point. What role does prediction play in analyzing data and developing hypotheses as a frame for identifying possible trends or critical data points?

  • What do you see as possible challenges to identifying components that are critical to measure in order to advance improvements, and what are some ways of identifying critical levers for change that you have found to be successful?

  • This is an excellent question and is key to understanding the implications of context on data collection.

  • The transferability of data across contexts is a critical consideration when using data to inform changes in practice, or even more importantly, changes in how programs and systems are designed in response to data. This is a topic that will be explored in more depth later in the course.

  • What do you see as a possible implication of collecting too much data?