Friday, June 7, 2024

Season 3, Episode 11:

The Human Element: The Evolution of Public Health Research and Evaluation

 In this episode, we’re talking with Dr. Mary Davis about the importance of the human element in the evolution of public health research and evaluation.

Season 3, Episode 11: The Human Element: The Evolution of Public Health Research and Evaluation

by Dr. Joyee Washington and Dr. Mary Davis

Introduction

Welcome to the Public Health Joy Podcast — the safe space for real and honest conversations about what it takes to transform public health research into life-changing solutions for our communities. 

I’m your host, Dr. Joyee, a public health researcher, PhD survivor, and entrepreneur. In today’s episode, we’re talking with Dr. Mary Davis about the importance of the human element in the evolution of public health research and evaluation.

This is the joy ride you’ve been waiting for. Join us as we revolutionize public health through research done … with … for … and BY our communities. Together, let’s create our Public Health Joy!

Notes

Today on The Public Health Joy Podcast, we speak to Dr. Mary Davis, Lead Evaluation Specialist at Emory Centers for Public Health Training and Technical Assistance, about the importance of the human element in the evolution of public health research and evaluation.

Dr. Davis has extensive experience with all aspects of public health applied research and evaluation and has provided evaluation capacity-building assistance to a variety of organizations. In this episode, she joins us to reflect on how the mindset and language in applied research, evaluation, and technical assistance training have changed since she joined the field and why people are no longer viewed simply as data points.

Tuning in, you’ll learn why the “human element” is as valid as quantitative data, why relationships are an essential part of research, and how to demonstrate the importance of these things to relevant decision-makers, particularly when it comes to funding. We also touch on the critical role of diversity, Dr. Davis’ advice for the next generation of researchers and evaluators, and much more!

To connect with Dr. Mary Davis:

Dr. Mary Davis on LinkedIn

Dr. Mary Davis Email

Links mentioned in this episode: 

Emory Centers for Public Health Training and Technical Assistance

WE Public Health

Colorado Collaboratory on Equitable Evaluation

For more information on transforming public health research into positive community impact, visit https://joyeewashington.com

Key Points

  • The evolution Dr. Davis has witnessed in public health research and evaluation. [02:31]
  • Reasons that the human element is just as important as quantitative data. [04:29]
  • The critical role that relationships play in public health research. [08:42]
  • Keys to ensuring that funding is equitable and beneficial for communities. [11:17]
  • How to determine whether an evaluation is relevant to the end user. [15:41]
  • Creative and innovative ways to demonstrate the importance of relationship building to decision-makers. [18:55]
  • Training the next generation of researchers and evaluators through an equity lens. [21:40]
  • Why diversity is important for public health faculty and students, research groups, and the communities they serve. [26:18]
  • Dr. Davis’ advice for emerging evaluators looking to make a meaningful impact. [32:35]
  • How mentoring the next generation of public health professionals brings her joy. [35:08]

How to Rate and Leave a Review:

If you enjoyed this episode, please subscribe, rate, and leave a review! Just follow the steps below:

To rate and leave a review on Apple Podcasts:

  1. Visit the show page at https://podcasts.apple.com/us/podcast/public-health-joy/id1646078479
  2. Hit the “+” sign in the top right-hand corner to follow the show.
  3. Scroll down to the “Ratings & Reviews” section.
  4. Tap the 5th star to rate.
  5. Hit “write a review” to share your thoughts and click send to submit!

To rate and leave a review on Spotify:

  1. Visit the show page at https://open.spotify.com/show/6ghSN8rXDMefH69AyxFe3e?si=0weLrbwWQNCDSWdq50bOJw
  2. Press the “follow” button right under the image and click the bell icon.
  3. Tap the rating immediately underneath the podcast’s description or tap the More (…) icon, then choose Rate show.
  4. Drag your finger or tap on one of the five stars to leave a rating.
  5. After you listen to an episode, answer the Q&A: What did you think about this episode?

TRANSCRIPT:

[INTERVIEW]

[0:00:56] JW: Welcome to another episode of The Public Health Joy Podcast. Today, we have with us Dr. Mary Davis, who has extensive experience with all aspects of public health applied research and evaluation and has provided evaluation capacity-building assistance to a variety of organizations. Her expertise includes community health improvement processes, integrated behavioral health, health department accreditation, public health preparedness, quality improvement, and partnerships. Dr. Davis and I actually know each other because we are both consultants with WE Public Health, which is a public health collective consisting of 35 consultants and growing across the country who are talented evaluators, trainers, strategists, thought partners, facilitators, researchers, community advocates, and creatives. You name it. We’ve got it, for sure.

 

So, at WE Public Health, we offer a wide range of quality consulting services, such as technical assistance, training, facilitation, project design and management, evaluation, and all the things. But most importantly, we share a common vision for justice and equity. We actually partner with our clients to walk hand in hand, bringing people-first, compassion-centered, and evidence-based approaches designed to amplify public health work. You can learn more about WE Public Health at wepublichealth.com. But we’re going to go ahead and jump into it. Dr. Mary Davis, please tell the people who you are, what you do, and what you’ve got going on.

 

[0:02:40] MD: Thanks so much, Joyee. It’s a pleasure to be with you today and to be part of this podcast. There are so many ways I could introduce myself, based on my experience. One of the things I thought about as I was preparing for this is the 30 years I’ve had in the public health field, moving in this journey from being a public health student, where I learned how to study health behavior and where we studied people as things to be improved, to now working with and for communities to achieve their goals. While the journey is not complete, there’s been a lot of progress in the mindset and language of how we do applied research, evaluation, and technical assistance training in the last 10 years.

 

So, I’ve seen this big evolution in the field, and part of that has been through the benefit of working with a variety of organizations: federal, state, and local agencies, non-profit organizations, and foundations. I don’t want to – well, the phrase, been there, done that, kind of comes to mind. But I have seen a lot, and I’ve seen a lot of evolution in my time working in public health.

 

[0:03:55] JW: You bring up something very interesting, and that is, we have traditionally looked at people as things. I think just that language and kind of having that realization. I was having a conversation with someone, I believe it was a student. Well, a recent graduate. I said, we have often looked at people as data points. They are the dots on the graph, right? What we don’t see or don’t hear are those people’s stories and their lived experiences. Part of my question, especially in just my growth over my professional career as a public health researcher, is just thinking about how did this evolution occur? How did we get to this point and where did we start? What does that change look like over time, and figuring out where are we going? What does it look like for other evaluators and researchers as they start to come up, and how will this change how we do things overall in public health?

 

[0:05:01] MD: I think that’s a great question, and I think part of the genesis of this is that when I started as a public health student, qualitative data was something you didn’t touch. Like, “Oh, that’s that touchy-feely stuff.” It’s not hard evidence. It’s not sound enough, robust enough. Then came the recognition that data points, quantitative data points, really don’t provide the richness that we need. I see quantitative data as the who, and the what, and the when, and the qualitative data as the how and why. When we started to say, “Oh, we need to understand the how and the why.” Okay, qualitative methods, they’re robust, they’re rigorous. Let’s bring them in. It was still data points. People as data points.

 

What I’ve seen recently is not only just stories of change, but stories as really providing a human element. Part of it is that some funders, particularly foundations, want that human element, and people in communities want that human element too. They’re like, “Hey, wait a minute. Yes, we need strong data. But talk to me as if I’m a person. You talk to me – I am a human being and you need to recognize that.” To do that, we need real stories, real understanding of the how and the why, of whether a program works or not, or what people’s needs are, in their own communities.

 

Really, our job, as I see it, is: are we in service to the data points in our careers? Are we in service to people in communities? And how are our skills best used? Is it to further our careers, because we’ve got the best methods and we can publish so many papers? Or is it to say my skills are in service to others and their communities, to address their community-identified aspirations? That shift, it’s a lot of shifting in 30 years, and in the last 10 years, it’s not only the how and the why, but who is this in service to?

 

[0:07:14] JW: That’s such an important question for all of us as researchers and professionals and evaluators to consistently ask ourselves: who are we in service to? I was giving a presentation recently and I asked the audience, what do you think of when you hear the word research? What typically comes to mind? Of course, you hear people say, survey, data, data point, graphs, interviews, focus groups. Those are the types of things that come up. So, I said, those are all true. But I said, if you don’t take anything else from this talk that I’m going to give to you today on community engagement or research, I want you to remember that the primary word we need to start associating with research is relationship. That is the word that needs to start coming up and being associated with research.

 

Because like you said, that human element is what’s going to take us to the solution. Research itself is a method. It’s a tool. It is a process. It is only going to take us so far. But when you take research, or evaluation, or assessment, or whatever the thing is that you’re doing, and you start putting it in the context of building healthy relationships, that’s going to be the part that’s going to take us to the moon. We can go beyond our imagination and beyond any possibilities that we can conceive when we start thinking about research and evaluation in terms of our relationships and looking at people as people. Our community members have a whole life outside of the survey, outside of this focus on –

 

[0:09:05] MD: Actually, I’d hope so. It’s also in those relationships where we really understand what’s important and relevant and meaningful to the people in that community. Because we can come in and think about research questions. If we don’t engage community members, we can go out and do our survey, and our research, and find our findings. But if it doesn’t help them with an issue that’s important to them, then we’re wasting resources and people’s time. One of the things that I’ve focused on is how relevant these evaluation questions or research questions are to the people in a program, in a community, at a foundation. Are we asking the right questions? And are we asking them in the right way, in language that’s understandable to community members?

 

If we’re not, then yes, maybe we can get it published. But if it’s not relevant, again, you’re wasting resources and not building that relationship. So, you’ll have challenges with it, again.

 

[0:10:12] JW: In the past, and historically, we have looked at relevance as how is it relevant to the researcher, or that professional, or that evaluator? Is it relevant to the grant deliverables? Is it relevant to what’s being asked by the funder? But we have to start thinking about, like you said, how is this relevant to the people who are being directly impacted by this work? When we’re able to shift that perspective, then we can start moving in a direction that’s positive, and in a direction that engages the community in the process, right? That’s through every step of the process, not just when they get the survey, or when they come to the focus group. But from the very point of even starting with how do we determine funding requirements? How do we make sure that our funding is equitable, so that down the line, as each decision is made, anything that’s determined is equitable for the community as well?

 

So, from your experience, what is it like to work with funders to examine the effectiveness of their funding approaches and improve that grant-making process? Because that’s a huge part of evaluation and research and impacting communities.

 

[0:11:32] MD: Precisely. I have had the opportunity to work with funders on their portfolios to look at, is this portfolio of funding doing what it’s intended to do? Is it going to really meet the goals of that actual portfolio? For example, I looked at funding around community benefit. Hospitals have to do a community health needs assessment, work with their partners, work with their community. Were the approaches that foundation was taking encouraging hospitals to do more than just pay lip service to the community benefit requirements of the IRS? Were they actually engaging with communities to make meaningful community change?

 

So, that’s one example of that work. As part of my life in Colorado, I’ve had the opportunity to participate in the Colorado Equitable Evaluation Collaboratory. The Colorado Health Foundation and the Colorado Trust have co-founded this, and I’ve been part of this for a good four or five years. We meet as evaluators. We’ve met a couple of times in person when we can. We’ve had some virtual gatherings as well, and they’ve brought in speakers to help us think about how do we even look at data? When we look at data points, we see them a certain way. We see the metrics a certain way. But community members may say, “Well, no, I see it differently.” So, we even get down to the basics of how do we even look at data? We have very rich conversations about what does it mean to do evaluation and research from an equity framework?

 

Part of these conversations with funders is around, when you look at funding, and when you look at data, and when you look at evaluation, from whose perspective are you looking at this? Is it the foundation’s? Is it the funder’s perspective? Or is it the community’s perspective or the evaluator’s perspective? So, it’s getting to some of those very, very basic questions about whose perspective are you looking at? Other pieces of this conversation are really around the requirements that foundations have for community organizations to report back to foundations on how effective a particular program is, and how do you define effectiveness? What is the burden of data collection on the organization and the community? And what counts as an effective measurement?

 

So, we’ve really gotten to some of those very, very basic questions about what are we even doing? What’s the perspective? What are our metrics for success? And how much burden needs to be placed on communities and organizations to report and demonstrate that effectiveness?

 

[0:14:26] JW: I think another point that is important to bring up is, there’s a difference – a lot of times we talk about evaluation and research in the same bubble. For those of us who have been in the profession a while, of course, we know the difference between evaluation and research, but everybody might not know. So, when it comes to evaluation, we’re looking at that data to make specific improvements, right? Looking at the quality of something. When we’re talking about the funders, when we’re talking about different programs, we are looking for that data to change something, to make an improvement to what’s already there, to see what is lacking, to see what we need to be doing differently. When we start thinking about that perspective, it’s like, well, how are you going to make an improvement to something that’s supposed to benefit the community, but you’re not talking about it with the community, or you’re not looking at it from the community’s perspective?

 

[0:15:27] MD: Exactly. I agree with you, 100%. You need to, in our practice, my practice, really engage the end user from the beginning. Again, it goes back to that relevance piece. We talked about whether research is relevant to a community. You also need to look at whether an evaluation is relevant to the end user, and that end user can change quite a bit. I’ve had end users ranging from community members all the way to funders. The funder can be the end user of the evaluation, and what’s really appropriate and meaningful and relevant varies so much, depending on who that end user is, but you do have to engage them from the beginning and throughout the process.

 

In my career, when I’ve done internal evaluation for programs, I’ve had colleagues who are like, “Oh, you’re the evaluator. You just go do that and give us the report and we’ll be done.” I’m like, “Well, yes. Okay, we can do that.” And, yes, certain federal agencies say, “Just give us the number counts. That’s all we want.” But I really emphasize, I don’t want to be in this for me. This is not for me or for the federal agency. Even if we have to do those counts and those requirements, what else can we be doing that can help you do this better and help us really understand what’s effective, and what’s not effective? Are we spending our resources wisely?

 

So, I like to engage people throughout the entire process so that when it comes time to look at results, look at the data, something’s going to be done with it. If they’re engaged throughout the entire process, they’re going to do something with that information and move the needle somewhere on to the next step. One of the things I’ve done a lot recently is really look at how we report our data. And a lot of it is, if I don’t have to do what we call that 50-page report that just sits on the shelf, if we don’t even have to go there, that’s even better. Tell me, is it a presentation? Is it a conversation? Is it a community meeting? How do we want to talk about these results together so that you’re going to do something with them? That’s where the change and improvement and progress can really happen.

 

[0:17:49] JW: We can be innovative. I think that’s the thing. When it comes to evaluation and research, we can be innovative about how we do it. It doesn’t have to always look like a fancy report, or a journal article, or a conference presentation. We have to start getting outside of the box and thinking about what’s going to make sense for our communities. Our communities might not be at the conferences. They’re not going to be reading the journal articles. They’re not going to be reading those fancy reports that you submit to your funder.

 

So, we have to start thinking about what are some of the ways we can convey the message to our community members? And what does that look like in terms of solutions and impact? Because it’s hard to measure relationships. Funders don’t really have requirements like, you need to strengthen relationships in these areas, and we need to see the percentage of growth or increase in your relationships. That’s not something that we see. I’m not saying it’s not out there. I’m just saying it’s not something that we traditionally see.

 

We need to start thinking about what are some of those innovative and creative ways that we can start showcasing the importance of relationship building throughout this process, and what are some ways that we can measure or show the impact of what these relationships look like. I think a big part of that is from the stories and the lived experiences, and it’s like, how do we – and I struggle with this myself as a researcher and evaluator, and I’ve seen people do it different ways. But how do we get this message across to the people who are the decision-makers?

 

[0:19:34] MD: That’s, again, where you can use that variety of reporting methods, and we’ve actually been doing workshops around this with evaluators, thinking through that engagement. Are you just giving that one presentation of results? Or are you working with your clients to say, “Okay. Who are your various audiences?” We even talk about doing a communications matrix where you define your audiences and what kinds of communication methods for receiving results are going to be really effective for your audiences. You’ve got to tailor and segment your reporting to different audiences and prioritize, and maybe work with them to get out in the community, or to talk with decision-makers about, this is what we’re seeing. Where are we going to take these results?

 

Michael Quinn Patton talks about planning to work with your clients for three months after you’ve finished with your results. Work with them for three months to think about, where do we take these? Who needs to hear about this? What is the best way to convey this message to those audiences? There are lots of techniques you can use to really push the results out there. In evaluation practice, it’s really critical to make sure the results are used and they don’t just sit on the shelf.

 

[0:21:07] JW: We’re going to have to start thinking about how do we train the next generation to think with that mindset from the beginning, because I think a lot of us now who are older, I’ll say, older in the public health game, we were talking about research and evaluation from a very black-and-white perspective. There are no real gray areas. Your p-value has to be this particular number to be significant, and there’s really nothing outside of that. It was very straight-lined. This is how you do it.

 

So, when we start thinking about how do we train up the next generation of researchers and evaluators to think more creatively and innovatively about how we approach research and evaluation with an equity lens? With an equity lens, being focused on how do we prioritize our community’s needs, and wants, and strengths, and capacities, and all the things? It’s like, what does that look like? How do we start making that shift and that change for the next generation?

 

[0:22:17] MD: I think that’s a really critical question, and it really depends on who the students are in the room. Because one of the things that I’m hoping for is about who the students are and what they look like. They don’t look like me. I mean, I’ll just put it out there. They don’t look like a white woman from a nice background. They need to be really coming from community, and the creativity, and the innovation, and the push for change can come from having different kinds of students in the classroom, and giving different kinds of students the opportunity to enter those pathways into public health.

 

It’s also having students from different backgrounds. I was a psychology major, and I did health behavior, and then behavioral sciences and health promotion. We need more students from communications backgrounds, marketing backgrounds, and backgrounds that embrace technology.

 

I was just, this morning, in my inbox, and there was something about using artificial intelligence to help you fill out an RFP. So, we need to be ready with our abilities to use all the latest and greatest technology, and that will help with the innovation in reporting. I see a lot of innovation in reporting using different methods; things like slide decks and podcasts are a way of reporting results. So, I think a lot of that innovation is there, and it’s allowing students and professionals the opportunity to use it. If funders can allow for different ways of reporting, that will help drive that innovation.

 

[0:24:04] JW: To that point, and talking about the diversity of the students and understanding the diversity of their backgrounds as well, we’ve got to have diversity in faculty. Because I know, in my experience, I could probably count on one hand how many professors I’ve had who have looked like me.

 

It truly makes a difference when you are able to see people who look like you, people who come from your community, who understand your community, and being able to hear how are they making a change? How are they making the shift? How are they coming up with these creative and innovative solutions and what that looks like? Because I will say my perspective on research and evaluation, it definitely changed, because I came from the hard sciences. So, I was in biomedical sciences, physiology and biophysics. Nothing but white males, for the most part. A couple of white females, a couple, but mostly white males and that was a somewhat traumatic experience for me.

 

I have another whole podcast episode on it. Check it out. But it’s so different when I switched over to public health. I still didn’t have as many Black faculty members, but the faculty were more diverse and I was able to engage with the community a little bit more, and those diverse faculty members could share their stories and their experiences. Then, I had another experience where I was selected to participate in what’s called the Community Health Leadership Program through the Satcher Health Leadership Institute at the Morehouse School of Medicine. When I went to that program, and I walked the halls of Morehouse School of Medicine with my cohort, my group, that was the first time I had really spent time around Black public health professionals. Walking down the halls and seeing these pictures of Black people lining the hall, I was like, “I didn’t know. I had not been exposed to that.”

 

So, being able to have the diverse faculty and the diverse backgrounds truly makes a difference in how we’re training up the next generation of researchers and evaluators because it gives us hope. It gives us hope. It gives us a light. It shows us that we can be in these positions and we can be in a place to protect our communities.

 

[0:26:46] MD: That’s exactly true. I go back to when I was at Hopkins as a student, and Thomas LaVeist was a faculty member at Hopkins in the Department of Health Policy. He’s a Black man, and he would talk about being in the elevators at Hopkins and people kind of moving to one side, away from him, even though the man dressed fine. The man was well-dressed all the time. He looked very impeccable. He is now the dean of the School of Public Health at Tulane. But he had that experience in the nineties in Baltimore, where, to be quite honest, it is a walled fortress. The last time I was there, I haven’t been there in a while. But the last time I was there, it felt like you were being buzzed into a fortress.

 

So, it was a very difficult situation. I’ll also talk about an experience I’ve had as an evaluator in Colorado, working with refugee communities in the eastern part of the state where there are a lot of meatpacking plants. I worked with faculty at Colorado State University, and I wanted to go into the community just to do what they call a windshield tour, so at least I could understand the place. The faculty member I was working with in the Department of Ethnic Studies, Dr. Eric Ishiwata, was very clear with me. He said, “I don’t think it’s appropriate for you to sit in on many of the sessions with our community members.” Even though I speak Spanish and one of the communities is Latino, and the other is ethnic Somali. He said, “You might sit in once or twice, but I don’t think it’s appropriate for you to be the one collecting the data, and really talking with the community members.”

 

At first, that was really hard for me to hear. I’ll just be honest about that. But he had a really important point. And we found different ways to really collect data and understand what was going on in the community. But I had to realize, maybe I am not the right face to be in the community. You know what, that’s okay. I had been there. I’ve collected enough data. I don’t need to be that person. But people who look like me need to be open to that. There’s this expression that I’ve heard: lead, follow, or get out of the way. And we need to understand that we have led for many years, and now it is time for us to follow or get out of the way. It’s okay. We have skills that can be used in other ways.

 

[0:29:19] JW: That means that we have to leave some room for some tough conversations.

 

[0:29:23] MD: Exactly.

 

[0:29:25] JW: That’s what we often shy away from. We don’t want to face those hard truths, and that’s natural. I think that’s just a natural human thing. We don’t want to feel the hurt. We don’t want to feel uncomfortable. But when we start having those hard conversations and when we start getting uncomfortable, that’s where the growth happens. We have to start learning that there’s something beyond that initial uncomfortableness, and we have to start leaning into that unsettling feeling.

 

[0:30:00] MD: Exactly. I agree with you 100%.

 

[0:30:04] JW: Yes, and when I have a person who does not look like me, someone who’s white or whatever, sometimes I have to be the person to make that space uncomfortable and feel comfortable in making the space uncomfortable. But then I also have to say, regardless of how this person responds, I still have a duty. I still have a duty to let this person know that there’s a different way of doing things. We need to make sure that the community is protected. If that means that you’re not in the room, or you’re playing this particular role, or this position, or whatever, then that’s what we do. That other person also has to be willing to hold that safe space for me to be able to say that.

 

[0:30:52] MD: I think that, again, comes back to knowing your audience and the relationships we’ve been talking about. It’s about having the trust and the space. One project I worked on was in the border region of Texas. You always get into the contract a little late, and there are always deadlines, and there are always things to hurry up and do. But we didn’t establish the relationship enough. Our partner said, “This feels very transactional. This feels like, okay, you got to get this done. We got to get this done. But we’re missing the space in between. We’re missing the relationship for you to really understand what we’re trying to achieve.”

 

The lesson was to establish the relationship first, to be in relationship, not in transaction. Again, it goes back to having that relationship, where it’s meaningful with whomever your partner is, whether it’s a funder, a community, a colleague, a student. Having a relationship allows you to say, can I go there with that person? Can I go there to say, “You know what, this isn’t working the way we needed it to. Or we need to kind of pivot.” So, I agree with you. Establishing relationship. Being in relationship with people is really key to making sure that we get to the right place, not just that numbers-and-research-publication place.

 

[0:32:21] JW: I think that’s a good transition for us. I do have to get to my signature question. But I want to ask one other question, which is, we talked a little bit about the next generation, and mentoring new evaluators. So, if you had to give like one piece of advice, like your top piece of advice, what would you offer to emerging evaluators in the public health field looking to make a meaningful impact in their communities, or in the field of public health research and evaluation? What would be the one thing that you would tell them?

 

[0:32:54] MD: I would say, Green and Kreuter were right. Green and Kreuter developed the PRECEDE-PROCEED model back in the seventies. When I was a public health student in the late eighties, early nineties, they said, “You got to start with your community’s quality of life.” We were like, “We don’t get that. We’re here to talk about health.” Maybe their health issue isn’t a quality of life issue, or vice versa. But they were right, because for 20 years, we kind of ignored that or paid it lip service, and now we talk about social determinants of health.

 

For many communities we work with, quality of life is about social determinants of health. Really working on those root causes for communities, the quality of life issues, that is what is critical to them. So, really understanding what a community needs, what an organization needs, what a funder needs; ultimately, it may not be about public health. It may be about something else. You need to really understand what that’s about, and how you can work on that first. If it’s related to health, great. It may not initially be related to health, but at the end of the day, it may help improve their overall quality of life and well-being.

 

[0:34:12] JW: That speaks to the idea that public health isn’t everything. We say that all the time, right? Public health isn’t everything. When we start thinking about what are those factors that bring quality of life to our communities? In my mind, I think about what can I do? What are the decisions that I’m going to make that are going to bring my community joy? I want my community to have a quality of life that is full of joy, and whatever joy means to them, not what it means to me, but what it means to them. So, we have to start thinking about what does that mean? What does that quality of life mean for our communities? How does it show up in how we look at research and evaluation as a process? And what does that mean to us as researchers and evaluators and public health professionals? What does bringing joy to the field of public health mean? So, in that, I get to ask you the question. What brings you joy in your work?

 

[0:35:12] MD: At this point, it’s really focusing on training and mentoring that next generation of evaluators. I kind of talked about how I feel like I’ve been there, done that. I’ve done the evaluation work. I’ve done the research work. It’s time to make sure that the next generation has the skills, the capacity, the confidence, to focus on communities, what communities want, and help them understand how to use their data and analytic skills in service to others.

 

[0:35:43] JW: I love that. Also, if people want to get in touch with you, if they want to contact you, for more information or anything like that, how do they connect?

 

[0:35:54] MD: They can connect through my email. It’s mary.v.davis@emory.edu. That’s where most of my time goes right now, with Emory Centers for Public Health Training and Technical Assistance, as well as consulting with WE Public Health.

 

[0:36:13] JW: All right. Well, I thank you so much for an amazing conversation. This whole conversation just brings me joy. So, I could talk about this forever. But this is going to wrap up another episode of The Public Health Joy Podcast.

 

[OUTRO]

 

[0:36:31] JW: I am so grateful for this time we got to spend together. If you enjoyed this episode, I need you to subscribe, rate, and leave a review. For more information on transforming public health research into positive community impact, visit www.joyeewashington.com. This is where research meets relationship. I’ll see you next time on The Public Health Joy Podcast.

 

 

[END]

 © 2024 Joyee Washington Consulting, LLC. All Rights Reserved.


Contact Us

publichealthjoy@joyeewashington.com