Computer generated transcript
Warning!
The following transcript was generated automatically from the content and has not been checked or corrected manually.
So I'm going to share my screen, and hopefully we can have an interactive session as we teach. Can I ask how much time I have? The session is scheduled to end by 8:30, so we have till 8:30. If there are questions, they may let me go beyond the time. OK, so I will not go beyond 8:30. One of the things that I think is important is keeping to time. I think it should be something that we all practice as we live our lives. So I'm standing in for my very good friend Ibrahima, who was supposed to take this class. Ibrahima and I work together on Stats Clinic. Stats Clinic is an online teaching session for young researchers, public health researchers. We teach scientific research methods, epidemiology, biostatistics, which includes systematic reviews and meta-analysis. So his name is here because that's who was supposed to take this class. A little more about me: I graduated from the University of Lagos in 2008. Some people find that hard to believe. It makes me want to dye my beard salt and pepper so that people would believe I am as old as I say I am when I meet them. In terms of what outcomes research means: people have asked me, even colleagues, what is an outcomes researcher? And I'll tell you why I'm telling you this first, and then I'll go ahead. I believe I'm speaking to medical students and people in the allied professions. When you leave medicine, if you do not want to practice, it's good if you have an idea of what's available out there. I did not start doing the work I'm doing until three, maybe four years ago, and I've been a doctor since 2008. And at that time, I did not even know that this kind of role existed.
An outcomes researcher is very similar to a medical researcher, a clinical researcher, but an outcomes researcher looks at the different ways a drug would benefit people. A good example: we know that there are different NSAIDs on the market, ibuprofen and so on. But if you ask people, especially women who suffer from dysmenorrhea, for example, what NSAID do you take for your dysmenorrhea, you would see that some of them favor certain NSAIDs over the others. That is value. So all of them are NSAIDs, all of them are approved as pain medication, but what is the value of one over the other? And you can start looking at value from different areas. You can look at it from the patient's voice, the patient's perspective. The patient says, I like this medication, but my stomach hurts. That's negative value. I like this medication because it's quick to act. That's positive value. I like this medication because I take only one pill every eight hours as opposed to two pills every six hours. That's positive value. And so outcomes research looks at all of the things that you can attribute in terms of value to indications, and you can say, this is why we should support the importing of diclofenac over ibuprofen, and so on. OK, so that's a little introduction about me, and I think you're getting a sense of the style of the way that I teach and pass information across to people. Another thing is, when I teach, I ask people questions. So I'm going to volunteer... I believe she is the one who read the recitation; we've never met, but I see your name there. I'm going to volunteer Tala Tojo. I have a friend whose name is Tojo. I'm going to call a couple of names. You will not be on the hot seat, but you will be my friends in this session.
And if there are people who want to volunteer to deliver this lecture with me, please signify to the person who's handling the chat and they will unmute you, because I'm going to be asking you questions. What you know and what you are able to express is what you are going to be able to build upon for further knowledge. OK, so let's get right into it. Where is my wristwatch? Shawn, let me volunteer you also to give me a hint every 20 minutes. Just let me know, "We are at 20 minutes," and that will help me know how well we're doing. Is that OK with you, Shawn? "Yes, sir." OK, sweet. So let's go into it. The content: we're going to learn about meta-analysis and systematic review. We're going to learn the nine steps in the meta-analysis. And in addition to that academic aspect, we're also going to talk about the philosophical aspect. Shawn, do you know that science emerged from philosophy? You're supposed to be unmuted, because like I said... and Tenny, you should unmute yourself too, because this is a conversation between all three of us. Did you know that science emerged from philosophy? "Yes, sir." Oh, great. OK, sweet. And so when you talk about science, whether it's clinical research or anything else, irrespective of what you talk about, you must have the philosophy, the thinking, behind what you are doing. Really, you must be ready to defend the thinking behind what you are doing. You are not a zombie, as Fela said. OK, so let's jump right into the rationale. Why do we conduct systematic reviews, or why do we conduct reviews at all? Look at this third point here: we want to publish highly cited research papers. When Dr. Ibrahima and I were developing this slide deck, I argued for putting this third point here, and he thought it was very practical, and so we put it as the third point.
So just keep your mind on this. People want to conduct reviews because they believe it's a quick way to get a voice and get seen in the research world. And that is fine, because there is a publish-or-perish kind of thing if you want to show yourself as a researcher. But let's talk about the first two bullets; these are the academic reasons, or the practice reasons, why we do reviews. You want to look at what evidence is available. So say maybe 20 studies have been done to answer a research question. You want to look at all those 20 studies and you want to have a summary or a synthesis. What are these 20 studies telling us? What are the patterns we are seeing? And how do we capture the summary, or the totality, of what these studies are saying? So let me tell you something a little bit philosophical. One of the ways you arrive at truth is through reliability, and reliability means that if you do a thing again and again, you get the same result. And so if 20 different scientists across the world do the same study, you have to think of those studies as the same study being repeated 20 times. They are not 20 different studies; they are the same study, just repeated 20 times. And so when you do a systematic review, you are looking at whether those studies meet the truth criterion of reliability. And so you say, let us look at the results of these studies; they would point us in the direction of our answer. And if they do, then you have more certainty, more confidence, that you are arriving at truth. The second point is that you want to identify evidence gaps. So of course, when you look at those 20 studies, or the 40 studies that have been conducted, you say, oh, so-and-so did XYZ, but they did not do ABC.
And so you can create a research agenda and say, OK, for the next five years, we're going to focus on plugging the gaps in ABC. So those are reviews in general. Do you know, Shawn, what's the difference between a review and a systematic review? "I think the narrative review just takes into consideration several research papers, while a systematic review takes randomized controlled trials and analyses the results of those trials to arrive at a result, as opposed to a narrative review, which is more like a summary of several..." So, Nila, do you want to add to that, or do you disagree with that? Do we have any volunteers? "I think, as well, narrative reviews are generally less formal and they use less detailed methods to analyze the studies." OK, so let me build on what both of you have said. A review can be very detailed. There's an article on breast cancer that was written by some doctor and published in JAMA. It's such a beautiful review. I would tell my friends, if you want to get a paper published, take this article, whatever topic you have, and write it word for word as in this article, just changing the words to your own topic. Of course, you have to do your own research; I mean the structure of that article, because of how comprehensive it is. But what a narrative review does is tell you a story. It's narrative. It's like: breast cancer is caused by this; there are so many people with breast cancer; these are the types of breast cancer; and so on. It just narrates. It tells you, this is the treatment for breast cancer. It gives you a narrative of what's going on. Whereas a systematic review answers a research question.
For example: how long do patients with stage three breast cancer live, despite the availability of innovative medicines or targeted therapies on the market? That is a research question, and a systematic review answers that. So whereas a narrative review just talks about a topic, a systematic review answers a question. And systematic reviews do not focus only on RCTs; they can focus on observational studies as well. And like you see here, it says that systematic reviews can include meta-analyses, which derive pooled estimates. Another thing a systematic review does is tell you about the quality of the evidence. So earlier we talked about looking at the different studies that have been done, and thinking of these studies as repeats of themselves. But these studies could be done with poorer quality or higher quality, depending on the capability of the researchers and the context. And so you don't want to read the results of a high-quality study as telling you the same thing as a low-quality study. So a systematic review, number one, answers a research question; it summarizes or synthesizes evidence across the different studies that are available; and it uses scientific methodology. We're going to get to that; you're going to see how the scientific methodology is captured. And because it pools results, you can apply a meta-analysis to it. So at this point, I'm going to pause and see if there are any questions in the chat. OK, I see that someone has asked people to volunteer and raise their hands. If you have any questions, let me know. Let's move to the next slide. OK, so these are formal definitions: "a comprehensive appraisal". You guys are medical students, so you're expected to understand, or at least know, definitions. But hopefully you're going to graduate.
And then you're going to go into the real world, and hopefully you will move from an academic perspective to a philosophical perspective and a practice perspective. But let's talk about the academic definitions. You have "a comprehensive appraisal". I'm already lost with those two words. What is comprehensive? What is appraisal? But let's go on: "of all relevant research on a clearly formulated question, based on systematic and explicit methods". So this just summarizes all I've been telling you. It's a summary of research; there's a clear question; and there are methods. It may or may not include a meta-analysis. When you statistically analyze the results from your systematic review, it becomes your meta-analysis. What is the definition of meta-analysis? It is the application of statistical techniques to integrate (oh my God, I know a big word) the results of studies included in a systematic review. OK, learn these definitions, but you have to be able to explain them. That is what makes you a useful member of society. OK, so let's talk about other reasons why you do a meta-analysis, or why you do a systematic review that eventually leads to a meta-analysis. Canola, can you unmute yourself? Because you're going to be the one answering questions. Look at this table that I'm circling. What do you think this table tells you? The awkward part is you have to have yourself unmuted. "Sorry, I had some issues. The table is telling us about the impact of the intake of calcium on serum ferritin, looking at various studies; we can see the studies there on the left side." OK. And Shawn, do you want to add? "No, sir, I don't have anything to add." OK, good. So we're going to see that these symbols here have different, shall I say, meanings or applications.
So we're going to talk about that. But Canola is very right: different studies. So remember when I said that when you do a systematic review, you're looking at the different studies, and in a sense the assumption is that these studies are repeats of themselves, because one of the things you're looking at is reliability. And so you look at all the studies and you say, well, what's the impact of intake of calcium on serum ferritin? And we see that there's a zero here. This is a mean difference table, meaning that you're asking: people who take a high amount of calcium versus people who take a low amount of calcium, what is the difference in their serum ferritin levels? And so we find that Augustina, in 2013, found that this number is 9.76, with a confidence interval of 7.5 to 12.0. That's considered a small confidence interval, and it shows us there is high certainty that this value indicates a truth; not the truth, a truth: that if you take a high dose of calcium, your serum ferritin will increase. I'm assuming; I haven't looked at this paper. But look at this other researcher: in 2012 he did probably the same study, but he found that if you take a high dose versus a low dose of calcium, your serum ferritin actually reduces. But look at his confidence interval; it is so wide that we're not confident that this reduction of 11.8 is the truth. We'll talk about confidence intervals; I'm mindful of time. And you can see, on and on: Elia also found a negative association, with a very high level of confidence that this is the truth; and on and on. And by the time you pool these results into a meta-analysis, when you pool all these truths...
...so-and-so found this, so-and-so found that; what you see is that it looks like calcium intake is negatively associated with serum ferritin. But when you look at the confidence interval, you would conclude that we do not know, or we are not sure, whether this is the truth, because the confidence interval straddles zero. If we had time, I would ask Shawn or Canola to explain what it means when a confidence interval straddles zero for a mean difference, or straddles one for an odds or risk ratio. But basically, when it does, you cannot have confidence in that result; you have to fail to reject the null hypothesis. OK, good. So that's just an explanation so that you don't look at the table and miss the point, because this is the information. And to confirm what I just said, if you're able to read the graph, you would see that as your calcium intake increases on the x-axis, the mean difference (this should be, I believe, the mean difference in serum ferritin) actually reduces. So it kind of confirms what we are saying here. These are cut and pasted from a paper which I haven't looked at, so take that as you may. OK, so why else? We've talked about the scientific question that was being answered on the right; let's look very quickly at the left. Why do we do meta-analysis? To estimate average causal effects, which we have seen here; to identify the sources of heterogeneity in causal effects (we'll talk about that as we go on); to evaluate the degree of heterogeneity; to have a pooled prevalence (I like this) or an incidence estimate; to compare interventions that have not been evaluated in the same trials (we'll get to this); and to pool primary data. So we'll get to all of this as we go on. OK.
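The pooling the speaker describes, combining each study's mean difference weighted by its precision, can be sketched in a few lines of Python. This is a minimal fixed-effect (inverse-variance) sketch; the study numbers below are made up for illustration and are not the values from the slide's table:

```python
import math

# Hypothetical studies: (mean difference, 95% CI lower, 95% CI upper)
# These numbers are illustrative only, not taken from the actual table.
studies = [
    (9.8, 7.5, 12.0),     # narrow CI -> precise study, large weight
    (-11.8, -30.0, 6.4),  # very wide CI -> imprecise, small weight
    (-4.0, -9.0, 1.0),
]

# Inverse-variance pooling: recover each SE from the CI width,
# weight each study by 1/SE^2, then average.
weights, weighted_sum = [], 0.0
for md, lo, hi in studies:
    se = (hi - lo) / (2 * 1.96)
    w = 1.0 / se**2
    weights.append(w)
    weighted_sum += w * md

pooled = weighted_sum / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

print(f"pooled MD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
# The lecture's point: if the pooled CI straddles zero,
# we fail to reject the null hypothesis.
print("CI straddles zero" if ci[0] < 0 < ci[1] else "CI excludes zero")
```

Note how the imprecise study barely moves the pooled estimate: its wide confidence interval gives it a tiny weight, which is exactly the behavior the speaker describes when contrasting the narrow-CI and wide-CI studies.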
So here's the prevalence example. We're looking at: what is the prevalence of women who are overweight? And you can see that the researcher captured data from different countries in Western Sub-Saharan Africa; actually it's an African study, not a global study, excuse me. And I'm going to assume this is from the IHME data set. Does anyone know about the IHME data set from the University of Washington? Canola, are you aware of the work done by IHME at the University of Washington? Shawn? "Yes, I've come across it before, but I'm not very familiar with it." OK, good. So that can be a career launcher for you. And again, to prevent these awkward pauses, since this is a recording, it's best if you leave yourselves unmuted unless there's noise in your background; otherwise I will wait for you, and there would be that awkward pause. Now, it could be a career launcher because, I'll tell you what they do: they basically look at the whole world and they capture the burden of disease across all the countries in the world. It's an excellent data set. I think if you just type IHME into Google it will take you there. It's free, and there are a lot of visualizations you can pull out. Every year they publish the Global Burden of Disease in The Lancet. So if you're thinking about your final-year project: in med school, people would do KAP studies, knowledge, attitudes and practices, of women selling such-and-such when you go to the market. Oh, God, kill me already. I mean, that's good; that's a good study. But...
...if you want a really good data set, you can look at the IHME data set, and you'd be able to do really good, publishable work, because the University of Washington publishes, like I said, the Global Burden of Disease every year, and you can look at the trend from maybe 2007 or earlier all the way to 2023; how different things have changed across time. And when I said it's a career launcher: after I learned about this, I was doing a fellowship with Novartis. As you probably know, they have about three sickle cell products, and they wanted to understand what the African, not global, prevalence of sickle cell was. And many of them had not heard about this data set. I said, oh, I can find this information for you; give me a weekend. And so I was able to do the work, and of course it went on my resume. So it's stuff that you can do that people will easily be impressed with. It's an arrow in your quiver that you can pull out to show your status as an upcoming researcher. OK, so let's talk about heterogeneity. What is heterogeneity? There is an academic definition, and I really don't like definitions, so you can check it out. But basically what's happening with heterogeneity is that sometimes the studies are different. Remember we said earlier that we assume these studies are just the same study repeated over time and by different people. But that's not really true. For example, here a person does a meta-analysis looking at the impact of N95 respirators or similar masks on the occurrence of respiratory diseases. And what you can see is that they did it across regions, so there are Chinese studies but also Saudi Arabian ones, and they are across diseases.
So this question is a little bit broad, kind of wide, because someone can say the cultural practices for mask wearing, or being in a shared space, or social distancing, may be different in China versus Saudi Arabia. And so there's heterogeneity by study. Also the outcomes are different: two of them are SARS, one is COVID, one is MERS. And you'd say maybe these diseases act differently in terms of how the N95 is able to impact them. So because of heterogeneity, you could argue that you cannot have high confidence in your result. So look at the heterogeneity here. Statistically, the definition is that this is the proportion of the variability across studies, from one study to the other, divided by, if I'm correct, the total variability, across studies and within studies. Because within studies there is also variability. It gives you a sense of whether the heterogeneity is due to the studies themselves being different, or whether they differ due to random error. Again, I don't know what you guys' background is in the basics of epidemiology and statistics, but let's just move from the understanding that if your studies are different, there will be high heterogeneity and you may not have good confidence in your results. So this is 87%. And we are told that if you have heterogeneity between zero and 40%, you are fine, but as it gets higher, you get concerned. So with a heterogeneity of 87%, I'll be like: I don't really trust that these are the same studies. Now look at the second table: they are the same disease, they are the same geopolitical region. Let's pretend that the Chinese and the Vietnamese have the same cultural practices, which of course is not true. But you can also see there are differences: healthcare settings versus non-healthcare settings.
And so when you see the results, you also have an I-squared heterogeneity of 76%. And you might say, I am not very confident that there is no heterogeneity. You need the studies to be the same. How are we doing with time? Five minutes over? OK, you were supposed to tell me at 20. So should we step it up, or are we doing fine? "I think we should step it up a little." So you're understanding that your studies have to be the same. You don't just conduct a meta-analysis and say, oh, this is a meta-analysis, this is the answer. Your heterogeneity is going to tell you whether you're looking at the same studies or not, and whether we can have confidence that you have arrived at truth. OK, so someone might say, why do you talk about truth? This is research. Well, if you have research that does not tell you the truth, then you're doing a bad thing. And I mean that. I told you that this is like a philosophical thing; let's not go into that. But there's internal validity and external validity. Your internal validity has to be good, and your external validity has to be good. It's almost a moral imperative for you as a researcher to arrive at truth. OK, good. So, in terms of another reason why you do meta-analysis, and this is advanced: this is what you see where Dr. Ibrahima and I work, in the pharmaceutical industry. Oh, another thing. I was at a conference and I met an old schoolmate; he's a practicing gastroenterologist in the US. And I was like, oh my God, it's so good to see you. He was like, yeah, so good to see you. And he's like, so what do you do? I said, oh, I work in pharmaceuticals, I'm an outcomes researcher. And then we're talking and he's like, oh, so you are like a drug rep, a sales rep.
And I was very stunned for a second, because I thought that by this stage of people's careers, they would understand that it's not only sales reps that work in the pharmaceutical industry. But it seems to be a common misconception. Pharmaceutical companies do a whole lot of scientific research. They don't just make the drugs in a black box and then give them to people to sell. And the reason why I'm saying this is in response to a point that someone made: can you talk about opportunities in public health? It is this: there is a whole range of opportunities in the pharmaceutical industry that call on your talents as physicians and as young researchers. So yes, I will talk about that. But here, this is advanced, and it comes up maybe because we work in the pharmaceutical industry, and it's good for you to know what's going on. Most clinical trials are done against placebo: is this vitamin better than placebo, is this drug better than placebo, et cetera. But then on the market there are different drugs that have not been compared to each other. And so you can perform a meta-analysis and say, they didn't compare these drugs to each other, but let's use statistical methods to do that. And this table here shows you that you can have your ACE inhibitors, your ARBs (I've forgotten my clinical medicine, so pardon me), you remember the calcium channel blockers; I don't know what this one is, but good for them. And so all these drugs have been compared against placebo, but not against one another. In meta-analysis, I think it's called a network meta-analysis, or you have a matching-adjusted indirect comparison, and you can use these statistical methods to compare the drugs with each other. Lots of fascinating work goes on with this.
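The simplest version of the indirect comparison the speaker describes, comparing two drugs through their shared placebo arm, is often called a Bucher comparison. Here is a minimal sketch with hypothetical effect estimates (the numbers are invented for illustration, not the drug-class results on the slide):

```python
import math

# Drug A and drug B were each trialed only against placebo.
# Hypothetical effects vs placebo (e.g. change in blood pressure) and SEs:
d_A, se_A = -8.0, 1.5   # drug A vs placebo
d_B, se_B = -5.0, 2.0   # drug B vs placebo

# Bucher indirect comparison: A vs B through the common placebo anchor.
d_AB = d_A - d_B                       # indirect estimate of A vs B
se_AB = math.sqrt(se_A**2 + se_B**2)   # uncertainties add in quadrature
ci = (d_AB - 1.96 * se_AB, d_AB + 1.96 * se_AB)

print(f"A vs B (indirect): {d_AB:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

A useful teaching point falls out of the arithmetic: the indirect estimate is always less precise than either direct trial, because the two standard errors combine, which is why in this sketch a seemingly clear advantage of A over B ends up with a confidence interval that straddles zero.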
And last of all, of course, you can combine patients' individual data. You can ask all those researchers and say, send me your data; I'm going to combine all of it and do a meta-analysis. They are going to tell you no. So this is rare, but it does happen, except maybe when they have been asked to deposit their data by the funder; that is, "if I'm funding you, you need to put your data into this repository," so you can get the individual data. Otherwise they are going to tell you no. But you can tell them: you know what, I know the structure of your data; I'm going to write code; I'm going to send it to you; can you run it and send me back the summary? And they'll say, OK, I like you. Another thing: now that you are students, use your status as students, because the day you graduate, no one cares about you, I promise you. As a student, you can send an email to Bill Gates or whoever and say, "Dear Bill, I am a 400-level student, I am interested in XYZ, I would like to do XYZ," and they will send it to you. But once you become a doctor and say, oh, I'm a doctor... it's like, OK, good for you, goodbye, God bless, and they won't respond. So take advantage of being a student. I used to work with a big HMO in Nigeria, and while I was doing my doctorate, I reached out to the CEO; I got his email. It was a new CEO, a bright guy, a very great guy. And I said, please, I'd like to have your data for two years. And he said sure, and he sent me two years of claims data. That's not going to happen now that I'm no longer a student. So feel free: use your opportunities as students. When my mentor, my advisor, told me this, I did not understand it; but no one ever takes advice.
So, anyway, you've heard it here, whether you take it or not. OK, let's go into the nine steps. But let's pause here and see what questions there are. "How do we measure the heterogeneity of the studies?" There is a formula. When you're doing your statistical analysis, when you use your SAS, your R, your Python, whatever you have, there will be code to determine the heterogeneity, and it will give you the value. There is a statistical formula, but do you want to learn the formula by hand? I don't think so, because your computer will just give you the value. So, let's see, any other questions? If you don't have any, I'm going to pause here for one minute and see if there are any questions in the chat, or if anyone wants to unmute themselves and ask a question. OK, so let's move on; there are no questions. And I see that someone did send me a message saying this is the 20-minute alert. I actually asked that you tell me by unmuting yourself, but please do that at the next 20 minutes. "So, a question: the tables that you showed us at the beginning, and this is just me, but this is the first time I've seen something like that. I was curious: is that a representation of what the IHME data would look like?" No; which tables? "This one." No, no. I meant that the data used to do this analysis, I believe, comes from the IHME data. But I may be wrong, because I can see that there are dates here, right? So maybe I'm wrong. In fact, I think I'm wrong. Anyway, I think I'm more wrong than I'm right.
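For the curious, the formula the speaker says the software applies is Cochran's Q, with the I-squared statistic derived from it: Q sums each study's weighted squared deviation from the pooled estimate, and I² = max(0, (Q − df)/Q) estimates the share of variability beyond chance. A minimal sketch, with made-up study estimates:

```python
import math

# Hypothetical study results: (effect estimate, standard error).
# Illustrative numbers only, not from the masking meta-analysis on the slide.
studies = [(0.8, 0.20), (1.3, 0.25), (0.4, 0.15), (1.6, 0.30)]

# Inverse-variance weights and the fixed-effect pooled estimate.
weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)

# Cochran's Q: weighted squared deviations from the pooled estimate.
Q = sum(w * (est - pooled)**2 for (est, _), w in zip(studies, weights))
df = len(studies) - 1

# I^2: the proportion of total variability beyond what chance (df) explains.
I2 = max(0.0, (Q - df) / Q) * 100

print(f"Q = {Q:.2f}, I^2 = {I2:.1f}%")
```

With these invented estimates the studies disagree quite a bit, so I² lands in the 80%+ range the lecture flags as concerning; if all four estimates were close to one another, Q would shrink toward df and I² would drop toward zero.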
So it may not come from IHME, but the point about IHME is noted, and if we had time I could show you the IHME data. I don't think we're going to have time, because I talk a lot; I talk too much, actually, as we've seen so far. "I'm sorry, I just wanted to ask: you mentioned that really comprehensive review on breast cancer. Maybe along the line you could share it?" OK, I will look for it once I stop sharing my screen, so that you don't see my WhatsApp chats and the things that I see. Please raise your hands. Someone asked: is subgroup analysis compulsory? That's a good question, and the answer is yes and no, because again, and this ties into the, maybe not the philosophical but the academic thing: what do you want to know? You don't just do a meta-analysis because you want to; it's about what you want to know. And now you're going to take me down a rabbit hole. Shawn, do you know of the Bradford Hill criteria? "No, sir, I do not, sir." How about Canola? So that's something else you should write down; I'm plugging all these things that would help you become better. Canola, do you know about the Bradford Hill criteria? Let's see. So, the point I was trying to make: should you have subgroups? OK, so here's the thing, and here's what I was talking about with the Bradford Hill criteria: Bradford Hill has nine different criteria. And here's the thing: some people argue that Bradford Hill did not come up with the criteria, that the criteria were come up with by a woman. So that's another case of men stealing a woman's glory, since 1884 or something. But one of the things he asks is: how do you arrive at causation?
And the reason I started by telling you that science came from philosophy is that almost everything we do in scientific research, especially public health research, is trying to arrive at causation, and the idea of causation comes from trying to arrive at truth. How do you know truth? You can establish truth if you can say that A causes B; if A causes B, that is a truth. So philosophers started talking about causation: what is causation, and how can we say A causes B? For you to arrive at truth, you must be clear on the claim that A causes B, so causation is a really big discussion point for philosophers. Now Bradford Hill comes along, by the time science has emerged fully from philosophy, and says there are nine ways you can assess causation, and one of them is biological plausibility. So, for example, suppose you want to know how tall your child will be, and somebody says, well, if the mother eats snails during her pregnancy, the child is going to be short. You say: that's BS. Sorry; BS here is Bachelor of Science, right? You have MBBS, so you say: no, that's BS, we are MBBS. Why? Because there is no biological plausibility. What do we know about snails during pregnancy that would affect whether a child is tall or short? If you don't have that biological linkage, you cannot arrive at causation; you cannot arrive at truth. Now, to answer your question about subgroup analysis: a disease is a disease, and people are people. Somebody will say, for example, we want to look at whether milk consumption leads to breast cancer. And someone says, let's look at it in Hausa people, then in Yoruba people, then in Igbo people, and someone else says, wait, hold on a second.
There is no biological plausibility there; those are social constructs, geopolitical constructs. A human being is a human being irrespective of where, or to what tribe, they were born, so you don't care about that kind of subgroup. But somebody else might say: how about we look at breast cancer in people of African descent versus Eurocentric, so-called white, people? Then you say, well, maybe there are different ways in which our genes manifest diseases, so there is biological plausibility there. So the answer to the question, are subgroups important in your meta-analysis, is: it depends. Is there a biological plausibility that differentiates one subgroup from the other? If yes, then maybe you should; if there is none, then maybe you should not. That's a long-winded answer to the question that was asked, but let's move on because of time; we're going to end at 8:30 and I'm going to keep to that. OK, this is the boring part, so let's grind. There are nine steps to doing a systematic review and meta-analysis. I hope Ibrahima will forgive me for saying this is the boring part. The first step is to identify the need for a review. So let's circle back; something I've learned to be good at is tying the things I tell you at the beginning to the things I say in the middle and at the end. You were wondering why I was talking earlier about publication. Look: identify the need for a review. You want to do a meta-analysis; what is the need for this meta-analysis or systematic review?
If you are a reasonable professor and a student comes and says, I want to do this because I want to publish, you're going to say: really, just because you want to publish? There is no need for a review if the only need is to publish. But we put it there because we understand there are practical reasons why people do reviews: they need to show a body of work as they progress in their careers. Keeping it strictly academic, though: is there a need for a review? Somebody might say, I'm interested in caterpillars, I want to study something about caterpillars. You say: come on, I know you're interested in caterpillars, but is there a need for that? Maybe not; do it if you want to, but when you think about the work you want to do, look at the problems in your community. Remember what I said about the almost moral imperative: research is for decision making, decision making is for population health, and population health means you improve the longevity and quality of life of the people you serve. How do you become a professor in public health with a body of work that does not improve the health of the population? It's absurd. In your external reading, you should read the work of Paulo Freire, who wrote Pedagogy of the Oppressed. It will give you a sense of what public health is for. If you read my bio, you would see that I talk about public health as a foundational principle for creating the world that we want; then you begin to understand why you do research. But I'm straying widely from the topic. So: you identify the need for a review, and then you develop a protocol.
Developing a protocol is a little bit exciting if you like to write, if you like to think. People say you write so you can hear yourself think, and developing a protocol helps you check whether you're thinking properly about what you're doing. Then you develop your literature search strategy. Remember, the definition of a systematic review is that it has a defined methodology and it is systematic, so your literature search has to follow a systematic, comprehensive strategy. When you're done, it's like throwing a net into the ocean: you grab all the fish and plastic, because there is now more plastic than fish in the oceans. If any of you is interested in climate change, you could go to your rivers and count the amount of plastic there for your final-year study. Then you say, we don't want plastic, we want fish: you keep only the studies that meet your clearly defined criteria, you extract the data, you do the quality and bias assessment we talked about, and you do the data synthesis. You could stop at the systematic review, and then of course you create your report and recommendations and apply the evidence to practice. Research is for decision making, decision making is for population health, and the goal of population health is to improve longevity and quality of life; keep that at the back of your mind. I'm happy to do the academics, but there is also the imperative of why you went to school and became a doctor: to apply evidence to practice, to improve population health. Shane, do you know what it is called when you increase quality of life and longevity?
There is a health metric, or several health metrics, that capture those two things together. Do you know what they are called? Can you give me one example? If you don't know, one of the best answers you can give is that you don't know. Someone asked me how to calculate heterogeneity and I said I have no idea. Nobody is going to take my PhD from me for that; you're not going to say, OK, now you graduated in 2019 instead of 2008 because you don't know. Do you know? No, sir. OK, good. So: you may have heard of QALYs and DALYs, quality-adjusted and disability-adjusted life years. There are many such metrics; everybody keeps coming up with new ones and trying to improve on them, but QALYs are more widely used, and they keep DALYs for Africa and developing countries. So you have to rise up and say, no, we don't want DALYs, we want QALYs; or, you know, make DALYs great, and then everyone will jump back to DALYs. OK, so: identify the need for a review. This is something I really like: the PICO framework. You have probably seen it. You need a definite research question, and one way to formulate one is the PICO framework: P, population or participants; I or E, intervention or exposure; C, comparator; O, outcome; and T, timing. So you're back on; I'll ask rapid fire. If you know, you know; if you don't, say you don't know and we keep it moving. We have 20 more minutes. Look at this: "Does regular exercise lower the risk of stomach cancer among adults?" Shane, tell me, which part is the P, which is the I or E, which is the C, which is the O, which is the T in this sentence? Let's go quickly. Regular exercise, I think it's the comparator.
Regular exercise is the comparator? No, regular exercise is an intervention. If someone comes to you and they are overweight, what do you say? You say: I now pronounce you ten push-ups every morning. That's your intervention. What about risk of stomach cancer? It's the outcome. Nice, so you have 50% success. And adults? Adults is the population of participants. Exactly. If the question only said "risk of stomach cancer", you'd ask: among what? Chickens? Goats? Children? No, among adults. OK, is Tenny still there? She said she was having connection issues. OK, Tenny, you are on. Can you tell us how you would improve this question to include timing? You could add a time period. Go ahead, add it. "Does regular exercise lower the risk of stomach cancer among adults over a period of five years?" Good. Let's improve on that. Imagine that between 2015 and 2022, yoga exploded in Nigeria. You would say something like: since 2015, has the introduction of yoga reduced the risk of stomach cancer among adults who practice yoga? The reason timing becomes important is that there are two things to watch for. One is what you call a cohort effect. There is this beautiful article by Geoffrey Rose; the article is so good they republish it almost every year. I could send the link to the team. Yes, we've used another 20 minutes; these are the final 20 minutes, so we're going to run. He notes in that paper that some years ago, and it could be 60 years ago now, duodenal ulcers were very, very prevalent; a generation later,
for some reason we have no idea about, duodenal ulcers are no longer very prevalent, and we don't know why. That's an example of a cohort effect. So for your question you might say, OK, let's limit it to people who grew up in this period, because we want to reduce the cohort effect. Another reason for timing: maybe at some point they were treating cancer only with the chemotherapies; now there are targeted therapies, so you want to look only at the period when targeted therapies existed. OK, let's move quickly: check if a review has been done before and whether an update is needed. This is a very good point, because you can ask: is this review needed? Well, what if somebody has already answered this question, say, does the risk of AIDS differ by sex among HIV patients? Look at the intervention: highly active antiretroviral therapy and so on. Maybe their search stopped at 2014 or 2016, and new HIV drugs have come on the market since. Another quick example: I no longer know all the drugs used to treat diabetes, but I was very shocked when I came to the US at how much more advanced the treatment landscape for diabetes is than what we have. My father-in-law has an insulin pump attached to his abdomen and can read his insulin at any point, and the app is tied to my wife's phone, so she can see his readings. It just blows my mind: why don't we have these same treatments? Remember that population health is about improving longevity and quality of life; making sure these innovative medicines can be used in low-income countries is a goal you could set out to accomplish with your life and your work.
But again, if a review has been done, you might want to update it, maybe with a broader question or a longer time window, and then look at the question of subgroups: is there a subgroup of interest? If your question is on pregnant women, someone might say, why don't we compare women who started ANC in their first trimester versus those who started in their third trimester? OK, so you develop the protocol. Think about your study design, then write it down; writing down your protocol helps you see whether there are gaps in your thinking about the research. And one thing stakeholders want is that you register. We say "consider registering", but the truth is you should register, not just consider it. You do it so that people can know whether you started out to answer question X but ended up answering question Y; they want to know you are not fishing for an answer. And of course, you register with PROSPERO; it's a site where you can register your reviews. Then there is something called PRISMA; I told you this was the boring stuff. PRISMA is another set of guidelines: there are these organizations that tell you what best practices are. Throughout my time in medical school, and it's been a long time, though I doubt things have changed much, the closest I came to best-practice dissemination was in obstetrics and gynaecology at the University of Lagos, where an algorithm for the treatment of patients with pre-eclampsia was posted on the walls. These organizations say: if you want to do a systematic review and meta-analysis, this is the checklist.
Someone won a genius award for creating a checklist for surgery, and of course it's a stroke of genius. These organizations create best-practice checklists, and when you want to write a protocol or do a systematic review, these are the things you follow. PRISMA has a website, and you use it; you will see "if registered, provide the name", so they talk about PROSPERO as well. These are the steps you take when writing your protocol, and on and on it goes; it also covers your search strategy. You can always look through the checklist in your own time, but let's talk about the search strategy, because that's really important. So: develop your search strategy. Remember, it's all about the PICO. You seem to be a very educated bunch, and you already know how to conduct searches, but something I will point out is that when you do your searches, you want to use your PICO to search for your papers. Let's take exercise and stomach cancer. If you just type "exercise and stomach cancer" into Google, a lot of papers will come out, but there is a systematic way. For one, use the MeSH terms; we won't go into what a MeSH term is here, but exercise is indexed under "motor activity". If there are studies on the effect of motor activity on stomach cancer, you will miss them if you only search "exercise". So you type "motor activity" into your search bar; you find "physical activity" as another word for exercise; and stomach can be stomach, gastric, upper GI and so on, while cancer can be cancer or neoplasms.
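The synonym idea above boils down to a standard pattern: OR the synonyms within each concept together, then AND the concepts. A minimal sketch of assembling such a PubMed-style boolean string (the helper name and the exact term lists are illustrative, not a prescribed vocabulary):

```python
def build_query(concepts: dict) -> str:
    """Build a boolean search string: OR within a concept, AND across concepts."""
    groups = []
    for terms in concepts.values():
        # Quote each synonym and join with OR inside parentheses
        groups.append("(" + " OR ".join(f'"{t}"' for t in terms) + ")")
    # Combine the concept groups with AND
    return " AND ".join(groups)

concepts = {
    "intervention": ["exercise", "physical activity", "motor activity"],
    "outcome": ["stomach cancer", "gastric cancer", "gastric neoplasms"],
    "population": ["adults"],
}
query = build_query(concepts)
```

The resulting string can be pasted into the PubMed search bar; in a real search you would also attach field tags such as `[MeSH Terms]` or `[tiab]`, which this sketch omits.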
So you get all these synonyms for your PICO and put them in a table like this. Exercise is your intervention, stomach cancer is your outcome, adults are your population, and under each you capture the different words: exercise, physical activity, running, weight lifting, yoga, bicycling, whatever applies. Then you create your search strategy. The most common database is PubMed, although these days people also use Google Scholar a lot, because the market has gotten into science, and sometimes it's actually better. This is recorded, so I'm on camera saying it: everyone is using Google Scholar. So you put this into your PubMed search, and that's where you get your list of studies. The next step is to upload your citations into a system. You can upload into Excel: from PubMed you can download your results as a CSV file, which is really an Excel file. There is also something called Rayyan, which I like; it was created in Qatar, and it's really good. You can look at the different studies there, read the abstracts, and decide whether each study is related to the specific question you framed with your PICO. This slide is a cut-out of someone's studies loaded into an Excel sheet. But I'm going to pause here. Someone said Rayyan; OK, that means you already use Rayyan, good. And in the Q&A, someone wants to ask a question on meta-analysis. Instead of typing "please, I want to ask a question on meta-analysis", why don't you type the question directly, so we keep the conversation to one level?
That's something you do to assert yourself when you get into the corporate world: don't ask for permission to ask, just ask. That's not an issue; it's advice I'm giving you as, let's say, your senior brother. It's good to learn that. So you put all of this in. Oh, yes, please go ahead and interrupt. Question: is meta-analysis a form of experimental research? No, it's not. In experimental research you are doing something, you're experimenting; here you are really just collecting data together and analyzing it. So can we call it non-experimental research? I don't know, because, like I said, I don't like definitions. If it's not experimental, then clearly it's non-experimental, but what is the point of giving it a name? Is this a question someone is going to ask in your exam? If your exam asks you to name two types of non-experimental research, I guess you can call it non-experimental; otherwise I wonder why we have to classify it at all. What's the implication? I have no idea. So maybe the short answer is yes. OK, thank you very much, sir. You're welcome. So let's move. This is what Zotero looks like; I used Zotero once. By the way, one of your speakers, I forget his name, I've worked with him once; an excellent guy, really great guy, and we used Zotero. Tell him I said hi; we lost contact after he went to do his master's.
There is also a flow chart. Should we skip this? No, but we will come back to it, because I feel it's a little misplaced here. Let's go on. After you've gotten all your studies into your Excel, you apply your inclusion criteria: this is our outcome, this is our population. For example, there are in vivo studies and in vitro studies. If a study was looking at exercise and stomach cancer in adult mesenchymal cells, that is not real adults; they put it into a test tube, so that falls under our exclusion criteria and we will not review it. So when you put all your studies in your Excel, you judge them by your inclusion and exclusion criteria. Remember also that a systematic review may be limited to RCTs, but not always; you might say, for exclusion: we don't want animal studies, we don't want case series, we don't want observational studies, we only want RCTs, and any study that is not an RCT will be excluded. Study selection is done in two rounds, and typically two people do it blindly. This is why Rayyan is good: two people can access the same records and judge, each on their own, whether a study meets the inclusion or exclusion criteria according to the PICO. And this is why a protocol is really good: the protocol makes you iron out all of these inclusion and exclusion decisions in advance, instead of just doing a PICO on your table and moving forward.
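The two-reviewer blind screening described above can be sketched as a small reconciliation step: agree on the double inclusions, and flag every disagreement for a third reviewer or a discussion. The function and the study IDs below are hypothetical, purely to show the mechanics:

```python
def reconcile(screen_a: dict, screen_b: dict):
    """Compare two blind screens of the same studies.

    Each dict maps a study id to 1 (include) or 0 (exclude).
    Returns (studies both reviewers included, studies they disagree on).
    """
    agreed, disputes = [], []
    for study in screen_a:
        if screen_a[study] == screen_b[study]:
            if screen_a[study] == 1:
                agreed.append(study)       # both said include
        else:
            disputes.append(study)         # goes to a third reviewer or discussion
    return agreed, disputes

reviewer_one = {"study1": 1, "study2": 0, "study3": 1}
reviewer_two = {"study1": 1, "study2": 0, "study3": 0}
agreed, disputes = reconcile(reviewer_one, reviewer_two)
```

Here `study3` would be the tie the speaker describes: reviewer one said include, reviewer two said exclude, so it goes to reconciliation.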
So look at reviewer one and reviewer two giving different scores. You can see here that zero means don't include and one means include; reviewer one said include, reviewer two said don't include. After they have screened blind to each other, they can either get a third person to break the tie or sit down and discuss. Look at this: reviewer three, or the reconciled column, says let's call it one. So we've talked about the reconciliation of studies, and in round one you were only reading the abstracts. In round two, after taking out the ineligible studies, you go to the full-text review. Now you read the whole paper and do two things: you continue to exclude, because maybe the abstract alone was not enough to judge, and you also use the full text for your data extraction. Sometimes people publish different studies from the same dataset, or publish the same study from the same dataset in different ways. If they are genuinely different studies, with different populations, that's fine; but if it's the actual same dataset and the same question, you can consolidate the data and say: in this paper they did not answer XYZ, but in this other paper they did, so let's merge instead of excluding. That's something you do when reading the full study. Now we come to the flow chart I said was misplaced. You have to create a flow chart: on the left you can see where you started from, say 500 records from Google Scholar and 1,500 from PubMed, then how many duplicates you removed, how many records you screened, how many reports you assessed, all the way down to the final number included. It's best practice to do this so that your readers, your audience, your stakeholders can follow you. OK, we have five more minutes. Remember, you're reading the papers in full so that you can extract.
What are you extracting? Look here: the author, the year, the reviewer, the country where the study was done; sometimes you want to know whether a study was done in Europe, the US, or Africa. Then the population, your PICO, the timing (what year the study was done), the sample size, which is also part of your population, what the cases and the controls were, and demographic and clinical variables: age, gestational period, and so on. You get all the information you will need to do your review or meta-analysis. What about missing data? This slide is a chart you can keep in mind; it is really about how to convert data from one form to another, not about missing data per se. For missing data, you can write to the authors and say, we noticed that this data was missing, can you help us understand? Beyond that, there's not much you can do; it's already published. OK, study quality. This is where we will spend a few quick but really good minutes. We talked about study quality earlier: a good study should not be given the same weight as a bad study. These are the different tools you can use; a lot of people use the Cochrane tools. Cochrane has a tool for randomized trials, which is best practice, and also one for observational studies. How do you capture bias in a study? Let's look quickly at observational studies rather than RCTs. You want to look at bias from confounding; at whether the way they selected the participants was sound; at how they classified interventions. The first and only systematic review I wrote looked at tea and how it affects cardiovascular disease, and of course the question is: what do you mean by tea?
Do you mean tea boiled? One cup of tea, a kettle of tea, six cups of tea, tea in the morning, tea at night? These are the different things that can bias your results. Then there is selective reporting of outcomes and their measurement: some people say they looked at cardiovascular outcomes, but you find that triglycerides are missing, and you ask why, since that is a cardiovascular outcome. When you look at these studies, you have to score them. Look on the left here: for this particular paper, the reviewers think the randomization was questionable and the blinding of participants was poorly done, and so on through blinding of outcomes and all the other domains that Cochrane, or whichever tool you're using, wants you to assess, giving each a plus or a minus. You can do the same across all the studies and say, for example, 85% of our studies had very strong prevention of selection bias, whereas only 10 or 15% had very good blinding. When you do all of that, it gives you a sense of overall quality. But it is a little subjective: Shane might say that means we have overall good quality, whereas Tenny might say that, with this negative finding on blinding, our studies are maybe moderate or not so good. It's subjective, yes, but that's a judgment you have to make. OK, let's see: risk of bias. This is exactly what we're talking about for the same study: after you're done, you look at everything we've assessed here and conclude that the risk of bias for this study was low, moderate, or high.
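The domain-by-domain percentages mentioned above (say, 85% strong on selection bias but only 10 to 15% on blinding) are just tallies over the per-study ratings. A small sketch of that tally; the function name, the domains, and the ratings are made-up examples, not data from the talk:

```python
def domain_summary(judgements: dict) -> dict:
    """Percentage of studies rated low risk of bias in each domain.

    judgements maps a domain name to a list of per-study ratings,
    each "low", "high", or "unclear".
    """
    return {
        domain: round(100 * ratings.count("low") / len(ratings), 1)
        for domain, ratings in judgements.items()
    }

ratings = {
    "randomisation": ["low", "low", "low", "high"],
    "blinding": ["low", "high", "unclear", "high"],
}
summary = domain_summary(ratings)
```

The final low/moderate/high call for the body of evidence still stays subjective, exactly as the speaker says; the tally only makes the inputs to that judgment explicit.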
So I'm going to pause here; we're going to call time and leave room for questions, and maybe I can go into the meta-analysis part, because this is a lot to teach. Remember that your systematic review is a summary of results; it usually has no statistical analysis. You will be able to say: we reviewed the results, this is what our findings look like, and we have high confidence, because not only did we find XYZ, we also found that 70 or 80% of those studies were very high quality. So systematic reviews are a little bit subjective, and that's why people want to go a step further and say: let us pool all of this and put numbers to our findings. Then you can say, for example, that we found calcium reduces serum ferritin, based on the results of 10 papers, 80% of which were high-quality studies. OK, let's call time. There is still more to do on meta-analysis, but I'm going to call time, and if you have to go, let's take questions and wrap it up. Maybe in future we send you the slides and you can follow the meta-analysis portion on your own; that's up to you. Any questions? Are you there? Yes, we can hear you. OK, so I said we have time. Two people have their hands raised, so let's have them ask their questions. Can you unmute them? Jeremiah, Daniel, you can unmute and ask your question.
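The pooling step the speaker defers to the meta-analysis portion is, in its simplest fixed-effect form, an inverse-variance weighted average of the study effects. A minimal sketch under that assumption (the numbers are illustrative; real analyses would use a dedicated package and often a random-effects model instead):

```python
import math

def pool_fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI.

    effects:   per-study effect estimates (e.g. log risk ratios)
    variances: per-study variances of those estimates
    """
    weights = [1.0 / v for v in variances]          # bigger studies weigh more
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # standard error of the pooled value
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Two equally precise studies straddling 0.5 pool to 0.5
pooled, ci = pool_fixed_effect([0.4, 0.6], [0.04, 0.04])
```

This is the sense in which a meta-analysis "puts numbers to" what a systematic review could only summarise narratively.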