

A simple, graphical approach to comparing multiple treatments

Authors

  • Brennan S Thompson
  • Matthew D Webb

Abstract
Summary: We consider a graphical approach to comparing multiple treatments that allows users to easily infer differences between any treatment effect and zero, and between any pair of treatment effects. This approach makes use of a flexible, resampling-based procedure that asymptotically controls the familywise error rate (the probability of making one or more spurious inferences). We demonstrate the usefulness of this approach with three empirical examples.
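The resampling-based procedure in question builds on stepwise multiple-testing methods such as Romano and Wolf (2005), listed in the references below. As a hedged illustration of the general idea only, not the authors' algorithm, here is a minimal Python sketch of a single-step bootstrap "max-t" construction: simultaneous confidence intervals for all pairwise differences in group means, whose joint coverage is what delivers familywise error control. The function name and the i.i.d.-within-group resampling scheme are assumptions made for this sketch.

    import numpy as np

    rng = np.random.default_rng(0)

    def maxt_simultaneous_cis(y, g, n_boot=999, alpha=0.05):
        """Hypothetical single-step max-t bootstrap: simultaneous CIs for all
        pairwise differences in group means (treatments vs. control and vs.
        each other), with asymptotic familywise error control."""
        groups = list(np.unique(g))
        means = np.array([y[g == k].mean() for k in groups])
        ses = np.array([y[g == k].std(ddof=1) / np.sqrt(np.sum(g == k))
                        for k in groups])
        m = len(groups)
        pairs = [(i, j) for i in range(m) for j in range(i + 1, m)]

        # Bootstrap the distribution of the largest studentized contrast.
        max_t = np.empty(n_boot)
        for b in range(n_boot):
            bmeans = np.empty(m)
            bses = np.empty(m)
            for idx, k in enumerate(groups):
                yk = y[g == k]
                res = rng.choice(yk, size=yk.size, replace=True)
                bmeans[idx] = res.mean() - yk.mean()  # centre at sample mean
                bses[idx] = res.std(ddof=1) / np.sqrt(yk.size)
            max_t[b] = max(abs(bmeans[i] - bmeans[j]) / np.hypot(bses[i], bses[j])
                           for i, j in pairs)

        # One critical value shared by every contrast gives joint coverage.
        crit = np.quantile(max_t, 1 - alpha)
        return [(groups[i], groups[j],
                 means[i] - means[j] - crit * np.hypot(ses[i], ses[j]),
                 means[i] - means[j] + crit * np.hypot(ses[i], ses[j]))
                for i, j in pairs]

    # Usage on simulated data: a control group and two treatments.
    y = np.concatenate([rng.normal(0.0, 1, 100),
                        rng.normal(0.5, 1, 100),
                        rng.normal(0.8, 1, 100)])
    g = np.repeat(["control", "T1", "T2"], 100)
    for gi, gj, lo, hi in maxt_simultaneous_cis(y, g):
        print(f"{gi} - {gj}: [{lo:.3f}, {hi:.3f}]")

Any interval that excludes zero flags a significant difference, and because all intervals share the same bootstrap max-t critical value, every inference can be read off a single display, which is the kind of graphical summary the article develops.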

Suggested Citation

  • Brennan S Thompson & Matthew D Webb, 2019. "A simple, graphical approach to comparing multiple treatments," The Econometrics Journal, Royal Economic Society, vol. 22(2), pages 188-205.
  • Handle: RePEc:oup:emjrnl:v:22:y:2019:i:2:p:188-205.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/ectj/utz006
    Download Restriction: Access to full text is restricted to subscribers.

    As access to this document is restricted, you may want to look for a different version below or search for one.


    References listed on IDEAS

    1. Anderson, Michael L, 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Department of Agricultural & Resource Economics, UC Berkeley, Working Paper Series qt15n8j26f, Department of Agricultural & Resource Economics, UC Berkeley.
    2. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    3. William C. Horrace & Peter Schmidt, 2000. "Multiple comparisons with the best, with economic applications," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 15(1), pages 1-26.
    4. Philip Oreopoulos & Daniel Lang & Joshua Angrist, 2009. "Incentives and Services for College Achievement: Evidence from a Randomized Trial," American Economic Journal: Applied Economics, American Economic Association, vol. 1(1), pages 136-163, January.
    5. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    6. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2022. "Multiple Testing and the Distributional Effects of Accountability Incentives in Education," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 40(4), pages 1552-1568, October.
    7. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.
    8. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    9. White, Halbert, 1980. "A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity," Econometrica, Econometric Society, vol. 48(4), pages 817-838, May.
    10. Davidson, Russell & MacKinnon, James G., 2010. "Wild Bootstrap Tests for IV Regression," Journal of Business & Economic Statistics, American Statistical Association, vol. 28(1), pages 128-144.
    11. Joseph P. Romano & Michael Wolf, 2005. "Exact and Approximate Stepdown Methods for Multiple Hypothesis Testing," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 94-108, March.
    12. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    13. Jiaying Gu & Shu Shen, 2018. "Oracle and adaptive false discovery rate controlling methods for one‐sided testing: theory and application in treatment effect evaluation," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 11-35, February.
    14. Christopher J. Bennett & Brennan S. Thompson, 2016. "Graphical Procedures for Multiple Comparisons Under General Dependence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(515), pages 1278-1288, July.
    15. Soohyung Lee & Azeem M. Shaikh, 2014. "Multiple Testing And Heterogeneous Treatment Effects: Re‐Evaluating The Effect Of Progresa On School Enrollment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 612-626, June.
    16. Karthik Muralidharan & Venkatesh Sundararaman, 2011. "Teacher Performance Pay: Experimental Evidence from India," Journal of Political Economy, University of Chicago Press, vol. 119(1), pages 39-77.
    17. MacKinnon, James G. & White, Halbert, 1985. "Some heteroskedasticity-consistent covariance matrix estimators with improved finite sample properties," Journal of Econometrics, Elsevier, vol. 29(3), pages 305-325, September.
    18. Alwyn Young, 2019. "Channeling Fisher: Randomization Tests and the Statistical Insignificance of Seemingly Significant Experimental Results," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 134(2), pages 557-598.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sebastian Jobjörnsson & Henning Schaak & Oliver Musshoff & Tim Friede, 2023. "Improving the statistical power of economic experiments using adaptive designs," Experimental Economics, Springer;Economic Science Association, vol. 26(2), pages 357-382, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    3. Young, Alwyn, 2019. "Channeling Fisher: randomization tests and the statistical insignificance of seemingly significant experimental results," LSE Research Online Documents on Economics 101401, London School of Economics and Political Science, LSE Library.
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Agostinelli, Francesco & Avitabile, Ciro & Bobba, Matteo, 2021. "Enhancing Human Capital in Children: A Case Study on Scaling," TSE Working Papers 21-1196, Toulouse School of Economics (TSE), revised Oct 2023.
    6. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2022. "Multiple Testing and the Distributional Effects of Accountability Incentives in Education," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 40(4), pages 1552-1568, October.
    7. Belot, Michèle & James, Jonathan & Spiteri, Jonathan, 2020. "Facilitating healthy dietary habits: An experiment with a low income population," European Economic Review, Elsevier, vol. 129(C).
    8. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "A model of multiple hypothesis testing," Papers 2104.13367, arXiv.org, revised Apr 2024.
    9. Azevedo E Castro De Cardim, Joana & Amaro Da Costa Luz Carneiro, Pedro Manuel & Carvalho, Leandro S. & De Walque, Damien B. C. M., 2022. "Early Education, Preferences, and Decision-Making Abilities," Policy Research Working Paper Series 10187, The World Bank.
    10. Sandner, Malte & Cornelissen, Thomas & Jungmann, Tanja & Herrmann, Peggy, 2018. "Evaluating the effects of a targeted home visiting program on maternal and child health outcomes," Journal of Health Economics, Elsevier, vol. 58(C), pages 269-283.
    11. Cygan-Rehm, Kamila & Karbownik, Krzysztof, 2022. "The effects of incentivizing early prenatal care on infant health," Journal of Health Economics, Elsevier, vol. 83(C).
    12. Islam, Asad & Kwon, Sungoh & Masood, Eema & Prakash, Nishith & Sabarwal, Shwetlena & Saraswat, Deepak, 2020. "When Goal-Setting Forges Ahead but Stops Short," GLO Discussion Paper Series 526, Global Labor Organization (GLO).
    13. Joana Elisa Maldonado & Kristof De Witte & Koen Declercq, 2022. "The effects of parental involvement in homework: two randomised controlled trials in financial education," Empirical Economics, Springer, vol. 62(3), pages 1439-1464, March.
    14. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    15. Timothy B. Armstrong & Shu Shen, 2013. "Inference on Optimal Treatment Assignments," Cowles Foundation Discussion Papers 1927RR, Cowles Foundation for Research in Economics, Yale University, revised Apr 2015.
    16. John A. List & Azeem M. Shaikh & Atom Vayalinkal, 2023. "Multiple testing with covariate adjustment in experimental economics," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(6), pages 920-939, September.
    17. Arouna, Aminou & Michler, Jeffrey D. & Lokossou, Jourdain C., 2021. "Contract farming and rural transformation: Evidence from a field experiment in Benin," Journal of Development Economics, Elsevier, vol. 151(C).
    18. Georg F. Camehl & C. Katharina Spieß & Kurt Hahlweg, 2019. "Short- and Mid-Term Effects of a Parenting Program on Maternal Well-Being: Evidence for More and Less Advantaged Mothers," SOEPpapers on Multidisciplinary Panel Data Research 1062, DIW Berlin, The German Socio-Economic Panel (SOEP).
    19. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    20. Rodríguez-Planas, Núria, 2017. "School, drugs, mentoring, and peers: Evidence from a randomized trial in the US," Journal of Economic Behavior & Organization, Elsevier, vol. 139(C), pages 166-181.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:emjrnl:v:22:y:2019:i:2:p:188-205.. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Oxford University Press (email available below). General contact details of provider: https://edirc.repec.org/data/resssea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.