By: Dr. Ronald C. Jones
Quantitative researchers face two primary challenges regarding the minimum sample size: correctly calculating the appropriate sample size and enlisting the required number of participants. Jones and Lentz (2013) stated, “The challenge of the doctoral quest lies in how to look at the options available to decide the most appropriate path to travel” (p. 64). By reviewing how Jones (2013) overcame sample size challenges, future doctoral scholars might find the journey a little less daunting.
The Internet contains numerous sample size calculators, yet few include instructions clear enough for a doctoral student to determine the minimum sample size with confidence. Jones (2013) used a simple sample size calculator that indicated 26 participants would fulfill the requirement; however, the university methodologist astutely challenged whether such a small sample could produce statistically significant results. Through the encouragement of Dr. Cheryl Lentz, Jones’ doctoral committee chair, Jones contacted an expert statistician from a local university. The statistician recommended the G*Power 3.1.5 sample size calculator, developed by Faul, Erdfelder, Buchner, and Lang (2009), and provided instruction on its use. The G*Power 3.1.5 calculator is simple, yet statistically sophisticated in that the sample size is specific to the planned testing procedures (Faul et al., 2009). With Pearson’s product-moment correlation coefficient (r) and Spearman’s rho designated as the statistical testing procedures, calculating the appropriate and justifiable sample size of 67 became unproblematic. Numerous scholarly studies indicated one should not expect greater than a 20% survey response rate. Recognizing that obtaining the minimum sample size is a function of response rate, Jones calculated the target population by dividing the minimum sample size by the expected response rate (67 / .20 = 335). To align with the geographic region, Jones settled on a target population of 348.
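The two calculations above can be sketched in a few lines of Python. The Fisher z transformation used here is a standard approximation to the exact bivariate-normal correlation test that G*Power performs, and the effect size, alpha, and power shown are illustrative assumptions, not the parameters Jones (2013) reported:

```python
from math import atanh, ceil
from statistics import NormalDist  # standard library, Python 3.8+

def n_for_correlation(r, alpha=0.05, power=0.80, tails=2):
    """Approximate minimum n to detect a Pearson correlation of size r,
    via the Fisher z transformation: n = ((z_a + z_b) / atanh(r))^2 + 3."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / tails)
    z_beta = z.inv_cdf(power)
    return ceil(((z_alpha + z_beta) / atanh(r)) ** 2 + 3)

def target_population(min_sample, response_rate):
    """Invitations needed so expected responses reach the minimum sample."""
    return ceil(min_sample / response_rate)

# Assumed inputs for illustration: a medium effect (r = .30),
# alpha = .05, and power = .80.
n = n_for_correlation(0.30)
invites = target_population(67, 0.20)  # 67 / .20 = 335, as in the text
```

The approximation lands within a few participants of G*Power’s exact result for the same inputs; the target-population step is the same division the text describes.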
The second challenge involved collecting data from at least 67 participants. Jones (2013) chose the Multifactor Leadership Questionnaire (MLQ) to serve as the survey instrument. Mind Garden, Inc., the licensor of the MLQ, offers doctoral researchers the use of its Transform Online Survey system (Bass & Avolio, 2004). The purpose of Jones’ study was to examine the relationship between leadership style and financial performance. To test the correlation, Jones needed to identify participants while maintaining their confidentiality. To use the Transform Online Survey system, participants must create a username and password to enter the survey. Jones’ doctoral chair provided frequent warnings regarding response rates from busy organizational leaders. Doubts began to emerge regarding the willingness of an adequate number of leaders to complete the survey through the online venue.
For Jones (2013) to proceed with confidence, obtaining assurances from Mind Garden, Inc. of an expected response rate was a must, yet no such assurances were available. Jones faced a quandary and needed a pathway to overcome the potential obstacle. Deciding to press forward without any assurance of obtaining the minimum sample size, Jones proceeded to the oral defense of the proposal. During the oral defense, Dr. Michael Ewald, the second committee member, questioned the data collection method. Dr. Ewald openly shared personal, academic, and professional experiences of conducting thousands of surveys over numerous years. Recognizing Dr. Ewald’s expert knowledge, Jones gleaned invaluable information from the ensuing discussion. Jones knew the opportunity to overcome the obstacle was at hand. Dr. Ewald shared experiences of 3-5% response rates to Internet surveys and 20-30% response rates to mailed surveys. Jones changed direction, mailed the survey along with a personalized invitation to participate to the 348 target general managers, and attained the minimum sample size of 67 eight days after mailing. With the initial collection period set at 20 days after mailing, a total of 102 leaders participated, representing a 29% response rate. Although data collection closed after the 20-day period, an additional 42 surveys arrived over the next few weeks, bringing the total to 144 responses and a total response rate of 41%.
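The response-rate arithmetic in this account can be verified directly; the counts in the sketch below come from the text itself:

```python
mailed = 348         # personalized invitations sent to general managers
on_time = 102        # responses within the 20-day collection period
late = 42            # surveys arriving after data collection closed

initial_rate = on_time / mailed          # about 0.29, the 29% reported
total_rate = (on_time + late) / mailed   # about 0.41, i.e., 41% overall
```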
Doctoral research involves challenges. “Completing the capstone study of a doctoral degree is a demanding process, with plenty of obstacles. Maintaining a philosophy of never giving up regardless of the challenges, required revisions, or academic criticisms is essential” (Jones & Lentz, 2013, p. 68). Qualitative researchers face challenges, yet enjoy the benefit of interviewing or questioning approximately 20 participants to move forward with their studies. Quantitative researchers face additional challenges in calculating and attaining the minimum sample size. Jones (2013) realized success by remaining determined, yet flexible, in the pursuit of a doctorate. By listening, seeking expert advice, cooperating with doctoral committee members, and working diligently, future doctoral scholars can achieve the pinnacle of academic success.
Bass, B., & Avolio, B. (2004). Multifactor leadership questionnaire: Manual and sample set (3rd ed.). Redwood City, CA: Mind Garden.
Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41, 1149-1160. doi:10.3758/BRM.41.4.1149
Jones, R. C. (2013). Examining leadership styles and financial performance within rural electric cooperatives (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (UMI No. 3561539).
Jones, R. C., & Lentz, C. A. (2013). Researching close to home: Using refractive thinking to examine professional colleagues. In The Refractive Thinker®: An anthology of higher learning: Vol. II. Research methodology (pp. 57-74). Las Vegas, NV: The Refractive Thinker® Press.