RANKING SELECTED R1 UNIVERSITY DOCTORAL
QUANTITATIVE METHODOLOGY PROGRAMS
By
GABRIEL ATTAR
DISSERTATION
Submitted to the Graduate School of
Wayne State University,
Detroit, Michigan
in partial fulfillment of the requirements for the degree of
DOCTOR OF PHILOSOPHY
2021
MAJOR: EDUCATIONAL EVALUATION
AND RESEARCH
Approved By:
_________________________________________
Advisor Date
_________________________________________
_________________________________________
_________________________________________
© COPYRIGHT BY
GABRIEL ATTAR
2021
All Rights Reserved
DEDICATION
With God's help. Dedicated to my dear parents, Yefet and Yehudit Atuar, who came from Yemen to the Land of Israel in Operation "Magic Carpet," modest and good people of vision and faith. May their memory be blessed.
A debt of gratitude to my parents Yefet and Yehudit Atuar for their unyielding
perseverance during their most difficult time and for raising me and my four younger brothers with
fundamental values and principles, without which we would have been unprepared to overcome
the multitude of obstacles our life’s circumstances dictated we endure. Their memory is with me
all the time. May their memory be blessed.
To my Rabbi, Mentor, and Advisor
Rabbi, Dr. Shlomo S. Sawilowsky
Of blessed memory
ACKNOWLEDGMENTS
I wish to thank my loving and supportive wife, Marilyn, and my two wonderful daughters,
Michelle and Ashley, who provided unending support and inspiration.
I am especially indebted to my advisor, mentor, and teacher, Prof. Shlomo Sawilowsky,
who has instructed me with patience and gifted me his knowledge, wisdom, and support
throughout my dissertation endeavors. He provided many hours of academic advisement and was
continually there to direct me. His unwavering guidance gave me the encouragement to fulfill my
career goal of earning a Ph. D. For this, I would like to express my deepest gratitude, for without him, I would not have been able to finish what I started five years ago.
I would be remiss if I did not give my sincerest appreciation to my teachers who contributed to my knowledge, each in their own field, and who have shown me their commitment and endorsement as members of my committee. I would like to thank Prof. Barry Markman for effectively teaching
me the fundamentals of statistics, Dr. Sarah Raphan for showing me how critical a role measurement plays in statistics and research, and Dr. Kevin Carrol for teaching me to use
computer technology for conducting research. It has been an honor and pleasure to work with all
of them.
TABLE OF CONTENTS
Dedication .......................................................................................................... ii
Acknowledgments .............................................................................................. iii
Preface ................................................................................................................ iv
List of Tables ...................................................................................................... vi
List of Figures ..................................................................................................... vii
Chapter 1 Introduction ........................................................................................ 1
Chapter 2 Literature Review ............................................................................... 7
Chapter 3 Methodology Review ......................................................................... 33
Chapter 4 Results ................................................................................................ 55
Chapter 5 Discussion .......................................................................................... 75
Appendix A ......................................................................................................... 80
References ........................................................................................................... 102
Autobiographical Statement ............................................................................... 105
LIST OF TABLES
Table 1: Nobel Winners in Medicine/Physiology for 1997–2006 ………………...……. 22
Table 2: Names of 22 Other Universities ………………………………………..…….. 23
Table 3: Boston College Measurement, Evaluation, Statistics, and Assessment ..………... 33
Table 4: Brigham Young Educational Inquiry, Measurement and Evaluation……………………. 34
Table 5: Claremont Graduate University Evaluation & Applied Research Methods……… 34
Table 6: Columbia Educational Measurement, Evaluation, & Statistics………………..…. 35
Table 7: Florida State University Measurement and Statistics…………………………….. 35
Table 8: George Mason University Research Methodology……………………………….. 36
Table 9: Kent State University Evaluation and Measurement………………………....... 36
Table 10: Ohio State Quantitative Research, Evaluation and Measurement………………. 37
Table 11: University of California, Berkeley Social Research Methodologies..……….. 37
Table 12: University of North Carolina Greensboro Educational Research Methodology…. 38
Table 13: University of Boulder Colorado Research & Evaluation Methodology………… 38
Table 14: University of Connecticut Research Methods, Measurement, and Evaluation. 39
Table 15: University of Florida Research and Evaluation Methodology…………………… 39
Table 16: University of Illinois Chicago Measurement, Evaluation, Statistics, and
Assessment………………...………………………………………………….… 40
Table 17: University of Illinois Urbana Quantitative and Qualitative Methodology,
Measurement, and Evaluation…………………………………..……………. 40
Table 18: University of Iowa Educational Measurement and Statistics…………………… 41
Table 19: University of Kentucky Quantitative and Psychometric Methods……………… 41
Table 20: University of Tennessee Evaluation, Statistics & Measurement……………... 42
Table 21: Virginia Research, Statistics & Evaluation……………………………………... 42
Table 22: University of Washington Measurement & Statistics………………………….. 43
Table 23: Washington State University Educational Psychology…………………………. 43
Table 24: Wayne State University Education Evaluation & Research…………………….. 44
Table 25: Western Michigan University Educational Leadership, Research and
Technology……………………………………………………………………… 44
Table 26: Mean h-index, g-index, Number of Citations and Publications for 23
Universities……………………………………………………………………… 58
Table 27: Median h-index, g-index, Number of Citations and Publications for 23
Universities……………………………………………………………………… 59
Table 28: Mean h-index for 23 Universities……………………………………………….. 59
Table 29: Mean g-index for 23 Universities……………………………………………….. 60
Table 30: Mean Number of Citations for 23 Universities………...……………………….. 61
Table 31: Mean Number of Publications for 23 Universities………………………….. 61
Table 32: Median h-index for 23 Universities…………………………………………….. 62
Table 33: Median g-index for 23 Universities…………………………………………….. 63
Table 34: Median Number of Citations for 23 Universities……………………………….. 63
Table 35: Median Number of Publications for 23 Universities………………………... 64
Table 36: The Standardized Mean h-index for 23 Universities……………………………. 65
Table 37: The Standardized Mean g-index for 23 Universities……………………………. 65
Table 38: The Standardized Mean Citations for 23 Universities………………………... 66
Table 39: The Standardized Mean Publications for 23 Universities………...………..…… 67
Table 40: The Standardized Median h-index for 23 Universities……………………….. 67
Table 41: The Standardized Median g-index for 23 Universities……………………..…… 68
Table 42: The Standardized Median Citations for 23 Universities…………………………68
Table 43: The Standardized Median publications for 23 Universities………………..…… 69
Table 44: Overall Standardized Mean……………………………………………….……… 70
Table 45: Overall Standardized Median……………………………………………..……… 70
Table 46. Raw, Inverse, and Normal Scores of Faculty Size…………………………….. 72
Table 47: Final Standardized Ranking Based on the Median……………………………… 72
Table 48: Final Standardized Ranking Based on the Mean……...………………………… 73
Table 49: Mean h-index, g-index, Number of Citations, and Publications for 23
Universities……………………………………………………………………… 77
Table 50: Mean Standardized Ranking of h-index, g-index, Number of Citations,
Number of Publications and Tenure/Tenure-Track Faculty..……………... 78
Table 51. Breakdown of Affiliation by Committee Member Position…………………….. 89
Table 52. EER Doctoral Dissertations, 1949-2019………………………………………. 91
Table 53. EER Advisor by Number of Dissertations………………………………………. 97
Table 54. EER Advisor by Doctoral Degree Type……………………………………… 97
LIST OF FIGURES
Figure 1. Cubic Curve Fit of the h-index to Number of Citations………………………….75
Figure 2. EER Dissertations by Year and Trend Line………………………………...…… 98
CHAPTER 1
Introduction
In 1881, the Detroit Normal Training College (DNTC) became the second college,
following the Detroit Medical College in 1868, of what was to be named Wayne University (WU)
in 1934 and renamed Wayne State University (WSU) in 1956. In 1923, the DNTC became a four-
year university and began granting undergraduate degrees in 1925. The master’s degree programs
began in 1930.
The events leading to the establishment of the doctoral program in Wayne State
University’s College of Education (COE) started with Dean Waldo E. Lessenger. Due to his
efforts, the faculty of the College of Education was authorized to develop a program leading to the
doctoral degree. It came to fruition in May 1944 under the leadership of WU President Warren E.
Bow (1942-1945). Because of this initiative, in 1945, Graduate Dean John Lee and the WU council
approved the Education Doctorate (Ed. D.) in Administration and Supervision and in Educational
Evaluation and Research (EER). In July 1946, WU President David D. Henry (1945-1952)
authorized the Ed. D. with a major in Administration and Supervision. A few months later, he
authorized the same for the discipline of Evaluation and Research.
In 1949, the first graduate in Evaluation and Research, Dr. Robert Jacobs, was awarded an
Ed. D. A second Ed. D. was then awarded in the same commencement to a student in
Administration and Supervision. According to the 1949 Commencement Bulletin, this was
followed with the awarding of three Ph. D.s to students from other colleges.
The signature pages of the EER Ed. D.s stated, "The degree of doctor of education in the department of education." This was changed in 1954 when Dr. Wilhelmine L. Haley received an Ed. D. It was the first dissertation to contain explicit language regarding EER: "the degree of doctor of education in the Department of Education with a major in Evaluation and Research."
The initial EER faculty was composed of Profs. Reitz (who later held secondary appointments in Educational Psychology (EDP) and Teacher Education (TED)) and Charles L. Boye. Prof. Joseph W. Menge arrived in 1950 and was included among the EER faculty, but later
was assigned to Secondary School Administration and Supervision and to TED. He later served as
a College of Education Assistant Dean.
In 1953, under WU President Clarence B. Hilberry (1952-1965), additional doctoral majors
were authorized and included Audio Visual Education, Educational Sociology, Elementary School
Administration and Supervision, Curriculum Development, Guidance and Counseling, History and
Philosophy of Education, Industrial Education, Secondary School Administration and Supervision,
and Teacher Education. A petition for Ed. D. in Special Ed was approved in 1945, but this program
area was delayed until 1954. In September 1957, the faculty of COE supervised the awarding of
107 Ed. D.s. By 1960, Evaluation and Research was the second most frequently awarded Ed. D.
in the COE.
Supervision of the first decade of the COE Ed. D. program was provided by Profs. W. Ray
Smittle (Educational Administration), followed by Harold Soderquist (Philosophy and History of
Education), and then Assistant Dean Menge. They also served on many EER dissertations as committee members. A club for COE doctoral students was formed, led by members of the COE graduate student body and faculty.
EER
Irwin (1960) stated, "evaluation and appraisal keep programs alive" (p. 1). Although she became the first female faculty member in EER, previously she was an EER doctoral student. A major point in her dissertation was to evaluate some aspects of the doctoral program at Wayne State University, such as "admission criteria and procedures, courses of study, the examination system, the Non-Education and research techniques requirements, graduates' assessments in terms of professional, academic, and personal experience as well as post degree growth; the assessments and projections of graduate faculty and administrative officers; and, the quality of completed dissertation projects" (p. 1).
Many years later, another EER doctoral student, Ozkan (2008), focused on students' development and the need to change the current curriculum. Ozkan (2008) acknowledged there is a "growing belief in the statistical community that significant change must be realized in the methodology of statistics education" (p. 1). Ozkan (2008) explained statistics education is based on "developing knowledge and mastery" (p. 1), assuming that students will be able to grow an overall knowledge and understanding of the subject through the educational process. Similarly, Hogg (1991) argued progress must be made in statistical education because there is a sense that "students are not well prepared for their college-level science and mathematics courses" (p. 1). Ozkan (2008) also noted the importance of statistical consulting for research in government, industry, and universities. Statistical consulting is in demand in universities because many researchers lack the ability to develop more "efficient statistical educational methodologies" (p. 5).
A few years later, another EER doctoral student set as their dissertation purpose to "conduct a program evaluation of the Education Evaluation and Research program at Wayne State University in the College of Education" in order to answer whether its goals and objectives were being met, "to determine the efficiency of triangulating methods of evaluation" (p. 5), and to determine the psychometric properties of a Likert scale survey from Wayne State University's Student Evaluation of Teaching (SET) designed to measure doctoral students' perspectives of EER goals and objectives (White, 2015). White (2015) focused on the goals of the EER program and asked if they meet the expectations of a higher education institution. He sought to determine the strengths and potential weaknesses of the program according to the faculty and the students. White (2015) also attempted to assess the extent to which graduates believed they were prepared for their careers. The data collection tool used was the WSU Student Evaluation of Teaching (SET). Because it was designed to be used by a class of students with regard to their teacher, White (2015) conducted an exploratory factor analysis to determine whether the SET could be repurposed as a tool for reliably and validly determining students' impressions of the EER program area.
Most recently, another EER doctoral student, Carroll (2019), sought to determine whether the availability, role, and technical support of expanded uses of software, beginning with statistical programs but then expanding to database, word processing, graphics, etc., at WSU, an R1 institution, were sufficient to "perform research and administrative tasks" (p. 3). Carroll's (2019) study was restricted to the WSU COE's administration and faculty.
Purpose of Study
The historical development of the EER program, from its initiation as WSU's first Ed. D.,
to its growth with the Ph. D. and master’s program, was well documented (Irwin, 1960). The
importance of the EER doctoral program was established, in terms of its role in the COE, within
WSU, and with outside business and industry communities (Ozkan, 2008). The adoption of, and support for, statistical and related software among non-EER administrators and faculty was
investigated (Carroll, 2019). A program evaluation established the high level of satisfaction of
graduates regarding their EER doctoral experience (White, 2015).
The purpose of the current study is to expand beyond WSU and consider the ranking of
WSU’s EER program as compared with 22 universities designated by a former COE dean as being
comparable to WSU. A narrowly focused program evaluation will consider faculty productivity at
those universities in terms of various aspects of their scholarly publications, and will be compared
with the WSU EER faculty, primarily via the h-index. The next step is to develop a statistically
sound standardized ranking formula regarding the scholarship of the faculties of these universities.
The final aim of this study is to determine a good estimate of the coefficient "a" used in h' (an estimate of the h-index), specifically for EER-related faculty at the 23 universities.
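To illustrate the standardized ranking step described above, the following is a minimal sketch that standardizes several scholarship indicators across universities and averages them into one overall rank. The university names and values are hypothetical, and treating "standardized" as z-score standardization is an assumption made for illustration only.

```python
# A minimal sketch of the standardized-ranking idea, with hypothetical data;
# "standardized" is assumed here to mean z-scores.
from statistics import mean, stdev

# Hypothetical mean h-index, g-index, citations, and publications per university.
universities = {
    "University A": (12.0, 20.0, 1500.0, 60.0),
    "University B": (9.0, 15.0, 900.0, 45.0),
    "University C": (15.0, 24.0, 2100.0, 75.0),
}

def z_scores(values):
    """Convert raw values to z-scores (mean 0, standard deviation 1)."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# Standardize each indicator across universities, then average the z-scores
# into one overall score per university and rank from highest to lowest.
names = list(universities)
columns = list(zip(*universities.values()))
standardized = list(zip(*(z_scores(col) for col in columns)))
overall = {n: mean(zs) for n, zs in zip(names, standardized)}

for rank, (name, score) in enumerate(sorted(overall.items(), key=lambda kv: -kv[1]), 1):
    print(rank, name, round(score, 2))
```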
Significance of Study
There is an interest in ranking universities locally, nationally, and globally. According to Ioannidis (2007), "The evaluation of the performance of universities and institutions is an attractive concept if an evaluation is done objectively and accurately" (p. 2). Evaluating and ranking universities will help institutions to attract funding, faculty, and students. It will help to prioritize environmental, educational, medical, and business research. Van Raan (2005) suggested that the growth in scientific research in recent years has "reinforced established academic institutions and created many new ones" (p. 1). Van Raan (2005) indicated that, at the same time, "the number and intensity of student and researcher exchange programs, international collaboration, and working stays outside their own country rapidly increased, intensified by the ever-growing worldwide mobility" (p. 1). These events have led to an "increase of accountability, evidence of quality and value for money, and all these developments led to an increasing competition for financial support and for the best students and researchers between universities within nations and worldwide" (p. 1). For these reasons, universities desire to be leading institutions in their country or even in the world.
Research Questions
1. How does the curriculum of the EER doctoral program at Wayne State University compare with twenty-two comparable universities?
2. What are the h-index, g-index, number of citations, and number of papers produced by the faculty in the EER or equivalent departments/programs at the twenty-three universities?
3. What is the standardized ranking that captures the hierarchical rank of the twenty-three EER or equivalent departments/programs?
4. What is the best estimate of a in $h' = a\sqrt{N_c}$, where h' is an estimate of the h-index and N_c is the number of citations, for EER or equivalent departments/programs at the twenty-three universities?
Assumptions
Some universities have a formal equivalent to the WSU College of Education’s EER
program area, whereas in other universities the program is either part of a different college, or is
not a stand-alone program area (meaning it is a part of another discipline). At some universities,
the equivalence to an EER program may be a loose collection of faculty members whose home
base varies (e.g., faculty members may be based in a psychology, educational administration,
counseling education, and other program areas). Therefore, it is assumed that the soft definition of
an EER program is nevertheless comparable to the formalized EER program at WSU.
Limitations
This study will be restricted to the EER program at WSU, and the twenty-two other
comparable programs at universities identified by former Dean Douglas Whitman in 2017. The
list is neither a nationally representative sample nor are all the universities designated as R1
institutions. A list of R1 and R2 designated universities is presented in Chapter 2.
As will be explicated in Chapter 2, the h-indices will not be verified by confirmation of the
faculty member, nor by a comparison with their respective c.v.’s. Therefore, it is likely there will
be a certain amount of error in the h-indices. Sawilowsky (2012) illustrated a common problem
with determining an h-index without faculty collaboration: “Searches for author last names that
are common, transliterated, misspelled, contain diacritical marks, or changed when married may
be problematic” (p. 88).
Definitions
H-index is the number of papers (H) that have each been cited at least H times. For instance, an H-index of x means that x papers are each cited at least x times by other research papers.
R1: Doctoral Universities (very high research activity)
R2: Doctoral Universities (high research activity)
CHAPTER 2
Literature Review
History of EER Service Courses
All Ed. D. programs at Wayne State University’s College of Education, from their
inception, required a minimum of 12 credit hours in methodology. At least half of this
coursework was in EER’s (a) research and experimental design, (b) statistics courses, or (c)
credit by examination. The remaining credit hours pertained to specialized techniques in non-
EER courses taught in the major area.
Originally (as developed beginning in the 1940s), all Ed. D. programs at Wayne State
University’s College of Education required 12 credit hours in methodology. At least half of the
courses were in EER research and design, or statistics. The other half were created in the various
program areas, or any of those courses could be replaced via examination.
In the 1990s, in response to the Provost, Dean Paula C. Wood created the Doctoral
Academic Standards Committee (DASC). Among its charges were to recommend the type of
methodology courses and the number of credit hours required for the Ed. D. and the Ph. D. degrees.
The recommendation, duly adopted by the COE faculty, was 12 and 15 EER credit hours, respectively. However, due to an EER curriculum revision in the 2000s, the Ph. D. minimum credit hours was changed to 14-15.
The EER’s role was (a) to determine which EER courses met the general methodology
requirements (which were most EER quantitative and all EER qualitative courses) and (b) to
provide technical expertise to the COE faculty, and DASC oversight, if specialized techniques in
courses taught by the major would be substituted for EER courses. However, several non-EER
COE courses, duplicative of EER coursework, were grandfathered in. Although the faculty of
every COE program area had the freedom to develop courses for their major via the Curriculum Committee process, the program faculty were required to obtain EER oversight and DASC approval in order for a course to count as a methodology course toward the research requirement.
As the COE administrations (i.e., Deans Ilmer, Shields, and Whitman from 2011-2019)
and their faculty changed, the DASC was replaced by the Doctoral Advisory Committee. The
number of duplicative EER courses in the COE doubled, and then tripled, with most side-stepping
EER oversight. In an email exchange with Prof. Shlomo Sawilowsky in May 2016, Dean R.
Douglas Whitman terminated EER’s role of providing oversight for non-EER methodology course
development, as well as any responsibility for student learning outcomes as it related to
methodology in the dissertation (Sawilowsky, personal correspondence). This essentially returned
the role of methodology in the COE doctoral programs to its original formulation.
Philosophical Doctorate (Ph. D.) in EER
In 1953, the Ph. D. program was approved in Educational Psychology. In 1956, the
Educational Psychology (EDP) faculty participated in a joint Clinical Psychology Ph. D. program
in collaboration with the faculties of the College of Liberal Arts, Business Administration, and the
College of Medicine. In January 1956, Prof. Reitz met with Prof. Jacob Kounin (Ph. D., State
University of Iowa, who joined the EDP faculty in 1946) in an effort to model the rationale for the
Ph.D. in EER after the Ph. D. program in EDP. Also attending this meeting was Prof. Charlotte
Junge (Mathematics Education). This effort culminated in the 1958 petition signed by Prof. Reitz
for a Ph. D. in Evaluation and Research, which was approved by College of Education Dean
Francis L. Rosecrance, who in turn submitted it to the WSU Graduate Council for approval.
The first Ph. D. in EER was awarded to Ruth H. Sprague in 1963 (“Learning difficulties of
first grade children diagnosed by the Frostig Visual Perceptual Tests: A factor analytic study”),
and was chaired by Prof. Reitz. The signature page contains the language “Doctor of Philosophy
in Education, 1963, Major: Evaluation and Research.”
EER was joined in the 1958 petition for the Ph. D. degree by Profs. Charlotte Junge and
John J. Lee (Ph. D., The Ohio State University, who joined the faculty in 1936 with an appointment
in Special Education). The three signatories, in pursuit of an expansion of the Ph. D. to other major
disciplines in the COE, represented faculty in Administration and Supervision, Audio-Visual
Education, College Teaching (Humanities, Physical Sciences, & Social Sciences), Curriculum
Development, Educational Sociology, Elementary School Administration and Supervision,
Guidance and Counseling, History and Philosophy of Education, Industrial Education, Secondary
School Administration and Supervision, Special Education, and Teacher Education.
Among the petition's supporting documents, the College of Education's faculty submitted
a rationale entitled “Essential Differences Between Proposed Ph. D. and Ed. D. Degrees.” It stated,
in part,
The Ed. D. degree…was designed primarily as an advanced professional leadership
degree. The central aim of this program has been, and is, advanced training for
leadership in teaching, supervision, and administration. For this reason, the Ed. D.
has rightly been termed a “practitioner’s” degree. The proposed Ph. D. program
will place heavier emphasis on research and on preparation for college teaching in
the specialized fields of education. The respective features which most sharply
distinguish the two degrees may be found in (a) differences in the research
techniques required, and (b) types of dissertation research projects undertaken
(online).
In the 1990s, the COE faculty, and the members of the DASC, were tasked by the Provost to
provide definitional distinctions between the Ed. D. and Ph. D. In general, with the exception of
the EER faculty, the COE faculty declared it was an eternal mystery, viewing the two doctoral
degrees as hopelessly intertwined. In contradistinction, the EER faculty (first documented in the 1996 Version 1 of the EER Brochure and through the 2015 Version 17) differentiated between the Ed. D. and Ph. D., as indicated in the EER Student Handbook (Sawilowsky, 2018):
Policy Statement on Doctoral Dissertations. The Ph.D. requires a dissertation which
makes an original contribution to the literature on quantitative or qualitative
research design, applied statistics/data analysis,
measurement/testing/psychometrics, or program evaluation. Therefore, the Ph.D.
dissertation conforms to the rigors of scientific inquiry on theoretical issues, with
simulation or empirical demonstrations for illustrative purposes. The Ed. D. is the
practitioner’s highest degree. Based on a needs analysis, the Ed. D. dissertation
centers on field or applied studies, such as the determination of best practices (p.
14).
Historical Details on EER
Compiled in the Appendix is a list of various aspects of the EER doctoral program, including:
EER faculty who currently, or previously, chaired an EER doctoral (Ed. D. or Ph.
D.) dissertation. The professional rank given is either the current or final rank.
COE Faculty who taught/teach in EER and serve(d) on EER dissertation committees
Former EER faculty, including those who served on EER dissertation committees
(Tenure/Track, Clinical Assistant Professor)
EER Adjunct Instructors who serve(d) on EER dissertation committees
EER Graduates who have served on EER dissertation committees
EER dissertation committee membership, including a table Breakdown of
Affiliation by Committee Member Position
EER Doctoral Dissertations, with tables on "EER Doctoral Dissertations, 1949-2019", "EER Advisor by Number of Dissertations", and "EER Advisor by Doctoral Degree Type", and a figure with a regression line superimposed on the "Number of EER Dissertations per Year from 1949-2019."
Previous Reviews of EER, COE, and WSU by EER Doctoral Students
Among the areas of discussion in the 231 dissertations to date, four were based on program
evaluations of the EER program and on EER-type competencies in the university/college. These
dissertations were by Irwin (1960), Ozkan (2008), White (2015), and Carroll (2019). A brief
synopsis of each of these four EER dissertations follows.
Irwin (1960)
Irwin (1960) conducted an evaluation to determine the strengths and weaknesses of the
Doctor of Education (Ed. D.) program in EER at Wayne State University. It was suggested an examination of the admission procedures would be beneficial, as well as of the course of study, the examination system, students' academic and post-degree growth, and the quality of the written dissertations. In order to determine the effect of program admission, the academic curriculum, and students' growth, Irwin (1960) analyzed official transcripts of program graduates, questionnaire responses, recorded faculty interviews, and graduates' quantitative and qualitative dissertations.
The objective was to find evidence that would help to promote and implement
improvement in the program and in higher education institutions. The secondary purpose was to present an historical account of the development of the Doctor of Education program at Wayne State University and a description of the program as described by the University. Irwin (1960) also stated that an effort to describe the larger national and international picture of the Doctor of Education degree applied to the study.
The transition from local to state control at WSU took place under the direction of Dr.
Lloyd Allen, Vice President of the Graduate School, and then WSU President Clarence B.
Hilberry. At that time, a committee was established to pursue this and one of the areas of study, as
Irwin (1960) noted, was the "Evaluation of the Master and Doctoral programs" (p. 4). It was clear in the 1960s that a review of the doctoral program was necessary because of the new status of Wayne State University as a state higher education institution and the need to provide the latest training to its students.
According to Nelson (1953), the differences between the Ph. D. and Ed. D. degrees were not significant. However, the consensus of the faculty at that time was that the Ph. D. was more rightly oriented as an academic and research degree, whereas the Ed. D. was more specialized and intended for individuals interested in pursuing positions of educational leadership (Irwin, 1960).
Irwin (1960) noted that Dean Waldo E. Lessenger was instrumental in pursuing the degrees of Ed. D. in Education and in Evaluation and Research. Dean Lessenger formed a committee of external and internal consultants and invited them to survey the facility and to advise and recommend on matters related to the Doctoral Degree in Education and Evaluation and Research. On June 5, 1945, the Graduate Council approved the request unanimously. The recommendation was authorized by WSU President Henry on July 18, 1946, and the program was to start in September 1946.
Irwin’s (1960) evaluation excluded students who withdrew during the program or students
who were still involved in the process of dissertation writing. Additionally, budgetary and cost
issues within the program were not considered in the evaluation.
There were three assumptions made by Irwin (1960). First, individuals who actively participated in the Doctor of Education program (faculty, staff, and graduate students) were qualified to judge the strengths and weaknesses of the program. Second, the information and the
data used to evaluate the strengths and weaknesses of the Doctor of Education program were
graduate transcripts. Third, evidence of growth and achievements of graduate students could be
evaluated by assessing the quality of completed dissertations within the Doctor of Education
program at Wayne State University. Irwin (1960) divided the analysis of the data into four parts: methodology of transcript analysis, methodology of recipients' questionnaire schedules, methodology of faculty interviews, and methodology of dissertation analysis.
Irwin (1960) inspected five transcripts and derived 39 variables. In addition, for the personal variable for each course, 22 other variables were located using the data at hand. Using the data, a coding system was developed that helped to initiate a pilot study for the analysis of ten additional transcripts and corrected some issues that were discovered. After developing the necessary codes, Irwin (1960) analyzed 135 transcripts, exclusively of Ed. D. recipients who had received the degree by November 1, 1958. Several issues were noticed, such as incomplete information, questionable information, and a lack of organization of the permanent files from which the data were selected. An effort was made to collect more reliable information; however, in some cases, this was difficult.
Irwin (1960) identified some concerns that were important to the doctoral degree recipients. They were developed into a questionnaire in which respondents were permitted to contribute some of their own ideas. Irwin (1960) observed three concerns related to the goals of the program. First, how the study was facilitated and whether or not the graduates were given opportunities to share their knowledge and skills during the program. Second, the credit used to evaluate the doctoral students, such as coursework and qualifying exams. Third, study aids and assistance, such as advisers and physical facilities. These queries were then examined again and were given to faculty members for review. The questionnaire was designed to help the participants check their answers and reply to open-ended questions.
Faculty member interviews were conducted based on a designed questionnaire. Its purpose was to assess the process of admission, the course of study, the dissertation research projects, the examination process, and the general estimate of the program's value. The faculty consisted of administrators of the graduate school, members of the graduate faculty of the College of Education who had earned their degree from Wayne State University, and regular, full-time graduate faculty who sponsored one or more doctoral candidates through their oral qualifying examination. Irwin (1960) interviewed 41 faculty members, who were not given the questions beforehand. Analyses of the faculty members' responses were related to the following: sex, type of degree held, professorial rank, institution from which the highest rank was achieved, years of service in the College of Education, and years of dissertation direction at Wayne State University.
Irwin (1960) developed a procedure to collect and evaluate graduate dissertations, with an instrument designed to measure the quality of the dissertations of doctoral students who received their degree by November 1, 1958. Out of 136 dissertations, only 100 were available at the time of the inquiry. Two types of dissertation characteristics were examined, qualitative and quantitative. The qualitative characteristics were assessed on three points: whether the problem was of theoretical and practical significance, the procedures for the selection of methodology, and the treatment and presentation of the data, which checked the use of effective writing techniques. Quantitative properties were evaluated for the following: number of pages, words, figures, tables, etc. As a first step in preparing the instrument, 35 doctoral students were each required to read a dissertation and to evaluate it by responding to it and writing their ideas. About 100 dissertations were evaluated by doctoral students. Means and standard deviations were calculated for 31 qualitative questions, and reliability coefficients were computed for the qualitative and quantitative items based on 27 dissertations.
Irwin (1960) provided findings based on four topics selected to analyze the doctoral program at Wayne State University. These included "Official graduate transcripts of degree graduates, questionnaire schedules sent to degree recipients, recorded faculty interviews and rating procedures employed to judge quantitative and qualitative characteristics of dissertations" (p. 558). According to Irwin (1960), the purpose of this investigation was to explore the "nature, structure, purpose, and operation of Wayne State's doctoral program in order to locate quality clues which would clarify and provide a better basis for understanding through evaluating means" (p. 559).
Irwin (1960) intended to find points that eventually would lead to improvement of the program.
Most of the graduates earned their baccalaureate degree at Wayne State University, which
indicates the University assisted the local population as a public institution. A large percentage of
the undergraduates completed their degree at out-of-state institutions, which contributed to the
diverse views of the Doctoral Program in the College of Education. The average graduate GPA
was 3.47 between earning a master’s degree and the time of application for the Doctoral program.
In general, the results were positive with some differences between the 1956 group and the 1958-
1959 group. The doctoral program at Wayne State University was described as “outstanding” over
other institutions. In another statement, Irwin (1960) described the comments about the Doctoral
program at Wayne State as "entirely worth the time, effort and expenditure of money" (p. 579).
Irwin (1960) recommended the examination of all procedures as well as of the quality of the final exams for graduates in the College of Education's doctoral program. Another conclusion concerned the way other students and departments at Wayne State University perceived the Ed. D. and the Ph. D. programs in the College of Education, as compared with Liberal Arts' parallel degrees. Irwin (1960) described the views and attitudes of people toward the degrees earned at the College of Education as significantly less favorable and asked for an increase in the level of quality professional training in order to create a greater level of respectability for the doctoral program.
Irwin's (1960) recommendations were implemented over the years. Some of them include informing graduate students of job opportunities, observing graduate students in their written exams, and making explicit distinctions between the Ed. D. and Ph. D. dissertations in the EER program.
Ozkan (2008)
Ozkan (2008) was concerned with the proper education and statistical training of pre-
college students as well as higher education preparation. It was hypothesized a gap exists between
statistical and methodological skills students obtain and their subsequent performance in a
research effort. Ozkan (2008) suggested researchers have difficulties determining which strategy
and methodology to use and what statistical tools should be used in research. Students should have
statistical knowledge to face real life problems.
Ozkan (2008) also considered the role and the function of the statistical consulting center
at WSU. The responsibility of a statistical consulting center is to provide assistance in research
planning such as finding the right tools for a specific research, gathering the data, analysis of the
data, and suggestions for which software program to use and more. The statistical consultant must
be able to provide the necessary statistical assistance to various departments in the higher education
environment. Therefore, the education of statistics students must change to fit the requirements that they face in real life today.
White (2015)
White's (2015) purpose was to "conduct a program evaluation of the Education Evaluation and Research program at Wayne State University in the College of Education" in order to find out whether its goals and objectives were being met (p. 5), based on Wayne State University's Student Evaluation of Teaching (SET). He also assessed the ability of the program to produce desired or intended results by "interviewing the faculty and surveying the students in the program for their opinions as to whether they believe that they are prepared for their career" (White, 2015, p. 6).
White's (2015) intent was to find out "to what degree blended methods can be successful when applied to a program evaluation of a university doctoral program" (p. 6). Stufflebeam (2001) indicated that the "use of both quantitative and qualitative methods is intended to ensure dependable feedback on a wide range of questions; depth of understanding particular programs; a holistic perspective; and enhancement of the validity, reliability, and usefulness of the full set of findings" (p. 40). He noted, "The focus of any evaluation is either formative or summative or a blended version of both" (White, 2015, p. 18). A formative evaluation is conducted during the time that the program is taking place, while a summative evaluation is required when assessing the efficacy of the program as it conveys particular goals. According to White (2015), a mix of formative and summative evaluation is used when there is a need to make a decision regarding the program; however, in the case of the EER evaluation, the intention was to define the goals and objectives of the program. Consequently, the summative evaluation was sufficient.
White's (2015) evaluation "encompassed [a] data driven approach" (p. 35), which was required in order to analyze qualitative data. There were "four types of analysis that were
conducted with the data from the interviews” (Spradley, 1980): domain analysis, taxonomic
analysis, componential analysis, and thematic analysis. There were two separate interviews that
were conducted with two full-time faculty members.
White (2015) conducted two faculty interviews in the EER program at WSU. One professor indicated that the goal was to "provide my students with quantitative tools and to enable them to do research in multiple areas" (p. 105). The other professor added that a major goal was to "produce quantitative and qualitative methodologists in and outside the discipline of education" (p. 105). Based on the interviews of the two professors, it was evident there were concerns regarding the viability of the EER program during the administrations of the two College of Education Deans, concluding with the Spring/Summer semester of 2018. The faculty interviews led to the conclusion that those administrators did not value the EER program, its faculty, its curriculum, or the achievements of its graduates.
Nevertheless, the faculty's scholarly and teaching accomplishments were paramount in the college during those two administrations (e.g., two EER faculty members swept the college teaching and scholarship awards in 2017), based on superior quality and quantity of publications, presentations, and grants; and program graduates held an enormous footprint in the field (e.g., the collective h-index of graduates was 37.7), with about 20% holding faculty positions.
Student responses to White’s (2015) survey indicated the grading was considered “fair and
adequate” (p. 110), instructors provided helpful feedback, and the curriculum was highly rated in
terms of preparation for their careers. Based on this information, White (2015) concluded that the EER program was "successful in producing scholars who are capable of publishing research" (p. 112).
Carroll (2019)
Carroll (2019) suggested that there is a need to examine the importance of technology in
the higher education environment. Although he found considerable literature on the benefits of
computer usage by students in graduate programs, it was unclear how R1 faculty employ
computers and software in teaching and scholarship, particularly with regard to selection,
installation, and obtaining technical support. Carroll’s (2019) survey provided a greater
understanding of the cognitive process professors invoke in making computer and software
choices, particularly with regard to the rigors expected at R1 universities. Surprisingly, a large percentage of the faculty did not require computer and software services for online course development, but that finding was likely attributable to the lack of a college-based policy at that time on hybrid and online course degree programs and individual course offerings.
University Ratings
Obama Administration
President Obama (2013) disapproved of the rising cost of college tuition. In a speech to
students at the University of Buffalo, he noted the cost of tuition left many in the middle class with few options for affording higher education. From a consumer's perspective, he opined the
sparse dollars a family had available for higher education would be maximized if there was a
national, universal university ranking system, which would inform parents and students by
comparing universities and colleges in terms of economic efficiency. Moreover, such a ranking
system would be used in the allocation process of federal student loans and grants.
Obama’s (2013) suggestion was opposed by many university presidents and their faculties,
as well as the American Federation of Teachers and politicians. Based on the experience of the
former, it was noted formulas pertaining to tuition and need-based aid are complicated, as are many less tangible albeit economic conditions necessary to complete a four-year curriculum. The latter claimed imposing federal standards on higher learning institutions would interfere with the private sector's innovativeness and creativeness. There were many flaws in the rating system. For example, one key marker of success was whether a college's graduates earned at least twice the federally set poverty line, which would adversely impact the ratings of religious institutions and military schools. It would adversely impact community colleges with inclusive open-access goals. Lomax (2015), a former president of the historically black college Dillard University, and president of the United Negro College Fund, opined the ratings would be racist (as did many others), in that they would favor Harvard, which has "a $36 billion endowment and enrolls academically elite students," whereas "Dillard has a $49 million endowment and enrolls many students who are not as academically prepared for college as their more advantaged peers." President Obama abandoned the rating system two years later.
U.S. News & World Report College Rankings
US News and World Report (https://www.usnews.com/education/best-graduate-
schools/articles/how-us-news-calculated-the-rankings) is a commercial endeavor that sells a
product that ranks a variety of educational systems (e.g., high schools, undergraduate colleges,
graduate schools). Their ranking system is based on annual surveys sent to over 2,000 university
programs, with over 22,000 individual responses. Curriculum, student-faculty ratio, job placement,
GRE Scores, and similar factors are queried in business, education, engineering, law, medicine,
and nursing. After proprietary weighting, the survey responses are used to produce a standardized
value, where all rankings are keyed to the top scoring program which is assigned a value of 100.
The rankings are presented with tied values reported alphabetically.
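As a small illustration of the keyed-to-100 presentation just described, the following sketch rescales hypothetical weighted survey scores so that the top program receives a value of 100 and ties are listed alphabetically; the program names and scores are made up for illustration.

```python
# Illustration only: key composite scores to the top program (assigned 100).
programs = {"Program A": 4.31, "Program B": 3.87, "Program C": 3.87, "Program D": 2.95}

top = max(programs.values())
scaled = {name: round(100 * score / top) for name, score in programs.items()}

# Ties share a value and are listed alphabetically, mirroring the published tables.
for name, value in sorted(scaled.items(), key=lambda kv: (-kv[1], kv[0])):
    print(name, value)
```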
There were a variety of limitations to their survey approach. For example, McGaghie and
Thompson (2001) noted the survey’s “(1) narrow focus, (2) inadequacy of response rates, (3)
measurement error, (4) unchanging stability of results, and (5) [there are] confounding [variables]”
(p. 987). Although the surveys were modified frequently over the years, a common theme is that the underlying notion of ranking appears to be geared to ensure that Ivy League (and similar) schools remain premier institutions by weighting heavily, for example, the amount of alumni donations. Similarly, a heavily weighted factor was the size of the endowment. Harvard's and Yale's endowments, with over $35 and $25 billion respectively, forced their top-tier rankings due to the economic acumen of their endowment directors, which is a factor largely irrelevant to scholarship.
Similarly, the number of Harvard-trained professors at the University of Michigan, for example, exceeds the number of Harvard-trained professors at Harvard. It is true that award-winning work, such as that recognized by the Nobel Prize and similar field awards, measures the quality of research. Nevertheless, it is uncertain why universities with Nobel Prize winners and field awardees would be those that provide the best education. Shown in Table 1 are the Nobel Prize winners between the years 1997-2006, indicating where they did their award-winning work and where they were working when they received the award. According to Ioannidis et al. (2007), out of 22 universities only 7 faculty members did their groundbreaking work at the same university with which they were affiliated when they received the award; therefore, "this measurement addresses the ability of institutions to attract prestigious awardees rather than being the site where groundbreaking work is performed. Finally, most institutions have no such awardees. Thus, such criteria can rank only a few institutions" (p. 3).
Table 1. Nobel winners in Medicine/Physiology for 1997-2006: affiliation at the time they did the award-winning work and at the time they were given the Nobel Prize

Name         | Year | Affiliation (Nobel work)              | Affiliation (Nobel award)
Fire AZ      | 2006 | Carnegie Institute, Washington        | Stanford University
Mello CC     | 2006 | University of Massachusetts           | Same
Marshall BJ  | 2005 | Royal Perth Hospital, Australia       | University of Western Australia, Nedlands
Warren JR    | 2005 | Royal Perth Hospital, Australia       | Perth, Australia
Axel R       | 2004 | Columbia University                   | Same
Buck LB      | 2004 | Columbia University                   | Fred Hutchinson Cancer Research Center
Lauterbur PC | 2003 | SUNY Stony Brook                      | University of Illinois
Mansfield P  | 2003 | University of Nottingham              | Same
Brenner S    | 2002 | MRC Molecular Biology Unit, Cambridge | Molecular Science Institute, Berkeley
Horvitz HR   | 2002 | Cambridge University                  | MIT
Sulston JE   | 2002 | MRC Molecular Biology Unit, Cambridge | Sanger Institute, Cambridge
Hartwell LH  | 2001 | Cal Tech                              | Fred Hutchinson Cancer Research Center
Hunt RT      | 2001 | Cambridge University                  | Imperial Cancer Research Fund, London
Nurse PM     | 2001 | University of Edinburgh               | Same
Carlsson A   | 2000 | University of Lund                    | Göteborg University
Greengard P  | 2000 | Yale University                       | Rockefeller University
Kandel ER    | 2000 | Columbia University                   | Same
Blobel G     | 1999 | Rockefeller University                | Same
Furchgott RF | 1998 | SUNY, Brooklyn                        | Same
Ignarro LJ   | 1998 | Tulane University                     | UCLA
Murad F      | 1998 | University of Virginia                | University of Texas
Prusiner SB  | 1997 | UCSF                                  | Same
Carnegie Mellon
Andrew Carnegie was an immigrant from Scotland who settled in Pittsburgh, PA, in 1848. Carnegie established the world's largest steel-producing company by the end of the 19th century. In 1900, he donated $1 million for building a technical institute for the city of Pittsburgh, predicting that the school would help working-class men and women. The Carnegie school offered two- and three-year certificate programs.
Throughout the 20th century, particularly after World War II, Carnegie Tech went through a dramatic change. In 1956, the arrival of the first computer set the foundation for a university culture in which information technology ensued in almost all fields of study. In 1967, Carnegie merged with the Mellon Institute, a science research center founded by the Mellon family of Pittsburgh, to become Carnegie Mellon University. Carnegie Mellon University became a national and international leader in higher education. It has branches all over the world in places such as Qatar and Silicon Valley, California.
The Carnegie Commission on Higher Education published a Basic Classification description in 1973, and the original classification framework was updated and replaced several times between 1976 and 2018. According to the most recent update (2018), the Doctoral Universities category was modified to better accommodate doctor's degree professional practice programs. The Doctoral Universities category includes institutions that awarded at least 20 research/scholarship doctoral degrees, or at least 30 professional practice doctoral degrees in at least two programs, excluding Special Focus Institutions and Tribal Colleges. The R1 and R2 categories only included institutions that awarded at least 20 research/scholarship doctoral degrees and had above $5 million in total research expenditures (per the Higher Education Research & Development (HERD) Survey). The following are the university ranking definitions according to the Carnegie Classification: R1: Doctoral Universities (very high research activity), R2: Doctoral Universities (high research activity), and D/PU: Doctoral/Professional Universities. A list of Carnegie-classified R1 and R2 universities as of January 2020 is available at https://carnegieclassifications.iu.edu/classification_descriptions/basic.php.
Twenty-three Universities Included In This Study
There were 22 universities identified by former Dean Douglas Whitman as being
comparable to the WSU COE’s EER program. No methodology was given on how they were
determined. They are compiled in Table 2 below.
Table 2. Identification of 22 other Universities Apart from WSU

1. Boston College
2. Brigham Young
3. Claremont Graduate University
4. Columbia
5. Florida State University
6. George Mason University
7. Kent State
8. Ohio State University
9. University California Berkeley
10. University North Carolina Greensboro
11. University of Boulder Colorado
12. University of Connecticut
13. University of Florida
14. University of Illinois Chicago
15. University of Illinois Urbana
16. University of Iowa
17. University of Kentucky
18. University of Tennessee
19. University of Virginia
20. University of Washington
21. Washington State University
22. Wayne State University
23. Western Michigan University
h-index
The h-index is commonly used to characterize the scholarly productivity of university
faculty. Sawilowsky (2012) noted the h-index suffers from numerous limitations. Suppose a scholar has two publications. If they were each cited only once, the h-index is 1. If they were each cited twice, the h-index is 2. However, if one was cited once and the other was cited 1,000 times, the h-index remains at 1, because only one paper has at least two citations. The rapidity with which the h-index may be calculated via search engines (e.g., scholar.google.com) has contributed greatly to its use.
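For concreteness, the following is a short sketch of how the two indices reported in this study can be computed from a list of per-paper citation counts. The citation counts shown are hypothetical, and the g-index follows the standard definition (the largest g such that the g most-cited papers together have at least g squared citations), which is stated here as an assumption since the text above defines only the h-index.

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the g most-cited papers together have at least g^2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

if __name__ == "__main__":
    # Hypothetical citation counts for one faculty member's papers.
    papers = [1, 1000]
    print(h_index(papers))  # 1 -- the 1,000 excess citations do not raise h
    print(g_index(papers))  # 2 -- g gives credit for the highly cited paper
```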
However, there are a plethora of problems with this statistic, as Sawilowsky (2012) noted:
(1) Sometimes work is highly cited because it is wrong. (2) The number of publishing
outlets is related to the number of scholars in the field, favoring certain disciplines. (3)
There is no differentiation between exploration and explication. The same issue in
Psychological Bulletin that I published a new knowledge article that has been cited
160 times also contains a statistics primer for dummies by Jacob Cohen (1923-1998,
h-index = 62) that has been cited 8,547 times. (4) Credit is given in the index for a
citation even if it supports a position contrary to the publication. (5) These indices can
change extremely quickly. My Δh, defined as the number of additional citations of specific publications that will change my h-index from 19 to 20, is only 3 additional citations of the 20th most cited publication. (6) These
indices can change extremely slowly. Some editors prefer authors to cite recent,
secondary references to seminal work instead of the original, not only because it makes
the literature review look fresher, but as time passes it becomes difficult to access
seminal work. (These are different reasons from that invoked by Wikipedia, which
relies on secondary sources to enable equal participation of editors who are completely
devoid of any substantive knowledge in the field.) Also, well known methods are
rarely referenced, such as Karl Pearson’s Chi-Squared test, Student’s t-Test, or
Wilcoxon’s Rank-Sum test. (7) Disciplines where the scholarly outcomes are lengthy
treatises, qualitative, or juried exhibits or performances will never be equitably served
by formulae based on numbers. Scholarship in the form of plenary or keynote
addresses before scholarly societies and professional associations that are not
abstracted or subject to proceedings, scholarship serving as the basis for legislative
language, and expensive and extensive literature reviews found in technical reports
from federally funded peer reviewed grants (e.g., the United States Department of
Education, National Science Foundation, National Institutes of Health) will not be
captured by these indices. Although the software programs listed above permit
searching for patents and post non-peer law review publications that are eventually
cited in judicial decisions, these forms of scholarship are generally not cited with the
same frequency as found in other disciplines. There are additional problems if the
index is based on a quick and cheap Google Scholar search. (1) Google Scholar doesn’t
differentiate between peer and non-peer reviewed publications. It includes citations
from self-published books and editorials. (2) Publications not on the internet cannot
be found. Sometimes, even if they are on the internet, they are inaccessible because
they require membership login, don’t use Google Scholar’s required html <meta>
commands, or exceed Google’s five megabyte per document limitation. (3)
Posthumous re-publication causes inflation. For example, Pearson's “Tables of the
Incomplete Beta-Function” was republished 29 years after his death and has 615
citations, and “The life, letters, and labours of Francis Galton” was republished last
year and already has 91 citations. Google Scholar treats these posthumous re-releases
of his earlier work as new publications. (4) Searches are often not replicable, because
results are based on a random set of 1,000 hits. Google will (at least temporarily)
suspend privileges if too many searches are conducted within a short timeframe
exacerbated by not publicly disclosing (a) the maximum number of searches that
may be conducted (b) within what timeframe that will trigger a suspension and (c) for
how long the suspension will remain in effect for a given ip address. (5) Searches for
author last names that are common, transliterated, misspelled, contain diacritical
marks, or changed when married may be problematic. (pp. 87-88)
Sawilowsky’s S-Index
Sawilowsky (2012) developed a comprehensive scholarly index, based on complements
and adjustments to the core h-index. The first component weights the author's h-index, $h_A$, by
the h-index of each publication outlet, $h_J$, and by the rank, $R$, of the publication (Equation 1).
This yields a PW-index, or publications weighted index. To handle the excess citations ignored in
computing the h-index, two new components were added:
an excess-citations component, the $P_{EC}$-index, based on $N_C^h$ and $N_C^{h_{MIN}}$ (Equation 2), and an
excess-publications component, the $P_{EP}$-index, based on the publications beyond those counted in
the h-index, $N_P - h$ (Equation 3), where $N_C^h$ is the number of citations for the articles used to
compute the h-index, $N_C^{h_{MIN}}$ is the corresponding minimum, and $N_P$ is the total number of
publications. Additional faculty-based components include the number of
doctoral students, number of co-advising/second advising, minor advising, doctoral committee
member advising, and post-doc advising:
$S_1 = D$, (4)

where $D$ is the number of doctoral dissertations,

$S_2 = (D_{Co} + D_2)/2$, (5)

where $D_{Co}$ and $D_2$ refer to the number of doctoral students co-advised and second advised,
respectively,

$S_3 = D_M/3$, (6)

where $D_M$ refers to the number of students served as the minor advisor,

$S_4$ (Equation 7), a partial-credit term for committee service, where $D_O$ represents the number of
students served as an ordinary committee member, and

$S_5 = D_P/3$, (8)

where $D_P$ refers to the number of postdoctoral students advised.
An additional component is the contribution of the scholar’s doctoral students’
publications,
a DS-index (Equation 9) that combines $N_D$, $N_{Co}$, $N_2$, $N_M$, and $N_P$,
where $N$ refers to the number of their publications within each advising category. For general
purposes, $a = 1$, based on

$N_c = a h^2$, (10)

where $N_c$ is the number of citations and $a$ is a scaling factor obtained through solving a power
function for empirical data (see Sawilowsky, 2012).
The final component was based on the scholar's service in editing and reviewing at peer-
reviewed publication outlets. This component, $S_6$ (Equation 11), combines editorship of peer-
reviewed journals ($E$), editorial board membership ($B$), and service as an ad hoc reviewer ($R$).
Assembling all these components, Sawilowsky's (2012) comprehensive S-index (Equation 12) is
the sum of the components defined above.
Note there were other elements discussed by Sawilowsky (2012), but not included in the
S-index, such as number and magnitude of extramural federally funded refereed grants or
mentoring success with junior faculty.
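A minimal Python sketch of the advising components alone is given below. The weights follow the reconstruction above (full credit for dissertations directed, half credit for co-/second advising, and one-third credit for minor and postdoctoral advising); the one-quarter weight for ordinary committee service is an assumption, and the full S-index adds the weighted h-index, excess-citation, student-publication, and editorial-service terms discussed above:

    def advising_components(d, d_co, d_2, d_m, d_o, d_p):
        # Advising credit under the weighting reconstructed above; the
        # 1/4 weight for ordinary committee service is an assumption.
        s1 = d                  # doctoral dissertations directed
        s2 = (d_co + d_2) / 2   # co-advised and second-advised students
        s3 = d_m / 3            # minor advisor
        s4 = d_o / 4            # ordinary committee member (assumed weight)
        s5 = d_p / 3            # postdoctoral students advised
        return s1 + s2 + s3 + s4 + s5

    # Hypothetical advising record (placeholder values, not study data):
    print(advising_components(d=12, d_co=4, d_2=2, d_m=3, d_o=20, d_p=1))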
Ranking Based on Normalized Scores
Solomon and Sawilowsky (2009) considered the impact of various rank-based normalizing
transformations on test score accuracy. Based on Monte Carlo methods, they compared the Blom
(1954, 1958), Tukey (1962), Van der Waerden (1952, 1953), and Rankit (Bliss & White, 1956)
approximations. The Rankit procedure was superior regardless of sample size with real data
(Sawilowsky, Blair, & Micceri, 1990) for measurement purposes. However, measurement error is
not a consideration in the current statistical application, and therefore the Blom transformation
will suffice.
Chapter 3
Methodology
The purpose of this study is to canvass 23 universities, identified by former Dean Doug
Whitman, which have evaluation and research programs at the doctoral level comparable to the
EER program in the College of Education at Wayne State University, for the purpose of ranking
their scholarly output in terms of their respective faculties' h-indices. The EER program
emphasizes four disciplines: (1) research and experimental design, (2) psychometrics and testing,
(3) applied statistics, and (4) program evaluation. Although the EER program has a fully integrated
qualitative curriculum, for the purposes of this study, only the quantitative track will be considered.
From 1949, the EER program was housed in the Division of Theoretical and Behavioral
Foundations. However, in the Fall of 2019, the program was transferred to the Division of
Administrative and Organizational Studies. Similarly, the comparable programs at the other 22
universities are known to be housed in a variety of different colleges, departments, or their
equivalent. Programs that are broader, such as inclusion of faculty teaching performance,
personnel, and product evaluation, will be excluded.
Every program area is staffed by tenured, tenure-track, clinical, research, and full-time
lecturers. Each of these job titles will be included. However, adjunct instructors, courtesy
appointments, graduate teaching assistants, etc., will be excluded. Although this study is focused
on doctoral programs, faculty who exclusively teach at the Master’s level will be included.
However, faculty who primarily teach at the undergraduate level will be excluded.
Pilot Studies
Several pilot attempts were conducted from 2016 to 2018 with the assistance of EER faculty
in order to determine the feasibility of this study. For example, initially, the Sawilowsky S-index
was considered as the criterion of faculty scholarship. However, an attempt to compute the S-index
on selected faculty in the College of Education at WSU revealed many components of the S-index
are not discoverable by a simple search engine. Although the S-index is far more comprehensive
than the h-index, it was deemed beyond the scope of this study to include as the dependent variable.
Attempts were made to get permission from the Division Assistant Dean to survey selected faculty,
with the task to self-compute the S-index; however, the obstacles raised were many and insurmountable.
The next pilot attempt, based on scholar.google.com search via Harzing’s (2020) Publish
or Perish, was based on the simple h-index. (Harzing’s (2020) program also permits searches based
on Crossref, Microsoft Academic, Scopus, and Web of Science, but only if the user has
subscriptions to those services.) The plan was to produce a printed copy of Harzing's (2020)
Publish or Perish software results, submit it to selected faculty, and have them delete publications
by authors with the same name, or make any other corrections. Unfortunately, the obstacles raised
by the Division Assistant Dean were many and insurmountable. Hence, it was determined that if
participation was not forthcoming at the home institution, it would be even less likely to be possible
at any of the other 22 universities.
Study Protocol
The study protocol was to obtain the h-index via Harzing’s (2020) Publish or Perish, and
then examine the results to remove obviously incorrect entries. Examples may include entries with
the faculty member’s correct surname, but different given names; publications prior to the faculty
member’s date of graduation; publications in a field unrelated to the faculty member’s area of
expertise; and similar anomalies. It is understood, therefore, that ambiguities in the search engine
results represent inaccuracies, which is a limitation of the study protocol.
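The screening step described above can be illustrated with a short Python sketch. It assumes the search results have been exported to a CSV file with columns such as Authors and Year; the file name, column names, and the example faculty member are hypothetical, not the actual Publish or Perish export format:

    import pandas as pd

    # Hypothetical export of search results; file and column names are illustrative.
    results = pd.read_csv("faculty_search_results.csv")

    GRADUATION_YEAR = 1998          # hypothetical faculty member's Ph. D. year
    FULL_NAME = "Jane Q. Scholar"   # hypothetical target author

    # Drop entries published before the faculty member's date of graduation.
    results = results[results["Year"] >= GRADUATION_YEAR]

    # Drop entries whose author list does not contain the exact target name,
    # screening out authors with the same surname but different given names.
    results = results[results["Authors"].str.contains(FULL_NAME, regex=False, na=False)]

    # Remaining ambiguities (e.g., unrelated fields) are left for manual review.
    results.to_csv("screened_results.csv", index=False)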
Dependent Variable
The dependent variable will be the h-index, as adjusted, from Harzing’s (2020) Publish or
Perish software. The adjustments will be made based on the considerations mentioned above.
Independent Variables
The information to be captured from each university’s comparable faculty will be the
number and classification of faculty (e.g., rank) and the number of peer-reviewed publications.
Note that scholar.google.com, the search engine that serves as the basis for Harzing's (2020) Publish
or Perish and which will be used in this study, loosely defines publications; it may include non-peer-
reviewed outlets. Those entries will be ignored.
The information will be obtained by a manual search of the universities' websites. In order
to increase accuracy, the faculty member's actual c.v. will be used, in lieu of the Harzing (2020)
Publish or Perish search results, if it happens to be posted on their website and is dated within six
months of the search.
Ranking Method
The Blom transformation will be used to standardize the rankings. It is a normalized
proportion estimate with ($\mu = 0$, $\sigma = 1$), producing a $z$ score for each rank via

$z_i = \Phi^{-1}\!\left(\dfrac{r_i - \tfrac{3}{8}}{w + \tfrac{1}{4}}\right)$,

where $\Phi^{-1}(\cdot)$ is the inverse of the Gaussian cumulative distribution function, $w$ is the sum of the
case weights, and $r_i$ is the rank.
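A minimal Python sketch of the Blom transformation, assuming equal case weights of 1 so that $w$ is simply the number of observations (the function name and example values are illustrative only):

    import numpy as np
    from scipy.stats import norm, rankdata

    def blom_scores(x):
        # Blom normalized scores: z_i = Phi^{-1}((r_i - 3/8) / (w + 1/4)),
        # where r_i is the rank of observation i and w is the number of
        # cases (equal case weights of 1 are assumed here).
        x = np.asarray(x, dtype=float)
        r = rankdata(x)              # average ranks are used for ties
        w = len(x)
        return norm.ppf((r - 0.375) / (w + 0.25))

    # Example: transform a small, illustrative set of department mean h-indices.
    print(blom_scores([38.5, 30.8, 28.2, 21.7, 21.3]))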
An approximation to the h-index is

$h' = \sqrt{N_c / a}$, (13)

where $N_c$ refers to the number of citations, and $a$ is a constant found via a power function.
Sawilowsky (2012) found the use of a = 1 for the faculty in three of the four Divisions in the
College of Education at WSU was satisfactory (r = .92), but systematically under-estimated faculty
in the Division of Kinesiology, Health, and Sports Studies (r = .48). Both values were considerably
less than Hirsch's (2005) recommended values of between 3 and 5. Therefore, an ancillary
outcome of this study will be to estimate a based on (a) the EER comparable faculty for each
university separately, and for (b) EER comparable faculty taken collectively across all 23
universities.
In order to estimate the value of a that maximizes the correlation between hʹ and h, the true
h must be known. As mentioned above, in various pilot studies, it was not possible to obtain blanket
permission to receive the c.v.’s of the WSU COE faculty. However, during this pilot, it was
determined that a certain portion of faculty at WSU’s COE, as well as faculty at the other 22
universities, voluntarily published their c.v.’s on their faculty webpage. Although this is a self-
selecting, nonrandom sample, which is a limitation in general, it is the best available evidence of
the true h-index. Therefore, in order to estimate $a$ for EER-type faculty, the correlation will be
computed using the c.v.'s of all faculty included in the study who provided one on their webpage.
Descriptive statistics (e.g., counts, percentage of department) pertaining to those who did and did
not do so will be collected and presented in the following chapter.
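One way to estimate $a$ is sketched below in Python, assuming the Hirsch-style relation $N_c \approx a h^2$ implied by Equation 13: $a$ is obtained by least squares from faculty whose true h-index is known (for example, from a posted c.v.), and the correlation between $h'$ and $h$ is then reported as a check. The function and the placeholder inputs are illustrative only, not the study data or Sawilowsky's (2012) exact procedure:

    import numpy as np

    def estimate_a(h_true, citations):
        # Least-squares estimate of a in the relation N_c ~ a * h^2, using
        # faculty whose true h-index is known (e.g., from a posted c.v.).
        # Returns the estimate of a and the correlation between h' and h.
        h = np.asarray(h_true, dtype=float)
        nc = np.asarray(citations, dtype=float)
        a_hat = np.sum(nc * h**2) / np.sum(h**4)   # minimizes sum (nc - a*h^2)^2
        h_prime = np.sqrt(nc / a_hat)              # Equation 13
        r = np.corrcoef(h_prime, h)[0, 1]
        return a_hat, r

    # Placeholder inputs (not study data): true h-indices and citation totals.
    a_hat, r = estimate_a([5, 10, 15, 20], [100, 400, 900, 1600])
    print(a_hat, r)   # a_hat = 4.0, r = 1.0 for this contrived example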
Chapter 4
Results
There were four research questions investigated:
1. How does the curriculum of the EER doctoral program at Wayne State University compare with
those of the twenty-two comparable universities?
The following Tables 3 - 25 present the curricula of the 23 universities. Each course map
is compared with the EER course map, showing the local course number and an indication of any
unique courses offered at the institution. Note: this information is based on the university website
or the url provided for the courses. The frequency with which each course is offered, and whether
each course is currently being offered, was not determined.
Table 3. Boston College Measurement, Evaluation, Statistics, and Assessment
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
MESA 7468 Introductory Statistics
2. z/t test
X
MESA 7468 Introductory Statistics
3. Anova/Ancova/Regression
X
MESA 7469 Intermediate Statistics
4. Multivariate
X
MESA 8668 Multivariate Statistical Analysis
5. SEM
X
MESA 8670 Psychometric Theory II: Item
Response Theory
6. Classical Measurement
X
MESA 8669 Psychometric Theory I: Classical
Test Theory and Rasch
7. Modern Measurement
X
MESA 8670 Psychometric Theory II: Item
Response Theory
8. Research Design
X
MESA 7460 Interpretation and Evaluation of
Research
9. Quantitative Program Evaluation
X
MESA 7466 Evaluation Practice & Methods
10. Qualitative Research
X
APSY 8851 Design of Qualitative Research
11. Qualitative Methods
X
MESA 8864 Survey Methods in Educational and
Social Research
12. Qualitative Program Evaluation
X
852
13. Other (e.g., Sampling)
X
Table 4. Brigham Young Educational Inquiry, Measurement and Evaluation
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
CPSE 629 or IP&T 629: Introduction to Research
Methods in Education
2. z/t test
X
CPSE 651 or IP&T 651: Introduction to Statistical
Inference
3. Anova/Ancova/Regression
X
IP&T 745 or CPSE 745: Statistics 2: Multiple
Regression
4. Multivariate
X
5. SEM
X
IP&T 747/CPSE 747: Structural Equation
Modeling
6. Classical Measurement
X
IP&T 752: Measurement Theory
7. Modern Measurement
X
IP&T 754: Item Response Theory
8. Research Design
X
IP&T 674R: Quasi-experimental Research Studies
9. Quantitative Program Evaluation
X
IP&T 761: Program Evaluation in Education
10. Qualitative Research
X
IP&T 653 or CPSE 653: Quantitative Research
Methods
11. Qualitative Methods
X
IP&T 753R/CPSE 753R: Qualitative Research 2
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
IP&T 730/CPSE 730: Hierarchical Linear
Modeling
Table 5. Claremont Graduate University Evaluation & Applied Research Methods
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
PSYCH 308A Intermediate Statistics
3. Anova/Ancova/Regression
X
PSYCH 315 Analysis of Variance
4. Multivariate
X
5. SEM
X
6. Classical Measurement
X
7. Modern Measurement
X
8. Research Design
PSYCH 302A Research Methods
9. Quantitative Program Evaluation
X
PSYCH 326 Foundations of Evaluation
10. Qualitative Research
X
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
Table 6. Columbia Educational Measurement, Evaluation, & Statistics
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
HUDM 4120 Basic concepts in statistics
2. z/t test
X
HUDM 4122, 4125 Probability and statistical
inference
3. Anova/Ancova/Regression
X
HUDM 5122 Applied regression analysis
4. Multivariate
X
HUDM 6122 Multivariate analysis I,
5. SEM
X
HUDM 6055 Latent structure analysis
6. Classical Measurement
X
HUDM 6051 Psychometric Theory I
7. Modern Measurement
X
HUDM 6052 Psychometric theory II
8. Research Design
X
HUD 4120 Methods of empirical research
9. Quantitative Program Evaluation
X
10. Qualitative Research
X
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
HUDM 5124 Multidimensional scaling and
clustering
Table 7. Florida State University Measurement and Statistics
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
STA 5323 Introduction to Mathematical Statistics
2. z/t test
X
EDF 5400 Descriptive/Inferential Statistics
Applications
3. Anova/Ancova/Regression
X
EDF 5402 Advanced Topics in ANOVA
4. Multivariate
X
EDF 5406 Multivariate Analysis Applications
5. SEM
X
6. Classical Measurement
X
EDF 5432 Measurement Theory I
7. Modern Measurement
X
EDF 5434 Measurement Theory II
8. Research Design
X
EDF 5481 Methods of Educational Research
9. Quantitative Program Evaluation
X
10. Qualitative Research
X
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
EDF 7418 Hierarchical Linear Models
Table 8. George Mason University Research Methodology
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
3. Anova/Ancova/Regression
X
4. Multivariate
X
EDRS 822: Advanced Applications of
Qualitative Methods
5. SEM
X
EDRS 831: Structural Equation Modeling
6. Classical Measurement
X
EDRS 827
7. Modern Measurement
X
EDRS 827: Introduction to Measurement and
Survey Development
8. Research Design
X
EDRS 827: Introduction to Measurement and
Survey Development
9. Quantitative Program Evaluation
10. Qualitative Research
X
EDRS 812, 822
11. Qualitative Methods
X
EDRS 822: Advanced Applications of
Qualitative Methods
12. Qualitative Program Evaluation
X
EDRS 820: Evaluation Methods for
Educational Programs and Curricula
13. Other (e.g., Sampling)
X
EDRS 830: Hierarchical Linear Modeling
Table 9. Kent State University Evaluation and Measurement
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number
1. Descriptive Statistics
X
EVAL 75510 Statistics I for Educational
Service
2. z/t test
X
EVAL 78716 Educational Statistics II
3. Anova/Ancova/Regression
X
EVAL 78728 Educational Statistics III
4. Multivariate
X
EVAL 78713 Multivariate Analysis in
Educational Research
5. SEM
X
EVAL 68735 Structural Equation
Modeling
6. Classical Measurement
X
EVAL 78710 Classical Test Theory
7. Modern Measurement
X
EVAL 78711 Modern Test Theory
8. Research Design
X
EVAL 65515 Quantitative Research Design
and Analysis
9. Quantitative Program Evaluation
X
EVAL 85515
10. Qualitative Research
X
EVAL 85518 Advanced Qualitative
Research
11. Qualitative Methods
X
EVAL 65516, 85516 Qualitative research
Design
12. Qualitative Program Evaluation
X
EVAL 88798 Research in Evaluation and
Measurement
13. Other (e.g., Sampling)
X
EVAL 68745 Hierarchical Linear
Modeling
Table 10. Ohio State Quantitative Research, Evaluation and Measurement
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
ESQREM 6641
2. z/t test
X
ESQREM 7648
3. Anova/Ancova/Regression
X
ESQREM 7651 Regression Analysis1
4. Multivariate
X
ESQREM 8648
5. SEM
X
ESQREM 8659
6. Classical Measurement
X
ESQREM 6661 Introduction to Educational
Measurement
7. Modern Measurement
X
ESQREM 7663, 8674
8. Research Design
X
ESQREM 7635 Advanced Research Methods
9. Quantitative Program Evaluation
X
ESQREM 8895 Seminars: Quantitative
Research, Evaluation, and Measurement
10. Qualitative Research
X
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
ESQREM 7627 Sampling Designs and Survey
Research Methods
Table 11. University of California, Berkeley Social Research Methodologies
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
EDUC293A Quantitative methods (statistics)
course
3. Anova/Ancova/Regression
X
EDUC 275B Additional quantitative methods
(statistics) course
4. Multivariate
X
5. SEM
X
6. Classical Measurement
X
EDUC 274A Measurement in Education & the
Social Sciences
7. Modern Measurement
X
EDUC 274B Measurement in Education & the
Social Sciences
8. Research Design
EDUC 290F Course in Research Methods
9. Quantitative Program Evaluation
X
10. Qualitative Research
X
EDUC 271B Introduction to Qualitative Methods
11. Qualitative Methods
X
EDUC 243 Advanced Qualitative Methods
12. Qualitative Program Evaluation
X
EDUC 276C Practicum in Evaluation
13. Other (e.g., Sampling)
X
EDUC 275G Hierarchical & Longitudinal
Modeling
Table 12. University of North Carolina at Greensboro Educational Research Methodology
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
ERM 680 Intermediate Statistical Methods in
Education
3. Anova/Ancova/Regression
X
ERM 681 Design and Analysis of Educational
Experiments
4. Multivariate
X
ERM 682 Multivariate Analysis
5. SEM
X
ERM 731 Structural Equation Modeling in
Education
6. Classical Measurement
X
ERM 726 Advanced Topics in Educational
Measurement Theory
7. Modern Measurement
X
ERM 771 Advanced Item Response Theory
8. Research Design
X
ERM 704 Methods of Educational Research
9. Quantitative Program Evaluation
X
10. Qualitative Research
X
11. Qualitative Methods
X
ERM 749 Foundations of Qualitative Research
Methods
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
ERM 732 Hierarchical Linear Modeling
Table 13. University of Colorado Boulder Research & Evaluation Methodology
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
SOCY 5111 Data 1: Introduction to Social
Statistics
2. z/t test
X
EDUC 8230 Quantitative Methods I
3. Anova/Ancova/Regression
X
EDUC 8240 Quantitative Methods II
4. Multivariate
X
5. SEM
X
PSYC 5761 Structural Equation Modeling
6. Classical Measurement
X
EDUC 8710 Measurement in Survey Research
7. Modern Measurement
X
EDUC 8720 Advanced Topics in Measurement
8. Research Design
X
SOCY 5031 Research Design
9. Quantitative Program Evaluation
X
EDUC 7386 Educational Evaluation
10. Qualitative Research
X
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
PSCI 7108 Special Topics
Table 14. University of Connecticut Research Methods, Measurement, and Evaluation
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
EPSY 5601 Introduction to Educational Research
Methods
2. z/t test
X
EPSY 5605 Introduction to Quantitative Methods I
3. Anova/Ancova/Regression
X
EPSY 5607 Introduction to Quantitative Methods
II
4. Multivariate
X
EPSY 5613 Multivariate Analysis in Educational
Research
5. SEM
X
EPSY 6615 Structural Equation Modeling
6. Classical Measurement
X
EPSY 6636 Measurement Theory and Application
7. Modern Measurement
X
EPSY 6638 Advanced Item Response Theory
8. Research Design
9. Quantitative Program Evaluation
X
EPSY 6621 Program Evaluation
10. Qualitative Research
X
EPSY 6601 Methods and Techniques of
Educational Research
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
EPSY 6611 Hierarchical Linear Models
Table 15. University of Florida Research and Evaluation Methodology
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
EDF 6403 Quantitative Foundations of
Educational Research
2. z/t test
X
EDF 6402 Quantitative Found in Educational
Research: Inferential Statistics
3. Anova/Ancova/Regression
X
EDF 6481 Quantitative Research Methods in
Education
4. Multivariate
X
EDF 7932 Multivariate Analysis in Educational
Research
5. SEM
X
EDF 7412 Structural Equation Modeling
6. Classical Measurement
X
EDF 6436 Theory of Measurement
7. Modern Measurement
X
EDF 7439 Item Response Theory
8. Research Design
X
EDF 6471 Survey Design and Analysis in
Educational Research
9. Quantitative Program Evaluation
X
EDF 7941 Evaluation of Educational Products
and Systems
10. Qualitative Research
X
EDF 6475 Qualitative Foundations of
Educational Research
11. Qualitative Methods
X
EDF 7483 Qualitative Data Collection:
Approaches and Techniques
12. Qualitative Program Evaluation
X
EDF 7941 Evaluation of Educational Products
and Systems
13. Other (e.g., Sampling)
X
Table 16. University of Illinois Chicago Measurement, Evaluation, Statistics, and Assessment
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
EPSY 503 Essentials of Quantitative Inquiry in
Education.
3. Anova/Ancova/Regression
X
EPSY 505 Advanced Analysis of Variance and
Multiple Regression
4. Multivariate
X
EPSY 583. Multivariate Analysis of Educational
Data
5. SEM
X
6. Classical Measurement
X
EPSY 546 Educational Measurement
7. Modern Measurement
X
EPSY 551. Item Response Theory/Rasch
Measurement
8. Research Design
X
EPSY 509 Research Design in Education
9. Quantitative Program Evaluation
X
EPSY 560 Educational Program Evaluation
10. Qualitative Research
X
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
EPSY 584 Hierarchical Linear Models
Table 17. University of Illinois Urbana Quantitative and Qualitative Methodology, Measurement,
and Evaluation
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
PSYC 506 Statistical Methods I
3. Anova/Ancova/Regression
X
PSYC 507 Statistical Methods II
4. Multivariate
5. SEM
X
EPSY 587, 588
6. Classical Measurement
X
EPSY 585/PSYC 595 Theories of
Measurement, 1
7. Modern Measurement
X
EPSY 586/PSYC 596 Theories of
Measurement, 2
8. Research Design
9. Quantitative Program Evaluation
X
EPOL 594
10. Qualitative Research
X
CI 509 Curriculum Research: QRM Qualitative
Research Methodology
11. Qualitative Methods
X
EPSY 577 Foundations of Qualitative Methods
12. Qualitative Program Evaluation
X
EPOL 594 Program Evaluation
13. Other (e.g., Sampling)
X
EPSY 587 Hierarchical Linear Models
Table 18. University of Iowa Educational Measurement and Statistics
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
PSQF:6243 Intermediate Statistical Methods
3. Anova/Ancova/Regression
X
PSQF:6244 Correlation and Regression
4. Multivariate
X
PSQF:6252 Introduction to Multivariate
Statistical Methods
5. SEM
X
PSQF:6249 Factor Analysis and Structural
Equation Models
6. Classical Measurement
X
7. Modern Measurement
X
8. Research Design
EPLS:6206 Research Process and Design
9. Quantitative Program Evaluation
X
EPLS:6370
10. Qualitative Research
X
EDTL:7070 Qualitative Research Methods in
Teaching and Learning
11. Qualitative Methods
X
EPLS:7373 Qualitative Research Design and
Methods
12. Qualitative Program Evaluation
X
PSQF:6265 Program Evaluation
13. Other (e.g., Sampling)
X
Table 19. University of Kentucky Quantitative and Psychometric Methods
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
EPE 557 Gathering, Analyzing, and Using
Educational Data
2. z/t test
3. Anova/Ancova/Regression
X
EDP/EPE 707
4. Multivariate
X
EDP/EPE 707: Multivariate Analysis in
Educational Research
5. SEM
X
EDP/EPE 711: Advanced Quantitative Methods
6. Classical Measurement
X
EDP/EPE 679: Introduction to Measurement
Theory & Techniques
7. Modern Measurement
X
8. Research Design
X
EPE 619 Survey Research Methods in
Education
9. Quantitative Program Evaluation
X
ANT/EDP/EPE 621: Advanced Methods in
Evaluation
10. Qualitative Research
X
EPE 663: Field Studies in Educational Settings
11. Qualitative Methods
X
EPE 763: Advanced Field Studies
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
Table 20. University of Tennessee Evaluation, Statistics & Measurement
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
EDPY 577 - Statistics in Applied Fields I
3. Anova/Ancova/Regression
X
EDPY 677 - Statistics in Applied Fields II
4. Multivariate
X
EDPY 678 - Statistics in Applied Fields III
5. SEM
X
IOP 627 - Structural Equation Models in
Organizational Research
6. Classical Measurement
X
EDPY 581 - Classroom Measurement
7. Modern Measurement
X
8. Research Design
EDPY 583 - Survey Research
9. Quantitative Program Evaluation
X
EDPY 660 - Evaluation, Statistics, and
Measurement Research Seminar
10. Qualitative Research
X
EDPY 661 - Advanced Qualitative Research in
Education
11. Qualitative Methods
X
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
Table 21. Virginia Research, Statistics & Evaluation
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
EDLF 5330 Quantitative Methods and Data
Analysis I
3. Anova/Ancova/Regression
X
EDLF 7420 Quantitative Methods and Data
Analysis II
4. Multivariate
X
EDLF 8350 Educational Statistics IV:
Multivariate
5. SEM
X
EDLF 8361 Structural Equation Modeling
6. Classical Measurement
X
EDLF 7180 Tests and Measurements
7. Modern Measurement
X
EDLF 8340 Measurement Theory
8. Research Design
X
EDLF 7410 Mixed Methods Research Design
9. Quantitative Program Evaluation
X
EDLF 7420 Quantitative Methods and Data
Analysis II
10. Qualitative Research
X
EDLF 7404 Qualitative Analysis
11. Qualitative Methods
X
EDLF 8440 Advanced Qualitative Analysis
12. Qualitative Program Evaluation
X
13. Other (e.g., Sampling)
X
Table 22. University of Washington Measurement & Statistics
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
EDPSY 490 Basic Educational Statistics
2. z/t test
X
EDPSY 491 Intermediate Educational Statistics
3. Anova/Ancova/Regression
X
EDPSY 593 Experimental Design/Analysis of
Variance
4. Multivariate
X
5. SEM
X
EDPSY 575 Structural Equation Modeling
6. Classical Measurement
X
EDPSY 592 Advanced Educational
Measurements
7. Modern Measurement
X
EDPSY 595 Item Response Theory Models of
Testing
8. Research Design
X
EDPSY 588 Survey Research Methods
9. Quantitative Program Evaluation
X
10. Qualitative Research
X
11. Qualitative Methods
X
EDPSY 584 Quantitative Methods Seminars
12. Qualitative Program Evaluation
X
EDPSY 596 Program Evaluation
13. Other (e.g., Sampling)
X
EDPSY 576 Hierarchical Linear Modeling
Table 23. Washington State University Educational Psychology
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
2. z/t test
X
ED_PSYCH 508
3. Anova/Ancova/Regression
X
ED_RES 565, 538
4. Multivariate
X
5. SEM
X
ED_PSYCH 575, 576
6. Classical Measurement
X
ED_PSYCH 509
7. Modern Measurement
X
ED_PSYCH 577 Item Response Theory
8. Research Design
X
ED_PSYCH 505 Research Methods
9. Quantitative Program Evaluation
X
ED_PSYCH 571 Theoretical Foundations... in
Program Evaluation
10. Qualitative Research
X
ED_PSYCH 507 Foundations of Qualitative
Research
11. Qualitative Methods
X
EDPSY 586 Qualitative Methods of Educational
Research
12. Qualitative Program Evaluation
X
EDPSY 596 Program Evaluation
13. Other (e.g., Sampling)
Table 24. Wayne State University Education Evaluation & Research
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
X
EER 7610 Evaluation and Measurement
2. z/t test
X
EER 7630 Fundamentals of Statistics
3. Anova/Ancova/Regression
X
EER 8800 Variance and Covariance Analysis
4. Multivariate
X
EER 8820 Multivariate Analysis
5. SEM
X
EER 8840 Structural Equations Modeling
6. Classical Measurement
X
EER 8760 Advanced Measurement I
7. Modern Measurement
X
EER 8770 Advanced Measurement II
8. Research Design
X
EER 8992 Research and Experimental Design
9. Quantitative Program Evaluation
X
EER 8720 Advanced Quantitative Program
Evaluation
10. Qualitative Research
X
EER 8530 Qualitative Research III: Data Analysis
and Reporting
11. Qualitative Methods
X
EER 8550 Advanced Qualitative Inquiry:
Innovations in Theory
12. Qualitative Program Evaluation
X
EER 8910 Practicum in Evaluation
13. Other (e.g., Sampling)
EER 8860 Nonparametric, Permutation, Exact, &
Robust Methods; EER 8880 Monte Carlo Methods
Table 25. Western Michigan University Educational Leadership, Research and Technology
Doctoral Education/Psychology
Research, Measurement, Statistics, Evaluation
Yes
No
Course Number and Description
1. Descriptive Statistics
EMR 5400 Fundamentals of Evaluation,
Measurement and Research
2. z/t test
3. Anova/Ancova/Regression
4. Multivariate
X
EMR 6750 Applied Multivariate Statistics
5. SEM
X
EMR 6710 Structural Equation Modeling
6. Classical Measurement
EMR 6610 Advanced Seminar: Measurement
7. Modern Measurement
8. Research Design
X
EMR 6500
9. Quantitative Program Evaluation
X
EMR 6420 Evaluation I: Theory Methods &
Program Evaluation
10. Qualitative Research
X
EMR 6480 Qualitative Research Designs
11. Qualitative Methods
X
EMR 6680
12. Qualitative Program Evaluation
EMR 6520 Evaluation Practicum
13. Other (e.g., Sampling)
EMR 6850 Hierarchical Linear modeling
2. What are the h-index, g-index, number of citations, and number of papers produced by the
faculty in the EER or equivalent departments/programs at the twenty-three universities?
Sketch of the 23 Universities' Faculty Metrics
The various metrics in this research question were obtained by reviewing the webpage for
each of the 23 universities. A brief summary of each program, as well as the overall metrics for
each university (including non-tenured lecturers), is presented below.
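The department-level summaries that follow report the mean, median, sample standard deviation, minimum, and maximum of the faculty h-indices. A minimal Python sketch of that summary is shown below with placeholder values (not the data of any particular department):

    import statistics as st

    def department_summary(h_indices):
        # Mean, median, sample standard deviation, minimum, and maximum
        # of the faculty h-indices for one department.
        return {
            "mean": st.mean(h_indices),
            "median": st.median(h_indices),
            "sd": st.stdev(h_indices),   # sample (n - 1) standard deviation
            "min": min(h_indices),
            "max": max(h_indices),
        }

    # Placeholder department of five faculty members:
    print(department_summary([6, 15, 28, 30, 41]))

Using the sample standard deviation is consistent with the values reported below; for example, the two UC Berkeley faculty h-indices of 12 and 65 yield a standard deviation of 37.48.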
Boston College
The Lynch School of Education and Human Development offers a Ph. D. program in
Measurement, Evaluation, Statistics, and Assessment (MESA) at Boston College. The program
requires each MESA student to complete 72 credit hours. After the students finish their coursework
they are expected to enroll for a dissertation. The Ph. D. program has a website with program
requirements and general information concerning the process for completing the Ph. D. degree.
The mission statement is to
Contribute to national and international efforts to solve critical educational and
human problems in a diverse global community. Educate the next generation of
scholars and practitioners who will engage in reflective social inquiry. Foster
meaningful scholarly inquiry into both theoretical and applied aspects of social
systems. Advance creative approaches to important measurement, evaluation, and
research problems in educational and other social science fields. Provide students
with the necessary theoretical and applied skills and experience to become research
and evaluation leaders in their professional fields. Prepare students to make original
and substantive contributions to the fields of measurement, evaluation, and
statistics. (https://www.bc.edu/content/bc-web/schools/lynch
school/academics/departments/mesa/phd-mesa.html)
The program consists of 10 faculty members, of which six are professors, three are
associate professors, and one assistant professor. As a department, their mean (median) h-index is
28.20 (28.50), the standard deviation is 17.89, and the minimum, maximum is 6.00, 56.00.
Brigham Young University
The David O. McKay School of Education offers a Ph. D. program in the department of Educational
Inquiry, Measurement and Evaluation (EIME) at Brigham Young University. The program
requires each EIME Ph. D. student to complete 71 credit hours. After the students finish their
coursework, they are expected to enroll in 6 hours of internships and 18 credit hours of dissertation.
The Ph. D. program has a website with program requirements and general information concerning
the process for completing the Ph. D. degree. The program mission statement is
The mission of the Educational Inquiry, Measurement, and Evaluation (EIME)
program is to prepare doctoral students who have the knowledge, expertise,
experience, and character traits to work productively as researchers, evaluators,
policy analysts, assessment specialists, and/or professors in a variety of settings.
(https://education.byu.edu/eime/directory)
The program consists of eight faculty members, of which three are assistant professors, three are
associate professors, and two are professors. As a department, their mean (median) h-index is
14.50 (16.00), the standard deviation is 5.81, and the minimum, maximum is 4.00, 22.00.
Claremont Graduate University
The School of Social Science Policy & Evaluation offers a Ph. D. in the Division of
Behavioral & Organizational Science at Claremont Graduate University. The program requires
each student to complete 72 credit hours. The Ph. D. program has a website with program
requirements and general information concerning the process for completing the Ph. D. degree. The
program mission statement is
The Evaluation & Applied Research Methods PhD program focuses on training
students in the design and implementation of impactful evaluations that improve
the lives of people across a range of settings, including federal health agencies,
educational programs, philanthropic foundations, academia, and more.
(https://www.cgu.edu/academics/program/evaluation-applied-research-methods/)
There are nine faculty members, of which seven are professors and two are assistant
professors. As a department their mean (median) h-index is 31.56 (30), the standard
deviation is 21.36, and the minimum, maximum is 4.00, 69.00.
Columbia University
The Department of Human Development offers a Ph. D. program of Measurement,
Evaluation, and Statistics in the Teachers College at Columbia University. The program requires
each student to complete 75 credit hours. The Ph. D. program has a website with program
requirements and general information concerning the process for completing the Ph. D. degree.
The program's goals are
The Measurement, Evaluation, & Statistics program provides training for a number
of careers in a wide range of applied research settings, testing organizations, and
industries. Graduate students pursue a course of study in either Applied Statistics
or Measurement & Evaluation. We offer specialized knowledge in test theory,
experimental design, instrument development and validation, and quantitative
analysis of educational and psychological data.
https://www.tc.columbia.edu/human-development/measurement-evaluation-and-
statistics/
The program consists of six faculty members, of which two are professors, two are assistant
professors, and two are lecturers. As a department their mean (median) h-index is 12.50 (10.00),
the standard deviation is 8.17, and the minimum, maximum is 4.00, 24.00.
Florida State University
The Department of Educational Psychology and Learning Systems offers a Ph. D. program
in Measurement and Statistics (M&S) at Florida State University. The program requires each MS
and Ph. D. student to complete specific coursework requirements. The Ph.D. program has a website
with program requirements and general information concerning the process for completing the Ph.
D. degree. The program has no mission statement. The url for the program is
https://education.fsu.edu/sites/g/files/upcbnu3146/files/degree-requirements-in-Doctoral-
program.pdf
The program consists of five faculty members, of which two are professors, two are
associate professors, and one assistant professor. As a department, their mean (median) h-index is
19.60 (17.00), the standard deviation is 14.32, and the minimum, maximum is 0.36.
George Mason University
The School of Education offers a Ph. D. program in Research Methodology in The College
of Education and Human Development at George Mason University. The program requires each
Research Methodology student to complete a minimum of 55 credit hours beyond the Master's
degree and 12 credit hours of dissertation. The program has a website with program requirements
and general information concerning the process for completing the Ph. D. degree. The program
url is https://cehd.gmu.edu/directory
The program consists of fourteen faculty members, of which six are professors, three are
associate professors, and five are assistant professors. As a department, their mean (median) h-
index is 15.20 (8.00), the standard deviation is 13.87, and the minimum, maximum is 1.00, 51.00.
Kent State University
The School of Foundations, Leadership, and Administration offers a Ph. D. in Evaluation
and Measurement (EVAL) at Kent State University. The program requires each EVAL student to
complete 51 credit hours. After finishing their coursework, they are expected to enroll in 30 hours
of dissertation. The Ph. D. program has a website with program requirements and general
information concerning the process for completing the Ph. D. degree. There is no mission
statement on their website. The program website is https://www.kent.edu/ehhs/fla/eval/faculty-
staff.
The program consists of five faculty members, of which one is a professor and four are
associate professors. As a department their mean (median) h-index is 8.80 (11.00), the standard
deviation is 6.98, and the minimum, maximum is 0.00, 16.00.
The Ohio State University
The Department of Educational Studies offers a Ph. D. program in Quantitative Research,
Evaluation and Measurement (QREM) in the College of Education and Human Ecology at the
Ohio State University. The program requires each QREM student to complete a minimum of 87
credit hours. After the students finish their coursework, the students are required to enroll in
research apprenticeship, to take a candidacy exam, and to complete a 6-credit hour of dissertation.
The Ph. D. program has a website with program requirements and general information concerning
the process for completing the Ph. D. degree. Their mission statement states:
The Quantitative Research, Evaluation and Measurement (QREM) program
prepares you to become an expert in research design and statistics, program
evaluation, and applied measurement and testing. These skills are an essential
component as you seek careers in education, government or business settings.
The program url is https://ehe.osu.edu/educational-studies/qrem/phd/
The program consists of five faculty members, of which two are professors, two are
assistant professors, and one associate professor. As a department their mean (median) h-index is
17.60 (19.00), the standard deviation is 6.31, and the minimum, maximum is 11.00, 24.00.
University of California-Berkeley
The Graduate School of Education offers a Ph. D. program in the Department of Social
Research Methodologies (SRM) at the University of California, Berkeley (UC Berkeley). The
program requires each SRM student to have an advisor to help navigate the program's vague instructions
for the Ph. D. degree. The program has a student’s handbook that is part of the School of
Psychology Program. The Ph. D. program has a website with program requirements and general
information concerning the process for completing the Ph. D. degree. The program has no mission
statement. The program url is https://gse.berkeley.edu/academics/maphd-program/social-
research-methodologies.
The program consists of two members and both are professors. As a department, their mean
(median) h-index is 38.50 (38.50), the standard deviation is 37.48, and the minimum, maximum is
12.00, 65.00.
University of North Carolina-Greensboro
The School of Education offers a Ph. D. in the Department of Educational Research
Methodology (ERM) at the University of North Carolina at Greensboro. The program
requires each ERM student to complete 60 credit hours. After the students finish their coursework,
they are expected to enroll in 12 hours of dissertation. The Ph. D. program has a website with the
program requirements and general information concerning the process for completing the Ph. D.
degree. The program mission statement is not available and the website is
https://soe.uncg.edu/academics/departments/erm/.
The program consists of seven faculty members, of which three are professors, three are
assistant professors and one associate professor. As a department their mean (median) h-index is
7.57 (7.00), the standard deviation is 6.75, and the minimum, maximum is 0.00, 18.00.
University of Colorado-Boulder
The School of Education offers a Ph. D. degree in the Department of Research and
Evaluation Methodology (REM) at the University of Colorado-Boulder. The program requires
each REM student to complete 56 credit hours of approved coursework, with 23 credit hours of
core courses. Each REM Ph. D. student must also complete 30 credit hours of dissertation. The
Ph. D. program has a website with program requirements and general information concerning the
process for completing the Ph. D. degree. The REM students are expected to participate in the
Center for Assessment, Design, Research and Evaluation (CADRE). The mission of CADRE is
To produce generalizable knowledge that improves the ability to assess student
learning and to evaluate programs and methods that may have an effect on this
learning. CADRE projects represent a collaboration with the ongoing activities in
the School of Education, the university, and the broader national and international
community of scholars and stakeholders involved in educational assessment and
evaluation. (https://www.colorado.edu/education/academics/graduate-
programs/research-evaluation-methodology/phd-education-emphasis-research)
The program consists of five faculty members, of which one is a professor, two are assistant
professors, and two are associate professors. As a department their mean (median) h-index is 14.40
(11.00), the standard deviation is 10.08, and the minimum, maximum is 4.00, 27.00.
University of Connecticut
The Neag School of Education offers a Ph. D. program in the Department of Research
Methods, Measurement, and Evaluation (RMME) which is part of the Department of Educational
Psychology (EPSY) at the University of Connecticut. The program requires each RMME Ph. D.
student to complete a minimum of 75 credit hours. This includes 51-57 credits of coursework, 3-
9 credits of independent study, internship, or practicum credit, and 15 credits of dissertation. The
Ph. D. program has a website with program requirements and general information concerning the
process for completing the Ph. D. degree. The program philosophy and goals can be found in the
RMME handbook.
The faculty is committed to a learning environment that stresses a well-organized and
explicit curriculum with clear expectations. However, there is also a strong commitment to student-
faculty interaction that further encourages the student’s professional development and
identification within the field. In addition, the program is designed to acquaint students with the
diversity of theories and practices within the field of Research Methods, Measurement and
Evaluation, allowing sufficient intellectual freedom to experiment with different theoretical and
applied approaches. The url is https://rmme.education.uconn.edu/core-program-faculty/
The program consists of seven faculty members, of which one is an assistant professor,
four are associate professors, and two are professors. As a department their mean (median) h-index
is 21.28 (21.00), the standard deviation is 18.27, and the minimum, maximum is 4.00, 54.00.
University of Florida
The School of Development and Organizational Studies in Education offers a Ph. D.
program in the Department of Research and Evaluation Methodology (REM) at the University of
Florida College of Education. The program requires each student to complete 90 credit hours. After
the students finish their coursework, their final product is a written exam and a dissertation. The
Ph. D. Program has a website with program requirements and general information concerning the
process for completing the Ph. D. degree. The website has no mission statement. The program url
is https://education.ufl.edu/research-evaluation-methods/contact-us/.
The program consists of seven faculty members, of which three are professors, two are
assistant professors, one associate professor, and one lecturer. As a department, their mean
(median) h-index is 11.14 (7.00), the standard deviation is 12.23, and the minimum, maximum is
1.00, 37.00.
University of Illinois Chicago
The Department of Educational Psychology in the College of Education at the University of Illinois
Chicago offers a Ph. D. program in Measurement, Evaluation, Statistics, and Assessment (MESA).
The program requires MESA Ph. D. students to complete 64 credit hours beyond the master's degree
including the dissertation. The Ph. D. program has a website with program requirements and
general information concerning the process for completing the Ph. D. degree. The program has no
mission statement on the webpage. The program url is
https://education.uic.edu/academics/programs/educational-psychology/mesa-doctoral-faculty/.
The MESA program consists of six members, of which three are professors, two are
associate professors, and one assistant professor. As a department their mean (median) h-index is
12.67 (13.50), the standard deviation is 7.87, and the minimum, maximum is 0.00, 22.00.
University of Illinois-Urbana
The Educational Psychology Department in the College of Education offers a Ph. D.
degree in Quantitative and Qualitative Methodology, Measurement, and Evaluation (QUERIES) at the University
of Illinois-Urbana. The program requires each QUERIES Ph. D. student to complete 96 credit
hours including dissertation. The Ph. D. program has a website with program requirements and
general information concerning the process for completing the Ph. D. degree. The program url is
https://education.illinois.edu/edpsy/programs-degrees/queries/faculty.
The program consists of 10 members, of which three are professors, two are associate
professors, and five are assistant professors. As a department their mean (median) h-index is 8.70
(5.00), the standard deviation is 9.18, and the minimum, maximum is 0.00, 27.00.
University of Iowa
The Psychological and Quantitative Foundations Department offers a Ph. D. program in
Educational Measurement and Statistics (EMS) in the College of Education at the University of
Iowa. The program requires each EMS Ph. D. student to complete 90 credit hours beyond the
bachelor’s degree and a dissertation. The Ph. D. program has a website with program requirements
and general information concerning the process for completing the Ph. D. degree.
The EMS program is committed to preparing students for successful careers in educational
measurement, evaluation, research, and statistical/quantitative analysis. The program url is
https://education.uiowa.edu/academic-programs/educational-measurement-and-statistics/faculty.
The program consists of 10 faculty members, of which seven are professors, two are
assistant professors, and one associate professor. As a department their mean (median) h-index is
18.10 (18.00), the standard deviation is 9.82, and the minimum, maximum is 4.00, 36.00.
University of Kentucky
The Department of Educational, School, and Counseling Psychology (EDP) offers a Ph.
D. program in Quantitative and Psychometric Methods (QPM) at the University of Kentucky. The
program requires each QPM student to complete coursework and a dissertation. The Ph. D.
program has a website with program requirements and general information concerning the process
for completing the Ph. D. degree. The purpose of the QPM Ph. D. program is stated in the handbook
(p. 4):
The primary objective of the QPM program is to promote the development of advanced
quantitative and psychometric knowledge and skills that allow program graduates to function as
competent independent researchers or scientists who can innovatively and effectively carry out
research design and data analysis for all kinds of empirical purposes. The program urls are
https://education.uky.edu/edp/qpm/meet-the-faculty/ and https://education.uky.edu/wp-
content/uploads/2017/03/QPM-Program-Handbook.pdf.
The program consists of three faculty members, of which all are professors. As a
department their mean (median) h-index is 14.33 (12.00), the standard deviation is 8.74, and the
minimum, maximum is 7.00, 24.00.
University of Tennessee
The Educational Psychology and Counseling Department offers a Ph. D. program in
Evaluation, Statistics & Measurement (ESM) at the University of Tennessee. The program requires
each ESM student to complete 63 credit hours. After the students finish their coursework, they
are expected to enroll in 24 credit hours of dissertation. The Ph. D. program has a website with
program requirements and general information concerning the process for completing the Ph. D.
degree. The purpose and the goal of the Ph. D. program are the following:
The Ph. D. program in Evaluation, Statistics, and Measurement (ESM) has been
carefully designed to provide students with an integrated, sequenced, and
experientially-based doctoral program leading to a meaningful professional career.
https://epc.utk.edu/evaluation-statistics-measurement/
The program consists of four members, of which one is an assistant professor and three are
associate professors. As a department their mean (median) h-index is 7.25 (6.50), the standard deviation is
5.79, and the minimum, maximum is 1.00, 15.00.
University of Virginia
The Curry School of Education and Human Development offers a Ph. D. in the department
of Education Leadership, Foundations, and Policy (EDLF) in Research, Statistics, and Evaluation
(RSE) at the University of Virginia. The program requires each RSE Ph. D. student to complete
72 credit hours. After the students finish their coursework, they enroll in 12 hours of dissertation.
The Ph. D. program has a website with program requirements and general information concerning
the process for completing the Ph. D. degree. The program’s goals are
All Ph. D. programs in the Curry School are designed to prepare professors and
scholars with demonstrated ability to conduct research in their field of study.
Programs may establish additional requirements and goals consistent with their
field.
(https://curry.virginia.edu/sites/default/files/uploads/resourceLibrary/RSE%20Ph
D%20Handbook%20Feb%202018.pdf;
https://curry.virginia.edu/academics/departments/department-leadership-
foundations-and-policy)
The program consists of seven faculty members, of which three are professors, two are associate
professors, and two are assistant professors. As a department their mean (median) h-index is 16.42
(14.00), the standard deviation is 11.84, and the minimum, maximum is 2.00, 35.00.
University of Washington
The Educational Psychology Department in the College of Education of the University of
Washington offers the program in Measurement and Statistics (M&S). The program requires each
M&S student to complete a minimum amount of coursework. After the students finish their
coursework requirements, they enroll in 27 credit hours of dissertation. The program has a website
with program requirements and general information concerning the process for completing the Ph.
D. The program url is https://education.uw.edu/programs/graduate/educational-
psychology/measurement-and-statistic.
The program consists of four members, of which three are associate professors and one
assistant professor. As a department their mean (median) h-index is 8.50 (5.00), the standard
deviation is 9.15, and the minimum, maximum is 2.00, 22.00.
Washington State University
The Department of Kinesiology and Educational Psychology offers a Ph. D. program in
Research, Evaluation, Measurement, Learning (REML) and Cognition at Washington State
University. The program requires each REML student to complete 72 semester hours, including at
least 36 semester hours of graded course work and at least 24 semester hours for completion and
defense of the doctoral dissertation. The Ph. D. program has a website with program requirements
and general information concerning the process for completing the Ph. D. degree. The program
mission statement is
The mission of the Department of Educational Leadership, Sport Studies, and
Educational/ Counseling Psychology (ELSSECP) is to address the needs of
communities, individuals, and educational institutions in a diverse society through
leadership, scholarship, collaboration and professional practice.
https://education.wsu.edu/graduate/edpsych/educational-psychology-faculty/
The program consists of eight members, of which two are professors, two are associate
professors, and four are assistant professors. As a department their mean (median) h-index is 10.13
(8.50), the standard deviation is 9.31, and the minimum, maximum is 1.00, 27.00.
Wayne State University
Educational Leadership and Policy Studies offers a Ph. D. program in Educational Evaluation and
Research (EER) at Wayne State University (WSU). The program requires each EER student to
complete 97 credit hours including 30 credit hours in dissertation. The Ph. D. program has a
website with program requirements and general information concerning the process for completing
the Ph. D. degree. The program mission statement can be found in the EER student handbook:
Educational Evaluation and Research offers concentrated programs for building
careers and leadership positions in educational statistics, research, measurement,
and evaluation. These programs are designed for students who have training and
experience in substantive disciplines in either education or non-education fields.
Proficiency and excellence will be acquired in scientific inquiry, quantitative and
qualitative research methodology and program evaluation, psychometry and
construction of psychological and educational tests, and applied statistical analysis
of social and behavioral data, especially using computer technology.
https://coe.wayne.edu/aos/eer/faculty.php
The faculty consists of three professors. As a department, their mean (median) h-index is
21.67 (21.00), the standard deviation is 13.01, and the minimum and maximum are 9.00 and 35.00.
Western Michigan University
The Department of Education Leadership, Research and Technology at the College of
Education and Human Development offers a Ph. D. program in Evaluation, Measurement and
Research (EMR) at Western Michigan University. The Ph. D. program requires EMR students
to complete a total of 93 credit hours, including 15 credit hours for dissertation. The Ph. D. program
has a website with program requirements and general information concerning the process for
completing the Ph. D. degree. The program url is
https://wmich.edu/leadership/academics/emr/faculty.
The program consists of seven faculty members, of which four are professors, three are
assistant professors, and one is an associate professor. As a department, their mean (median) h-index is
18.42 (17.00), the standard deviation is 16.28, and the minimum and maximum are 1.00 and 46.00.
h-index, g-index, citations, publications
In Table 26, the mean h-index, g-index, number of citations, and number of publications
are presented based on tenure/tenure track faculty. The table is arranged with the mean h-index
sorted from high to low. In Table 27, the results are given for the median.
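The indices reported in the tables that follow were taken from Publish or Perish output. As a point of reference only, the minimal Python sketch below illustrates how an h-index and a g-index can be computed from a single faculty member's per-paper citation counts; the citation counts in the example are hypothetical.

# Minimal sketch: computing the h-index and g-index from a list of per-paper
# citation counts. The counts below are hypothetical; the values reported in
# Tables 26-50 were obtained from Publish or Perish output.

def h_index(citations):
    """Largest h such that at least h papers each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the g most cited papers together have >= g^2 citations."""
    cites = sorted(citations, reverse=True)
    running_total, g = 0, 0
    for i, c in enumerate(cites, start=1):
        running_total += c
        if running_total >= i * i:
            g = i
    return g

example = [48, 33, 20, 12, 9, 7, 4, 2, 1, 0]   # hypothetical citation counts
print(h_index(example), g_index(example))       # prints: 6 10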
Table 26. Mean h-index, g-index, Number of Citations, and Publications for 23 Universities.
School    h-index    g-index    Citations    Publications
University California Berkeley    38.50    95.00    14545.50    113.50
Claremont Graduate University    30.78    63.44    8420.78    116.89
Boston College    28.20    61.60    6381.90    97.40
Wayne State University    21.67    42.67    2350.67    116.67
University of Connecticut    21.29    46.71    5863.86    52.00
Florida State University    19.60    45.40    5761.80    55.20
Western Michigan University    18.43    37.71    2715.29    56.00
University of Iowa    18.10    38.00    2270.10    40.80
Ohio State University    17.60    32.40    1297.40    141.00
University of Virginia    16.43    31.71    1688.29    33.29
George Mason University    15.21    30.07    2240.86    36.07
Brigham Young    14.50    31.29    1237.86    38.86
University of Boulder Colorado    14.40    29.20    2454.60    60.40
University of Kentucky    14.33    30.00    1177.00    41.33
University of Illinois Chicago    12.67    27.67    1091.67    99.33
Columbia    12.50    22.83    1228.17    28.17
University of Florida    11.14    22.00    1590.43    25.14
Washington State University    10.13    20.75    1065.00    27.13
Kent State    8.80    18.40    1087.40    18.40
University of Illinois Urbana    8.70    17.20    901.60    20.20
University of Washington    8.50    14.25    719.50    15.50
University North Carolina Greensboro    6.75    10.38    405.13    11.00
University of Tennessee    6.00    11.00    224.00    11.00
Table 27. Median h-index, g-index, Number of Citations, and Publications for 23 Universities.
School    h-index    g-index    Citations    Publications
University California Berkeley    38.50    95.00    14545.50    113.50
Claremont Graduate University    30.00    53.00    2924.00    70.00
Boston College    28.50    54.00    4772.50    56.50
University of Connecticut    21.00    45.00    5773.00    45.00
Wayne State University    21.00    36.00    1360.00    92.00
Ohio State University    19.00    36.00    1533.00    132.00
University of Iowa    18.00    35.00    1852.00    36.50
Florida State University    17.00    44.00    2019.00    50.00
Western Michigan University    17.00    39.00    1616.00    50.00
Brigham Young    16.00    35.00    1309.00    45.00
University of Virginia    14.00    31.00    1648.00    31.00
University of Illinois Chicago    13.50    33.50    1169.00    65.50
University of Kentucky    12.00    25.00    684.00    48.00
University of Boulder Colorado    11.00    23.00    572.00    27.00
Kent State    11.00    22.00    679.00    22.00
Columbia    10.00    13.50    320.00    19.00
Washington State University    8.50    19.50    428.00    20.00
George Mason University    8.00    15.00    599.50    22.50
University of Florida    7.00    11.00    143.00    13.00
University of Tennessee    6.50    13.50    262.50    13.50
University of Illinois Urbana    5.00    9.50    225.50    11.50
University North Carolina Greensboro    5.00    7.00    65.00    8.00
University of Washington    5.00    6.50    466.50    6.50
The mean rankings for the individual metrics are presented in Tables 28-31.
Table 28. Mean h-index for 23 Universities
School    h-index
University California Berkeley    38.50
Claremont Graduate University    30.78
Boston College    28.20
Wayne State University    21.67
University of Connecticut    21.29
Florida State University    19.60
University of Iowa    18.43
Western Michigan University    18.10
Ohio State University    17.60
University of Virginia    16.43
Brigham Young    15.21
George Mason University    14.40
University of Kentucky    14.33
University of Boulder Colorado    14.29
University of Illinois Chicago    12.67
Columbia    12.50
University of Florida    11.14
Washington State University    10.13
Kent State    8.80
University of Illinois Urbana    8.70
University of Washington    8.50
University of Tennessee    6.75
University North Carolina Greensboro    6.00
Table 29. Mean g-index for 23 Universities
School    g-index
University California Berkeley    95.00
Claremont Graduate University    63.44
Boston College    61.60
University of Connecticut    46.71
Florida State University    45.40
Wayne State University    40.33
University of Iowa    38.00
Western Michigan University    37.71
Ohio State University    32.40
University of Virginia    31.71
Brigham Young    31.29
George Mason University    30.07
University of Kentucky    30.00
University of Boulder Colorado    29.20
University of Illinois Chicago    27.67
Columbia    22.83
University of Florida    22.00
Washington State University    20.75
Kent State    18.40
University of Illinois Urbana    17.20
University of Washington    14.25
University of Tennessee    11.00
University North Carolina Greensboro    10.38
Table 30. Mean Number of Citations for 23 Universities
School    Citations
University California Berkeley    14545.50
Claremont Graduate University    8420.78
Boston College    6381.90
University of Connecticut    5863.86
Florida State University    5761.80
Western Michigan University    2715.29
University of Boulder Colorado    2454.60
Wayne State University    2350.67
University of Iowa    2270.10
George Mason University    2240.86
University of Virginia    1688.29
University of Florida    1590.43
Ohio State University    1297.40
Brigham Young    1237.86
Columbia    1228.17
University of Kentucky    1177.00
University of Illinois Chicago    1091.67
Kent State    1087.40
Washington State University    1065.00
University of Illinois Urbana    901.60
University of Washington    719.50
University North Carolina Greensboro    405.13
University of Tennessee    224.00
Table 31. Mean Number of Publications for 23 Universities
School    Publications
Ohio State University    141.00
Claremont Graduate University    116.89
Wayne State University    116.67
University California Berkeley    113.50
University of Illinois Chicago    99.33
Boston College    97.40
University of Boulder Colorado    60.40
Western Michigan University    56.00
Florida State University    55.20
University of Connecticut    52.00
University of Kentucky    41.33
University of Iowa    40.80
Brigham Young    38.86
George Mason University    36.07
University of Virginia    33.29
Columbia    28.17
Washington State University    27.13
University of Florida    25.14
University of Illinois Urbana    20.20
Kent State    18.40
University of Washington    15.50
University North Carolina Greensboro    11.00
University of Tennessee    11.00
The median rankings for the individual metrics are presented in Tables 32-35.
Table 32. Median h-index for 23 Universities
School    h-index
University California Berkeley    38.50
Claremont Graduate University    30.00
Boston College    28.50
Wayne State University    21.00
University of Connecticut    21.00
Ohio State University    19.00
University of Iowa    18.00
Florida State University    17.00
Western Michigan University    17.00
Brigham Young    16.00
University of Virginia    14.00
University of Illinois Chicago    13.50
University of Kentucky    12.00
University of Boulder Colorado    11.00
Kent State    11.00
Columbia    10.00
Washington State University    8.50
George Mason University    8.00
University of Florida    7.00
University of Tennessee    6.50
University of Illinois Urbana    5.00
University of Washington    5.00
University North Carolina Greensboro    5.00
Table 33. Median g-index for 23 Universities.
School    g-index
University California Berkeley    95.00
Boston College    54.00
Claremont Graduate University    53.00
University of Connecticut    45.00
Florida State University    44.00
Western Michigan University    39.00
Wayne State University    36.00
Ohio State University    36.00
University of Iowa    35.00
Brigham Young    35.00
University of Illinois Chicago    33.50
University of Virginia    31.00
University of Kentucky    25.00
University of Boulder Colorado    23.00
Kent State    22.00
Washington State University    19.50
George Mason University    15.00
Columbia    13.50
University of Tennessee    13.50
University of Florida    11.00
University of Illinois Urbana    9.50
University North Carolina Greensboro    7.00
University of Washington    6.50
Table 34. Median Number of Citations for 23 Universities
School    Citations
University California Berkeley    14545.50
University of Connecticut    5773.00
Boston College    4772.50
Claremont Graduate University    2924.00
Florida State University    2019.00
University of Iowa    1852.00
University of Virginia    1648.00
Western Michigan University    1616.00
Ohio State University    1533.00
Wayne State University    1360.00
Brigham Young    1309.00
University of Illinois Chicago    1169.00
University of Kentucky    684.00
Kent State    679.00
George Mason University    599.50
University of Boulder Colorado    572.00
University of Washington    466.50
Washington State University    428.00
Columbia    320.00
University of Tennessee    262.50
University of Illinois Urbana    225.50
University of Florida    143.00
University North Carolina Greensboro    65.00
Table 35. Median Number of Publications for 23 Universities.
School    Publications
Ohio State University    132.00
University California Berkeley    113.50
Wayne State University    92.00
Claremont Graduate University    70.00
University of Illinois Chicago    65.50
Boston College    56.50
Florida State University    50.00
Western Michigan University    50.00
University of Kentucky    48.00
University of Connecticut    45.00
Brigham Young    45.00
University of Iowa    36.50
University of Virginia    31.00
University of Boulder Colorado    27.00
George Mason University    22.50
Kent State    22.00
Washington State University    20.00
Columbia    19.00
University of Tennessee    13.50
University of Florida    13.00
University of Illinois Urbana    11.50
University North Carolina Greensboro    8.00
University of Washington    6.50
3. What are the standardized rankings that capture the hierarchical rank of the twenty-three
EER or equivalent departments/programs?
The following series of tables (Tables 36-39) contains the Blom-transformed z scores of the raw
scores from the tables in the previous sections.
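As a sketch of the transformation (rather than the software actually used), the routine below ranks the raw department means or medians, with ties receiving the average rank, and converts the ranks to Blom normal scores. For 23 departments this formula yields extreme scores of approximately plus and minus 1.9287, matching the endpoints in the tables that follow; the three input values shown are the largest mean h-index values from Table 26 and serve only as an illustration.

# Sketch of the Blom transformation used to standardize the department means
# and medians: each value is ranked (ties get the average rank) and converted
# to a normal score z = Phi^{-1}((rank - 3/8) / (n + 1/4)).
import numpy as np
from scipy.stats import norm, rankdata

def blom_scores(values):
    values = np.asarray(values, dtype=float)
    ranks = rankdata(values, method="average")          # rank 1 = smallest value
    return norm.ppf((ranks - 0.375) / (len(values) + 0.25))

# Illustration with the three largest mean h-index values from Table 26:
print(blom_scores([38.50, 30.78, 28.20]))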
Table 36. The Standardized Mean h-index for 23 Universities
School    h-index
University California Berkeley    1.9287
Claremont Graduate University    1.4766
Boston College    1.2112
Wayne State University    1.0114
University of Connecticut    0.8455
Florida State University    0.7001
Western Michigan University    0.5682
University of Iowa    0.4456
Ohio State University    0.3293
University of Virginia    0.2173
George Mason University    0.1080
University of Boulder Colorado    0.0000
University of Kentucky    -0.1080
Brigham Young    -0.2173
University of Illinois Chicago    -0.3293
Columbia    -0.4456
University of Florida    -0.5682
Washington State University    -0.7001
Kent State    -0.8455
University of Illinois Urbana    -1.0114
University of Washington    -1.2112
University North Carolina Greensboro    -1.4766
University of Tennessee    -1.9287
Table 37. The Standardized Mean g-index for 23 Universities
School    g-index
University California Berkeley    1.9287
Claremont Graduate University    1.4766
Boston College    1.2112
University of Connecticut    1.0114
Florida State University    0.8455
Wayne State University    0.7001
University of Iowa    0.5682
Western Michigan University    0.4456
Ohio State University    0.3293
University of Virginia    0.2173
Brigham Young    0.1080
George Mason University    0.0000
University of Kentucky    -0.1080
University of Boulder Colorado    -0.2173
University of Illinois Chicago    -0.3293
Columbia    -0.4456
University of Florida    -0.5682
Washington State University    -0.7001
Kent State    -0.8455
University of Illinois Urbana    -1.0114
University of Washington    -1.2112
University of Tennessee    -1.4766
University North Carolina Greensboro    -1.9287
Table 38. The Standardized Mean Citations for 23 Universities
School    Citations
University California Berkeley    1.9287
Claremont Graduate University    1.4766
Boston College    1.2112
University of Connecticut    1.0114
Florida State University    0.8455
Western Michigan University    0.7001
University of Boulder Colorado    0.5682
Wayne State University    0.4456
University of Iowa    0.3293
George Mason University    0.2173
University of Virginia    0.1080
University of Florida    0.0000
Ohio State University    -0.1080
Brigham Young    -0.2173
Columbia    -0.3293
University of Kentucky    -0.4456
University of Illinois Chicago    -0.5682
Kent State    -0.7001
Washington State University    -0.8455
University of Illinois Urbana    -1.0114
University of Washington    -1.2112
University North Carolina Greensboro    -1.4766
University of Tennessee    -1.9287
Table 39. The Standardized Mean Publications for 23 Universities
School    Publications
Ohio State University    1.9287
Wayne State University    1.4766
Claremont Graduate University    1.2112
University California Berkeley    1.0114
University of Illinois Chicago    0.8455
Boston College    0.7001
University of Boulder Colorado    0.5682
Western Michigan University    0.4456
Florida State University    0.3293
University of Connecticut    0.2173
University of Kentucky    0.1080
University of Iowa    0.0000
Brigham Young    -0.1080
George Mason University    -0.2173
University of Virginia    -0.3293
Columbia    -0.4456
Washington State University    -0.5682
University of Florida    -0.7001
University of Illinois Urbana    -0.8455
Kent State    -1.0114
University of Washington    -1.2112
University North Carolina Greensboro    -1.6607
University of Tennessee    -1.6607
The standardized medians for the 23 universities are presented in Tables 40-43.
Table 40. The Standardized Median h-index for 23 Universities
School    h-index
University California Berkeley    1.9287
Claremont Graduate University    1.4766
Boston College    1.2112
Wayne State University    0.9252
University of Connecticut    0.9252
Ohio State University    0.7001
University of Iowa    0.5682
Florida State University    0.3868
Western Michigan University    0.3868
Brigham Young    0.2173
University of Virginia    0.1080
University of Illinois Chicago    0.0000
University of Kentucky    -0.1080
University of Boulder Colorado    -0.2729
Kent State    -0.2729
Columbia    -0.4456
Washington State University    -0.5682
George Mason University    -0.7001
University of Florida    -0.8455
University of Tennessee    -1.0114
University of Illinois Urbana    -1.4766
University of Washington    -1.4766
University North Carolina Greensboro    -1.4766
Table 41. The Standardized Median g-index for 23 Universities
School    g-index
University California Berkeley    1.9287
Boston College    1.4766
Claremont Graduate University    1.2112
University of Connecticut    1.0114
Florida State University    0.8455
Western Michigan University    0.7001
Wayne State University    0.5059
Ohio State University    0.5059
University of Iowa    0.2729
Brigham Young    0.2729
University of Illinois Chicago    0.1080
University of Virginia    0.0000
University of Kentucky    -0.1080
University of Boulder Colorado    -0.2173
Kent State    -0.3293
Washington State University    -0.4456
George Mason University    -0.5682
Columbia    -0.7707
University of Tennessee    -0.7707
University of Florida    -1.0114
University of Illinois Urbana    -1.2112
University North Carolina Greensboro    -1.4766
University of Washington    -1.9287
Table 42. The Standardized Median Citations for 23 Universities
School    Citations
University California Berkeley    1.9287
University of Connecticut    1.4766
Boston College    1.2112
Claremont Graduate University    1.0114
Florida State University    0.8455
University of Iowa    0.7001
University of Virginia    0.5682
Western Michigan University    0.4456
Ohio State University    0.3293
Wayne State University    0.2173
Brigham Young    0.1080
University of Illinois Chicago    0.0000
University of Kentucky    -0.1080
Kent State    -0.2173
George Mason University    -0.3293
University of Boulder Colorado    -0.4456
University of Washington    -0.5682
Washington State University    -0.7001
Columbia    -0.8455
University of Tennessee    -1.0114
University of Illinois Urbana    -1.2112
University of Florida    -1.4766
University North Carolina Greensboro    -1.9287
Table 43. The Standardized Median Publications for 23 Universities
School    Publications
Ohio State University    1.9287
University California Berkeley    1.4766
Wayne State University    1.2112
Claremont Graduate University    1.0114
University of Illinois Chicago    0.8455
Boston College    0.7001
Florida State University    0.5059
Western Michigan University    0.5059
University of Kentucky    0.3293
University of Connecticut    0.1624
Brigham Young    0.1624
University of Iowa    0.0000
University of Virginia    -0.1080
University of Boulder Colorado    -0.2173
George Mason University    -0.3293
Kent State    -0.4456
Washington State University    -0.5682
Columbia    -0.7001
University of Tennessee    -0.8455
University of Florida    -1.0114
University of Illinois Urbana    -1.2112
University North Carolina Greensboro    -1.4766
University of Washington    -1.9287
The overall standardized ranking is computed by averaging the Blom z scores across all of
the metrics. The results are compiled in the following tables, both for the mean and the
median of the metrics (Tables 44 and 45).
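As a sketch of that computation, the routine below Blom-transforms each metric column and averages the resulting z scores for each school; the three input rows are the Berkeley, Wayne State, and Tennessee rows of Table 26 and are used only for illustration.

# Sketch of the overall standardized mean (Table 44): each metric column
# (h-index, g-index, citations, publications) is Blom-transformed and the
# resulting z scores are averaged across the columns for each school.
import numpy as np
from scipy.stats import norm, rankdata

def blom(col):
    ranks = rankdata(col, method="average")
    return norm.ppf((ranks - 0.375) / (len(col) + 0.25))

# Rows: Berkeley, Wayne State, Tennessee (mean values from Table 26).
metrics = np.array([[38.50, 95.00, 14545.50, 113.50],
                    [21.67, 42.67,  2350.67, 116.67],
                    [ 6.00, 11.00,   224.00,  11.00]])
z = np.column_stack([blom(metrics[:, j]) for j in range(metrics.shape[1])])
overall = z.mean(axis=1)                 # average z score across the metrics
print(np.round(overall, 4))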
Table 44. Overall Standardized Mean
School    Mean z score
University California Berkeley    1.6994
Claremont Graduate University    1.4766
Boston College    1.0834
Wayne State University    0.8427
University of Connecticut    0.7714
Florida State University    0.6801
Ohio State University    0.6198
Western Michigan University    0.5399
University of Iowa    0.3358
University of Boulder Colorado    0.2298
University of Virginia    0.0533
George Mason University    0.0270
University of Illinois Chicago    -0.0953
Brigham Young    -0.1087
University of Kentucky    -0.1384
Columbia    -0.4165
University of Florida    -0.4591
Washington State University    -0.7035
Kent State    -0.8506
University of Illinois Urbana    -0.9699
University of Washington    -1.2112
University North Carolina Greensboro    -1.6357
University of Tennessee    -1.7487
Table 45. Overall Standardized Median
School    Median z score
University California Berkeley    1.8157
Claremont Graduate University    1.1777
Boston College    1.1498
University of Connecticut    0.8939
Ohio State University    0.8660
Wayne State University    0.7149
Florida State University    0.6459
Western Michigan University    0.5096
University of Iowa    0.3853
University of Illinois Chicago    0.2384
Brigham Young    0.1902
University of Virginia    0.1421
University of Kentucky    0.0013
University of Boulder Colorado    -0.2883
Kent State    -0.3163
George Mason University    -0.4817
Washington State University    -0.5705
Columbia    -0.6905
University of Tennessee    -0.9097
University of Florida    -1.0862
University of Illinois Urbana    -1.2776
University of Washington    -1.4756
University North Carolina Greensboro    -1.5896
In order to compute the final standardized ranking, one additional variable must be
considered, which is the size of the tenure/tenure track faculty. Generally, in generic university
ranking systems, factors such as the number of library volumes, laboratories, and faculty are
considered desirable, and are ranked as such. In this case, however, the university faculty's metric
pertains to their mean (median) productivity in their program. Although it is generally assumed
that the more faculty there are, the better students are served, that notion depends on the
faculty-student ratio and on the ability of the faculty to provide high quality supervision to their
students. The EER program at WSU, for example, has graduated nearly 250 students since 1949,
typically with between 2.0 and 3.0 FTE. This rate far outpaces the other 22 universities, based on
the graduation numbers per program as indicated in Dissertation Abstracts International. On this
basis, it was determined that the caliber of scholarly achievements must be tempered by the number
of faculty in the department. Table 46 displays the school, faculty size, and the normal scores of the
23 universities.
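A short sketch of that adjustment follows: the reciprocal of each program's tenure/tenure track head count is ranked and converted to Blom normal scores, so that, with faculty metrics held constant, a smaller faculty receives the larger score. The five head counts shown are the first five rows of Table 46 and are used only for illustration.

# Sketch of the faculty-size adjustment in Table 46: the inverse of the
# tenure/tenure track head count is ranked (ties share the average rank)
# and converted to Blom normal scores.
import numpy as np
from scipy.stats import norm, rankdata

faculty_size = np.array([2.0, 9.0, 10.0, 3.0, 7.0])   # first five rows of Table 46
inverse = 1.0 / faculty_size
ranks = rankdata(inverse, method="average")
normal_scores = norm.ppf((ranks - 0.375) / (len(inverse) + 0.25))
print(np.round(inverse, 4), np.round(normal_scores, 4))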
Table 46. Raw, Inverse, and Normal Scores of Faculty Size.
School    Faculty size    Inverse of faculty size    Normal scores
University California Berkeley    2.00    0.5000    1.9287
Claremont Graduate University    9.00    0.1111    -0.8455
Boston College    10.00    0.1000    -1.2112
Wayne State University    3.00    0.3333    1.3322
University of Connecticut    7.00    0.1429    -0.2173
Florida State University    5.00    0.2000    0.5059
Western Michigan University    7.00    0.1429    -0.2173
University of Iowa    10.00    0.1000    -1.2112
Ohio State University    5.00    0.2000    0.5059
University of Virginia    7.00    0.1429    -0.2173
George Mason University    14.00    0.0714    -1.9287
University of Boulder Colorado    5.00    0.2000    0.5059
University of Kentucky    3.00    0.3333    1.3322
Brigham Young    7.00    0.1429    -0.2173
University of Illinois Chicago    6.00    0.1667    0.1624
Columbia    6.00    0.1667    0.1624
University of Florida    7.00    0.1429    -0.2173
Washington State University    8.00    0.1250    -0.6328
Kent State    5.00    0.2000    0.5059
University of Illinois Urbana    10.00    0.1000    -1.2112
University of Washington    4.00    0.2500    0.9252
University North Carolina Greensboro    8.00    0.1250    -0.6328
University of Tennessee    4.00    0.2500    0.9252
Therefore, the inverse of the faculty size is included to create the final standardized
rankings (Tables 47 and 48), based on the mean in Table 44 and the median in Table 45.
Table 47. Final Standardized Ranking Based on the Mean
School    Mean    Rank
University California Berkeley    1.2743    1
Wayne State University    0.7810    2
Ohio State University    0.4893    3
Claremont Graduate University    0.4365    4
Boston College    0.4213    5
University of Connecticut    0.4150    6
Florida State University    0.3511    7
Western Michigan University    0.2770    8
University of Kentucky    0.2441    9
University of Illinois Chicago    0.1190    10
Brigham Young    0.1046    11
University of Virginia    0.0592    12
University of Iowa    0.0447    13
University of Boulder Colorado    -0.0952    14
Kent State    -0.0988    15
Columbia    -0.1630    16
University of Tennessee    -0.1723    17
Washington State University    -0.2261    18
George Mason University    -0.5649    19
University of Florida    -0.6748    20
University of Washington    -0.7143    21
University of Illinois Urbana    -0.8013    22
University North Carolina Greensboro    -1.4800    23
Table 48. Final Standardized Ranking Based on the Median.
School    Median    Rank
University California Berkeley    1.8722    1
Wayne State University    1.1131    2
Ohio State University    0.6668    3
Claremont Graduate University    0.6104    4
Boston College    0.4456    5
University of Connecticut    0.3243    6
Florida State University    0.3240    7
Western Michigan University    0.2143    8
University of Kentucky    0.1661    9
University of Illinois Chicago    0.0105    10
Brigham Young    -0.0307    11
University of Virginia    -0.0769    12
University of Iowa    -0.1596    13
University of Boulder Colorado    -0.1762    14
Kent State    -0.2019    15
Columbia    -0.2528    16
University of Tennessee    -0.3322    17
Washington State University    -0.3508    18
George Mason University    -0.3939    19
University of Florida    -0.6616    20
University of Washington    -1.8693    21
University of Illinois Urbana    -1.0542    22
University North Carolina Greensboro    -1.1487    23
4. What is the best estimate of a in h', where h' is an estimate of the h-index, for EER or
equivalent departments/programs at the twenty-three universities?
On the individual faculty level (i. e., ignoring university affiliation), the Pearson correlation
between the h-index and the number of citations is 0.85 (p = .001, n = 152). The mean h-index and
number of citations are 16.09 and 2740.13, respectively.
Setting a to 1.0, as was done in Sawilowsky (2012), yielded an h' = 26.17, which is a
large overcount of the expected 16.09. Hirsch (2005) recommended setting the constant "a" from
3 to 5. Setting a to higher values, however, will continue to inflate h' over the actual h-index.
Reducing "a" to .61 produces an h' = 15.96. On the basis of the EER type faculty at these 23
universities, therefore, the best estimate of "a" is .61. As discussed by Hirsch (2005) and
Sawilowsky (2012), "a" is discipline specific.
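The exact functional form of h' is not reproduced above. The sketch below assumes, for illustration only, the form h' = a × sqrt(citations) / 2, which reproduces the values just reported to within rounding (a = 1.0 gives 26.17 and a = .61 gives roughly 15.97 for a mean citation count of 2740.13), and shows how "a" can be chosen so that the mean h' matches the observed mean h-index.

# Hedged sketch: choosing "a" so that the mean h' matches the observed mean
# h-index. The functional form h' = a * sqrt(citations) / 2 is an assumption
# inferred from the values reported in the text, not a quotation of
# Sawilowsky (2012).
import numpy as np

mean_h = 16.09               # observed mean h-index (n = 152 faculty)
mean_citations = 2740.13     # observed mean number of citations

def h_prime(a, citations=mean_citations):
    return a * np.sqrt(citations) / 2.0

print(round(h_prime(1.0), 2))          # 26.17, the overcount reported for a = 1.0
a_hat = mean_h / h_prime(1.0)          # a that equates mean h' with mean h
print(round(a_hat, 2), round(h_prime(round(a_hat, 2)), 2))   # roughly 0.61 and 15.97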
The Pearson correlation is the standardized slope of the simple linear regression between the
h-index and the number of citations. A variety of other curve estimations were considered. The only
form that produced a greater R² was the cubic curve (R² = .886, F = 385.690, df1, df2 = 3, 148,
p = 0.001). The cubic curve fit is depicted in Figure 1 below.
Figure 1. Cubic curve fit of the h-index to number of citations.
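A cubic fit of this kind can be sketched as follows; the six (citations, h-index) pairs below are hypothetical stand-ins for the 152 individual faculty records, and the R² is computed from the residual and total sums of squares.

# Sketch of a cubic curve fit of the h-index on the number of citations,
# with R^2 computed from the residual and total sums of squares. The data
# points are hypothetical stand-ins for the 152 faculty records.
import numpy as np

citations = np.array([224.0, 720.0, 1090.0, 2350.0, 5760.0, 14545.0])
h_index = np.array([6.0, 8.5, 12.7, 21.7, 19.6, 38.5])

coeffs = np.polyfit(citations, h_index, deg=3)    # cubic polynomial coefficients
fitted = np.polyval(coeffs, citations)
ss_res = np.sum((h_index - fitted) ** 2)
ss_tot = np.sum((h_index - h_index.mean()) ** 2)
print(coeffs, 1.0 - ss_res / ss_tot)              # coefficients and R^2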
CHAPTER 5
Discussion
The aim of this dissertation was to compare the curriculum and faculty productivity of the
Educational Evaluation and Research type programs at 23 universities identified by the WSU
College of Education administration as being competitors to the EER program. In addition, the
opportunity was made available to study h', a quick and dirty approximation to the h-index, specific
to this discipline. The four research questions and conclusions are presented below.
1. How does the curriculum of the EER doctoral program at Wayne State University
compare with twenty-two comparable universities?
The first qualification to establish is that the primary curricular maps obtained pertained to the
quantitative methods offerings at the competing universities. At WSU, for example, this is just one
of four thematic subdisciplines, because the program gives considerable weight to (1) research and
experimental design, (2) psychometrics, (3) quantitative methods (i. e., applied statistics), and (4)
quantitative and qualitative program evaluation.
As noted in Chapter 4, the applied statistics curricular maps are strikingly similar for all
competing universities. The general course map, irrespective of number of credit hours, was as
follows: (1) descriptive statistics, including graphic indices (e. g., histograms, frequency
distributions), measures of central tendency and variability, bivariate analysis, and an introduction
to inferential methods such as the z or t-test; (2) univariate inferential statistics (e. g., one-way
analysis of variance, factorial analysis of variance, and analysis of covariance) from either the sums
of squares or the general linear model perspective; (3) multivariate inferential statistics; and (4)
structural equation modeling. Then, a fifth course was generally offered reflecting the expertise of
the faculty or the emphasis of the program. For example, at WSU, the additional courses were
nonparametric, permutation, exact, and robust statistics, and Monte Carlo methods. Other
university offerings substituted advanced sampling, hierarchical linear models, and Bayesian
methods.
In programs that have multiple disciplinary threads, such as WSU, topics such as advanced
sampling and hierarchical linear models are covered in the form of several lectures. In other
programs that have a greater concentration on applied statistics, these courses are given a full
semester treatment. Twelve of the 23 (52.2%) universities offered a full course treatment in
hierarchical linear modeling, and two (8.7%) offered a full course treatment in sampling. Wayne
State University was the only university to offer a full course treatment in nonparametric,
permutation, exact, and robust statistics, and Monte Carlo methods.
2. What are the h-index, g-index, number of citations, and number of papers produced
by the faculty in the EER or equivalent departments/programs at the twenty-three
universities?
In Table 49 below, the mean h-index and other faculty performance factors are presented.
In terms of the h-index, WSU is ranked fourth among the 23 universities.
Table 49. Mean h-index, g-index, Number of Citations, and Publications for 23 Universities.
School    Rank    h-index    g-index    Citations    Publications    Faculty size
University California Berkeley    1    38.50    95.00    14545.50    113.50    2.00
Claremont Graduate University    2    30.78    63.44    8420.78    116.89    9.00
Boston College    3    28.20    61.60    6381.90    97.40    10.00
Wayne State University    4    21.67    42.67    2438.67    125.33    3.00
University of Connecticut    5    21.29    46.71    5863.86    52.00    7.00
Florida State University    6    19.60    45.40    5761.80    55.20    5.00
Western Michigan University    7    18.43    37.71    2715.29    56.00    7.00
University of Iowa    8    18.10    38.00    2270.10    40.80    10.00
Ohio State University    9    17.60    32.40    1297.40    141.00    5.00
University of Virginia    10    16.43    31.71    1688.29    33.29    7.00
George Mason University    11    15.21    30.07    2240.86    36.07    14.00
University of Boulder Colorado    12    14.40    29.20    2454.60    60.40    5.00
University of Kentucky    13    14.33    30.00    1177.00    41.33    3.00
Brigham Young    14    14.29    31.29    1237.86    38.86    7.00
University of Illinois Chicago    15    12.67    27.67    1091.67    99.33    6.00
Columbia    16    12.50    22.83    1228.17    28.17    6.00
University of Florida    17    11.14    22.00    1590.43    25.14    7.00
Washington State University    18    10.13    20.75    1065.00    27.13    8.00
Kent State    19    8.80    18.40    1087.40    18.40    5.00
University of Illinois Urbana    20    8.70    17.20    901.60    20.20    10.00
University of Washington    21    8.50    14.25    719.50    15.50    4.00
University North Carolina Greensboro    22    6.75    10.38    405.13    11.00    8.00
University of Tennessee    23    6.00    11.00    224.00    11.00    4.00
3. What are the standardized rankings that capture the hierarchical rank of the twenty-
three EER or equivalent departments/programs?
The standardized mean Blom z scores, based on all five faculty performance measures (i.
e., h-index, g-index, number of citations, number of publications, and tenure/tenure track faculty
size), are compiled in Table 50 below. Wayne State University's EER program is ranked 2nd among
the 23 universities.
Table 50. Mean Standardized Ranking of h-index, g-index, Number of Citations, Number of
Publications, and Tenure/Tenure Track Faculty.
School    Mean    Rank
University California Berkeley    1.2743    1
Wayne State University    0.7810    2
Ohio State University    0.4893    3
Claremont Graduate University    0.4365    4
Boston College    0.4213    5
University of Connecticut    0.4150    6
Florida State University    0.3511    7
Western Michigan University    0.2770    8
University of Kentucky    0.2441    9
University of Illinois Chicago    0.1190    10
Brigham Young    0.1046    11
University of Virginia    0.0592    12
University of Iowa    0.0447    13
University of Boulder Colorado    -0.0952    14
Kent State    -0.0988    15
Columbia    -0.1630    16
University of Tennessee    -0.1723    17
Washington State University    -0.2261    18
George Mason University    -0.5649    19
University of Florida    -0.6748    20
University of Washington    -0.7143    21
University of Illinois Urbana    -0.8013    22
University North Carolina Greensboro    -1.4800    23
4. What is the best estimate of a in h', where h' is an estimate of the h-index, for EER or
equivalent departments/programs at the twenty-three universities?
The best estimate of "a" was .61, which produced an h' = 15.96, as compared with the h-
index of 16.09. This value of "a" is specific to the EER discipline and to the faculty at the 23
competing universities. A cubic curve presented the best fit, with R² = .886. The value of "a"
appears to be discipline specific (e.g., Hirsch, 2005; Sawilowsky, 2012; this dissertation). Despite
the potential time savings and efficiency of h', it should be abandoned in favor of the more accurate
h-index.
Limitations
There are many caveats to this study. Sawilowsky (2012) noted many caveats with the h-
index and the g-index, as outlined in Chapter 3. In addition, a pilot was conducted where it was
attempted to obtain accurate h-indices for faculty members within the College of Education. There
was considerable recalcitrance, both on the part of the college admininistration of the university
union, by having faculty compute their h-indices, even via simple aids such as Publish or Perish
(PoP) software, or even to provide a draft PoP in an Excel format for the faculty to check.
Therefore, it was decided not to approach faculty at any of the 23 universities to verify the h-
indices. Instead, they were obtained via PoP, and entries that were out of date, failed to have an
exact author name match, or were outside expected EER content, were culled. If the faculty
80
member posted their c.v. to the official college webpage, then it was used to assist in verifying the
PoP results. Unfortunately, many c.v.’s were missing or out of date.
The curricular maps were also based on information publicly available on the colleges'
webpages. However, many webpages lacked specificity, and therefore a request for the curricular
map was sent to the program coordinator of each program. Unfortunately, there was very
little response to the request for curricular maps.
The estimation of “a” in h', at .61, is quite suspect. Approximately 10% of the faculty had
an enormous number of citations, thus driving down the “a” value. A close inspection indicated
these faculty had best-selling textbooks, or similar publications, that had very high numbers of
citations, as compared with typical journal articles.
Conclusion
An evaluation of an academic program is a complex undertaking. There are many formal
program evaluation methods, none of which were invoked in this dissertation because it was
beyond the scope of the study. Instead, the focus of this dissertation was faculty productivity
relative to the faculty size, which is just one element of program effectiveness.
For example, at WSU, the EER program was subjected to a number of sophisticated
program evaluation studies, such as Irwin (1960) who considered admissions criteria; Ozkan
(2008) who focused on EER’s curriculum development; White (2015) who canvassed current and
graduated students' opinion on reaching goals and objectives, preparedness for employment, and
impression of faculty effectiveness; and Carrol (2019) who focused on the role of statistical
software at an R1 institution.
The results of this study will be useful in recruitment of students to EER type doctoral
programs and helpful to faculty and administrators in comparing their program to competitors.
APPENDIX
The appendix contains historical information on the development of the WSU COE,
as well as other supporting tabular material.
The list below is a chronological presentation of faculty in the EER program who currently, or
previously, chaired an EER doctoral (Ed. D. or Ph. D.) dissertation. The rank given is either the
current or final rank.
1. Prof. Wilhelm "William" Reitz (Ph. D. in Education, University of Wisconsin -
Madison, 1930, "The intelligence of teachers: A study of scholastic aptitude in relation to
teaching success"). He was the inaugural member of the EER faculty and served from 1937 -
1973. His first mention in assisting with a doctoral dissertation was in 1946 (p. ii), for the Ed. D.
in Industrial Education of Harold G. Silvius (who had joined the WSU COE faculty in 1941),
conducted at The Pennsylvania State University. He chaired EER dissertations from 1949 - 1973.
2. Prof. Charles L. Boye (Ph. D. in Botany, The Ohio State University, 1942, “A
genetic study of coleus”). He was a member of the EER faculty and chaired EER
dissertations from 1950 - 1952. He was subsequently given primary appointments in other
program areas.
3. Prof. Joseph W. Menge (Ph. D. in Education, University of Michigan, 1949, “An
experimental study of sampling procedures for the determination of achievement test
norms in a city school system”). He was a member of the EER faculty and chaired EER
dissertations in 1950. His primary appointment was subsequently changed to other program
areas, and he served as a COE Assistant Dean.
4. Prof. Claire C. Irwin (Ed. D. in Evaluation and Research, Wayne State
University, 1960, "The doctor of education program at Wayne State University: An
appraisal of institutional aims, recipient satisfaction, faculty evaluation, and dissertation
quality"). She was a student of Prof. Reitz. Her appointment start date in EER has not been
located, and she retired in 1988. In 1952, she served as Professor and in 1961 as Chair of
the Division of Arts and Sciences at Mercy College, Detroit. She began assisting on WSU
dissertations as a methodology resource as early as 1957 (an Ed. D. dissertation in Teacher
Education by Winnifred L. Fenton, 1957, p. 72, footnote 44, indicated Irwin was an
Advanced Graduate Fellow), and served as an Ed. D. committee member in 1967 (Verna
L. S. Hart, Special Education). She chaired EER dissertations from 1974 - 1988.
5. Prof. Donald R. Marcotte (Ph. D. in Education and Psychology, University of
Connecticut, 1969, “A computerized contingency analysis of graded essays”). He served
as a member of the EER faculty from 1969 - 2007 and EER Program Coordinator from at
least 1987 - 1996. His first mention of assisting on a dissertation was on the 1970 Ph. D.
in Guidance and Counseling by Roy F. H. Giroux (p. iii). He first served as an EER
dissertation committee member on the 1971 Ph. D. in English Education by Allan E.
Dittmer. He chaired EER dissertations from 1975 - 2007.
6. Prof. Barry S. Markman (Ph. D. in Experimental Psychology, Emory University,
1969, “Reinforcement density in multistimulus conditioned suppression”). Since 1972, he
has served as .5 FTE in EDP and .5 FTE in EER. He was Chair of EDP from 1985 - 2010,
and serves as the Director of the Learning and Instructional Sciences Program (EDP), and
has served as the Program Coordinator of EER from 2013 to the present. His primary doctoral
supervisory duties are in EDP, and only the EER dissertations he chaired are listed below.
7. Assistant Prof. Alan Klaas (Ph. D. in Statistics and Research Design, Southern
Illinois Carbondale, 1975, "Examination of homoscedasticity, rectilinearity, and
regression toward the mean in repeated administrations of a school aptitude test"). He
chaired EER dissertations in 1978, but in an email exchange on 10/17/2018 with Prof.
Sawilowsky, he was unable to provide further details (Sawilowsky, personal
communication).
8. Associate Prof. Maureen Shih-ping Sie (a/k/a, Maureen A. Sie; Ph. D. in
Guidance and Counseling Education, Iowa State University, 1969, “Pupil achievement in
an experimental nongraded elementary school”). She received tenure in 1977 and chaired
EER dissertations in 1979.
9. Prof. Martin J. Hogan (Ph. D. in Educational Evaluation and Research, Wayne
State University, 1970, “The effect of anxiety, stress, task difficulty and stage of learning
on performance in paired-associated learning”). He was a student of Prof. Reitz. He was a
member of the faculty of the Division of Educational Services in the WSU School of
Medicine. He is first mentioned as assisting with an EER dissertation on Jorge A. Herrera's
1979 Ph. D. in EDP. He chaired an EER dissertation in 1984 and in 1988.
10. Dr. Joseph L. Posch, Jr. (Ph. D. in Educational Evaluation and Research, 1976,
“Application of selected multivariate techniques to categorical and ordinal data: A specific
problem with carpal tunnel syndrome data”). He was a student of Prof. Marcotte. He was
first mentioned as providing assistance on a dissertation by Prof. Margaret M. J. Sosnowski
(Program Coordinator, Educational Leadership and Policy Studies, COE, WSU) in 1976,
and served as committee member on various dissertations until 1986. He chaired two EER
dissertations in 1982.
11. Prof. Shlomo Sawilowsky (Ph. D. in Measurement, Evaluation, and Research,
University of South Florida, 1985, "Robust and power analysis of the 2×2×2 ANOVA,
Rank Transformation, Random Normal Scores, and Expected Normal Scores
Transformation Tests"). He joined the EER faculty in 1987, served as EER Program
Coordinator from 1996 - 2009, and as Assistant Dean of the Divisions of Theoretical and
Behavioral Foundations, and Administration and Organizational Studies, from 2008 -
2010. He served on EER dissertations as a committee member beginning with Janice Y.
Stafford (Ph. D. in Instructional Technology) in 1990, as well as EER Co-Chair/2nd
Advisor, and EER Cognate Advisor. He has chaired EER dissertations since 1993.
12. Associate Prof. Karen Tonso (Ph. D. in Foundations of Education, University
of Colorado Boulder, 1997, “Constructing engineers through practice: Gendered features
of learning and identity development”). She was .67 FTE in Educational History and
Philosophy, and .33 FTE (unofficially) in EER. She served on the faculty from 2006 - 2014.
She served as a committee member on one EER dissertation and chaired nine EER
dissertations from 2002 - 2016, the last completed after her retirement.
13. Assistant Prof. Jasmine B. Ulmer (Ph. D. in Educational Leadership, University
of Florida, 2015, “Teacher leadership in the digital age: A critical policy analysis”). She
has served on the EER faculty since 2015. She is currently a committee member on several
doctoral dissertations.
COE Faculty Who Taught/Teach In EER and Serve(d) on EER Dissertation
Committees
The list below is a chronological presentation of College of Education faculty who
have or had teaching appointments in EER and served on EER dissertation committees,
beyond those listed as serving as chair in the section above.
1. Professor Weimo Zhu (Ph. D in Physical Education specializing in Measurement
and Evaluation, University of Wisconsin - Madison, 1990, "Appropriateness of the Rasch
Poisson model for psychomotor test scores"). He was an Associate Professor at Wayne
State University from 1990 - 1999. His primary appointment was in Kinesiology and
Health Sports Studies. He taught Item Response Theory as an EER Adjunct faculty
member. He served on one EER dissertation as a committee member.
2. Assistant Professor E. Whitney G. Moore. (Ph. D. in Health & Psychology of
Physical Activity, with a minor in Quantitative Psychology, University of Kansas, 2013,
“Examining the longitudinal effects of the PE class’ climate on students’ goal orientations
and intrinsic motivation to be physically active”). She joined the faculty of Kinesiology,
Health and Sports Studies in 2013. She teaches Multivariate Statistics, and Structural
Equation Models as an EER Adjunct faculty member. She is currently serving on one EER
dissertation committee.
Former EER Faculty, Including Those Who Served On EER Dissertation
Committees (Tenure/Track, Clinical Assistant Professor)
The list below is a chronological presentation of EER faculty who served on EER
committees, but who either did not serve as chair or did not chair a dissertation that came to
fruition.
1. Associate Professor Lori F. Rothenberg. (Ph. D. in Educational Psychology with
a major in Quantitative Analysis, The University of North Carolina - Chapel Hill, 1991,
"A study of the criterion validity of the North Carolina Teacher Performance Appraisal
instrument.") She is first mentioned as providing assistance on a Curriculum and
Instruction dissertation in 1998 (Jacqueline D. Cassell, p. iii). She was an Assistant
Professor from 1996 - 1999, and served as an EER committee member on three EER
dissertations, including Dr. Gail F. Fahoome's.
2. Assistant Professor (Clinical) Gail F. Fahoome. (Ph. D. in Educational
Evaluation and Research, Wayne State University, 1999, "A Monte Carlo study of 21
nonparametric statistics with normal and nonnormal data.") She was a student of Prof.
Sawilowsky. She served on the EER faculty from 2000 - 2013 and as EER Program
Coordinator from 2009 - 2013. In addition to serving as committee member, she also
served as second advisor on 17 EER dissertations.
3. Associate Professor Ben Kelcey (Ph. D. in Quantitative Methodologies and
Statistics in Education, University of Michigan School of Education and the Department
of Statistics, 2009, "The role of covariate relationships in (1) optimizing the treatment
effect estimator in multilevel models using the propensity score and (2) assessing the
robustness of causal inference with applications to kindergarten retention and post-
secondary enrolment"). He was an Assistant Professor from 2009 - 2013 and served as a
committee member on several dissertations.
4. Assistant Professor Ryoungon Park (Ph. D. in Quantitative Methods in
Educational Psychology, University of Texas Austin, 2015, “Investigating the impact of
a mixed-format item pool on optimal test designs for multistage testing”). He was an
Assistant Professor from 2015 - 2018, but did not serve on an EER doctoral dissertation.
EER Adjunct Instructors Who Serve(d) On EER Dissertation Committees
The following is a chronological presentation of EER adjunct faculty who serve or
have served on an EER doctoral committee, or as a second advisor on an EER doctoral
committee. Although many have served in a faculty position, only those with an academic
appointment (outside of WSU) at the time of this writing are indicated.
1. Dr. Frank Castronova. He was a student of Prof. Marcotte.
2. Dr. Donna Coulter. She was a student of Assoc. Prof. Karen Tonso.
3. Assistant Professor (Kent State University) John Cuzzocrea. He was a student of
Prof. Sawilowsky.
4. Professor (Southern Illinois Carbondale) Todd Headrick. He served as second
advisor on three EER dissertations. He was a student of Prof. Sawilowsky.
5. Dr. Irwin Jopps. He was a student of Prof. Marcotte.
6. Dr. Michael Lance. He was a student of Prof. Sawilowsky.
7. Dr. (University of Windsor) Saverpierre Maggio. He was a student of Prof.
Sawilowsky.
8. Dr. Mary Montie. She was a student of Assoc. Prof. Tonso.
9. Dr. Elizabeth (Moen) McQuillen. In addition to serving as a committee member,
she served as second advisor on one EER dissertation. She was a student of Prof. Sawilowsky.
10. Assistant Professor (University of Phoenix) Michael Nanna (named in a
dissertation for providing assistance). He was a student of Prof. Sawilowsky.
11. Dr. Sarah (Rose) Raphan. She serves as a committee member, and served as
second advisor on one EER dissertation. She was a student of Prof. Markman.
12. Dr. Jack Sawilowsky. He served as second advisor on two EER dissertations.
He was a student of Prof. Markman.
13. Dr. Julie Smith. She served as second advisor on four EER dissertations. She
was a student of Prof. Sawilowsky.
14. Dr. Michele Fatal-Weber. She served as second advisor on one EER
dissertation. She was a student of Prof. Sawilowsky.
EER Graduates Who Have Served On EER Dissertation Committees
In this chronological presentation, EER doctoral graduates, who never taught as an
EER adjunct instructor, but served as committee member or second advisor, are listed.
1. Associate Dean (WSU Medical School, and later Director of Medical Evaluation
and Assessment, University of Michigan Medical School) Patrick D. Bridge. In addition to
serving as a committee member, he served as second advisor on four EER dissertations.
He was a student of Prof. Sawilowsky.
2. Dr. Margaret Posch. She served as second advisor on two EER dissertations. She
was a student of Prof. Sawilowsky.
3. Dr. Boris Shulkin. In addition to serving as a Committee Member, he served as
second advisor on two EER dissertations. He was a student of Prof. Sawilowsky.
EER Dissertation Committee Membership
An analysis was conducted of the EER doctoral committee membership from 1988 to the
present. Typically, Ph. D. committees consist of four members, although for a few dissertations
there were five members. Ed. D. committees usually consist of three members, although for a few
dissertations there were four members.
From 1988 - 2018, 50.4% of the doctoral committee members were EER or EER Adjunct
faculty, and 49.5% were from outside of EER. The make-up of the committees is shown in Table
51:
Table 51. Breakdown of Affiliation by Committee Member Position
Position    EER    Non-EER
Chair    100%    0%
2nd Member    60.20%    39.80%
3rd Member    22.60%    77.40%
4th Member    1.90%    98.10%
5th Member    0%    100%
Non-EER committee members represented 27 different affiliations at WSU and outside
WSU. The rank ordering in terms of the frequency of their affiliations from 1988 to the present was:
1. Division of Administration and Organizational Studies, COE, WSU
2. Division of Teacher Education, COE, WSU
3. School of Medicine
4. Department of Mathematics, College of Liberal Arts and Sciences, WSU
5. Division of Theoretical and Behavioral Foundations (excluding EER), COE,
WSU
6. Mike Ilitch School of Business, WSU
7. College of Nursing, WSU
8. Businesses, Government Agencies, Hospitals, Industry, Universities outside of
WSU
9. Other Colleges within WSU
10. Administrators at WSU
EER Doctoral Dissertations
A few comments and caveats are in order regarding the table (Table 51) of EER
dissertations below. As noted in the Wayne University Commencement of June 16, 1949,
Dr. Jacobs’ dissertation was titled, Measurement and Guidance in the Field of Public
Accounting.” The dissertation is not found on Proquest, and the chair of the committee
90
cannot be determined. It is presumed to have been Prof. Retiz, the inaugural member of
EER. Prof. Irwin was a student of Prof. Reitz, but in her (1960) dissertation on the history
of the Ed. D. in the COE she did not indicate Dr. Jacob’s as a major advisor.
Prof. Boye wrote "approved by" and signed the cover page for the next two EER
Ed. D.'s, which were awarded in 1950 to Drs. Elmer W. McDaid and Arnold R. Meier.
However, the specialization for those two degrees was not indicated, nor were the
committee members listed. Prof. Boye's subsequent dissertations in which he served as
chair indicated specializations apart from EER. However, Prof. Irwin (1960) indicated
there were 19 Ed. D.'s in EER conferred prior to hers, although there are 23 listed in the
table below (Table 52). The number would almost be reconciled if Profs. Boye and Menge's five
students, listed in the table, were either (a) overlooked by Prof. Irwin (perhaps
because the Department of Education program area for the degree was not listed on the
signature page), or (b) only listed under Profs. Boye and Menge because those professors were
named as EER faculty in order to support the petition for the EER Ph. D. authorization,
were not actually EER faculty, and conferred their five Ed. D.'s on candidates from other
major program areas in Education.
Profs. Reitz's and Marcotte's primary appointments were, and Prof. Sawilowsky's
primary appointment is, in EER, and, therefore, the several dissertations they chaired for
other program areas/Divisions are listed. Profs. Boye's and Menge's appointments clearly
changed, so only those dissertations with no indication of program area, or indicating a
specialization in EER, are listed. Prof. Markman has a .5 FTE appointment with EER and with
Educational Psychology, so only his EER dissertations are listed. Assoc. Prof. Karen Tonso's
appointment was unofficially .33 FTE in EER, so only her EER dissertations are listed.
Dr. Robert Jacobs's (Ed. D., 1949) dissertation is not available either at Proquest or
the EER Dissertation Repository. For unknown reasons, the dissertations for Drs. Jack Hill
(Ph. D., 2005), Boris Shulkin (Ph. D., 2005), and Sara Rose Raphan (Ph. D., 2016) do not
appear in Proquest. As expected, dissertations recently completed have yet to be posted in
Proquest. The .pdf documents for these dissertations held in the EER Dissertation
Repository are the advisors' copies at the time of the final dissertation oral defense.
The date of the signature page of the dissertation is noted in the table. In some cases,
e.g., dissertations for which the final defense was held in late December, the copyright was
obtained the following year. The student’s name, degree, and advisor are also indicated.
Additional graphics provide further analyses of the data on EER doctoral dissertations.
Table 52. EER Doctoral Dissertations, 1949 - 2019
Date    Student    Degree    Advisor
1949    Robert Jacobs    Ed. D.    1st Doctorate at WSU (Reitz)
1950    Elmer W. McDaid    Ed. D.    Charles L. Boye
1950    Arnold R. Meier    Ed. D.    Charles L. Boye
1951    Alice M. Davis    Ed. D.    Wilhelm Reitz
1952    James R. Irwin    Ed. D.    Wilhelm Reitz
1952    Gertrude Mauk    Ed. D.    Wilhelm Reitz
1952    Clarence W. Wachner    Ed. D.    Charles L. Boye
1953    Kristen D. Juul    Ed. D.    Wilhelm Reitz
1953    Duncan A. S. Pirie    Ed. D.    Wilhelm Reitz
1953    Lawrence H. J. Valade    Ed. D.    Wilhelm Reitz
1954    Wilhelmine L. Haley    Ed. D.    Wilhelm Reitz
1955    Allen L. Bernstein    Ed. D.    Wilhelm Reitz
1955    Charles C. Yarbrough    Ed. D.    Wilhelm Reitz
1956    Sophie V. Cheskie    Ed. D.    Wilhelm Reitz
1956    Herman T. Lenser    Ed. D.    Wilhelm Reitz
1957    Benjamin E. Hatcher    Ed. D.    J. W. Menge
1957    Leffie L. Harris    Ed. D.    Wilhelm Reitz
1957    David A. Hilton    Ed. D.    Wilhelm Reitz
1957    Joseph E. Hill    Ed. D.    J. W. Menge
1958    L. Verdelle Clark    Ed. D.    Wilhelm Reitz
1958    Earl D. Sumner    Ed. D.    Wilhelm Reitz
1959    Marie S. Chmara    Ed. D.    Wilhelm Reitz
1959    David C. Magaw    Ed. D.    Wilhelm Reitz
1960    Claire C. Irwin    Ed. D.    Wilhelm Reitz
1961    Walter T. Pace    Ed. D.    Wilhelm Reitz
1962    George Farrah    Ed. D.    Wilhelm Reitz
1962    Clotildus M. Moran    Ed. D.    Wilhelm Reitz
1963    Lewis B. Larkin    Ed. D.    Wilhelm Reitz
1963    Ruth H. Sprague    Ph. D.    Wilhelm Reitz
1965    Paul B. Campbell    Ed. D.    Wilhelm Reitz
1966    Alice M. G. Smith    Ed. D.    Wilhelm Reitz
1966    Charlene R. Swarthout    Ed. D.    Wilhelm Reitz
1968    Gordon Taaffe    Ed. D.    Wilhelm Reitz
1969    Paula A. Dent    Ed. D.    Wilhelm Reitz
1969    Caroline M. Gillin    Ed. D.    Wilhelm Reitz
1969    Charles H. Held    Ph. D.    Wilhelm Reitz
1969    Lois J. Holland    Ph. D.    Wilhelm Reitz
1969    Charles P. Sheffieck    Ed. D.    Wilhelm Reitz
1970    Martin J. Hogan    Ph. D.    Wilhelm Reitz
1971    Leonard L. Jensen    Ph. D.    Wilhelm Reitz
1971    Thomas E. McCloud    Ph. D.    Wilhelm Reitz
1972    Lawrence A. Goldman    Ph. D.    Wilhelm Reitz
1972    Violet A. Sanders    Ph. D.    Wilhelm Reitz
1972    Sidney Selig    Ed. D.    Wilhelm Reitz
1972    Sara B. Wagner    Ed. D.    Wilhelm Reitz
1972    Edward N. Whitney    Ph. D.    Wilhelm Reitz
1973    Donald H. Cook    Ed. D.    Wilhelm Reitz
1973    Paul J. Gulyas    Ed. D.    Wilhelm Reitz
1974    Charles A. Green    Ph. D.    Claire C. Irwin
1974    Peter G. Manos    Ed. D.    Claire C. Irwin
1974    Barbara H. B. Wolfe    Ph. D.    Claire C. Irwin
1975    Mary A. P. Krammin    Ph. D.    Donald R. Marcotte
1975    Theresa Strand    Ph. D.    Donald R. Marcotte
1976    George A. W. Baker    Ed. D.    Claire C. Irwin
1976    Gerald R. Bergman    Ph. D.    Donald R. Marcotte
1976    Mary M. Blyth    Ed. D.    Claire C. Irwin
1976    Luc R. Moortgat    Ph. D.    Donald R. Marcotte
1976    Donald P. Mys    Ed. D.    Donald R. Marcotte
1976    Joseph L. Posch Jr.    Ph. D.    Donald R. Marcotte
1977    Gordon J. Blush    Ed. D.    Claire C. Irwin
1978    Edward S. Balian    Ph. D.    Alan Klaas
1978    Clarence W. Jan    Ph. D.    Alan Klaas
1978    John N. Miller    Ph. D.    Claire C. Irwin
1978    Sue M. Smock    Ph. D.    Donald R. Marcotte
1979    Janice L. Dreachslin    Ph. D.    Donald R. Marcotte
1979    Eileen E. Hitchingham    Ph. D.    Maureen Sie
1979    Lillian R. Hurwitz    Ph. D.    Donald R. Marcotte
1979    Anne P. Jaworski    Ph. D.    Maureen Sie
1981    Hsing S. Chen    Ph. D.    Claire C. Irwin
1981    Lindson Feun    Ph. D.    Donald R. Marcotte
1982    Adger Butler    Ph. D.    Donald R. Marcotte
1982    Gary J. Clor    Ph. D.    Joseph L. Posch Jr.
1982    Walter H. Mackey II    Ph. D.    Joseph L. Posch Jr.
1983    John Abdalla    Ph. D.    Donald R. Marcotte
1983    Eitedal M. B. Andary    Ph. D.    Claire C. Irwin
1983    Donald J. McPherson    Ph. D.    Donald R. Marcotte
1983    JoAnne E. Moore    Ph. D.    Donald R. Marcotte
1983    Marilyn T. Wayland    Ph. D.    Donald R. Marcotte
1984    Norman C. Irish    Ph. D.    Donald R. Marcotte
1984    Terry L. Rudolph    Ph. D.    Donald R. Marcotte
1984    Patricia M. Scalzi    Ph. D.    Martin J. Hogan
1984    Mary R. S. Vidaurri    Ph. D.    Donald R. Marcotte
1984    Moon-Ja Yoon    Ph. D.    Claire C. Irwin
1985    Dennis R. Wisniewski    Ph. D.    Donald R. Marcotte
1986    Richard J. Meltzer    Ph. D.    Donald R. Marcotte
1986    Douglas L. Wood    Ph. D.    Claire C. Irwin
1987    Michael P. O'Leary    Ph. D.    Donald R. Marcotte
1988    Nancy K. Owens    Ph. D.    Donald R. Marcotte
1988    Bette L. Reynolds    Ph. D.    Claire C. Irwin
1988    Shedrick E. Ward    Ph. D.    Martin J. Hogan
1989    Michael R. Swope    Ph. D.    Donald R. Marcotte
1990    Peggy J. Labelle    Ph. D.    Donald R. Marcotte
1990    Mary P. Lange    Ph. D.    Donald R. Marcotte
1990    Christine K. Stephens    Ph. D.    Donald R. Marcotte
1991    Bruce C. Deighton    Ph. D.    Donald R. Marcotte
1991    Eula M. Spann-Kirk    Ed. D.    Donald R. Marcotte
1992    Queen B. Loundmon    Ph. D.    Donald R. Marcotte
1992    Thomas P. Regan    Ph. D.    Donald R. Marcotte
1992    Jerome Shepard    Ed. D.    Donald R. Marcotte
1992    Joseph E. Sucher    Ph. D.    Donald R. Marcotte
1992    Robert C. West    Ph. D.    Donald R. Marcotte
1993    TaiHyong Moon    Ph. D.    Donald R. Marcotte
1993    Joyce A. Washington    Ed. D.    Shlomo S. Sawilowsky
1994    Kim E. Edwards-Brown    Ed. D.    Donald R. Marcotte
1994    Karen Germayne    Ed. D.    Donald R. Marcotte
1994    Heather M. Gunderson    Ed. D.    Donald R. Marcotte
1994    Sharonlyn G. Harrison    Ph. D.    Shlomo S. Sawilowsky
1994    Timmy D. Johnson    Ed. D.    Donald R. Marcotte
1994    Deborah L. Kelley    Ph. D.    Shlomo S. Sawilowsky
1995    Alexander J. Depetro    Ph. D.    Donald R. Marcotte
1995    Dennis J. Mullan    Ph. D.    Shlomo S. Sawilowsky
1996    Patrick D. Bridge    Ph. D.    Shlomo S. Sawilowsky
1996    Mary J. Heaney    Ph. D.    Donald R. Marcotte
1996    Margaret A. Posch    Ph. D.    Shlomo S. Sawilowsky
1996    Uju P. Eke    Ph. D.    Shlomo S. Sawilowsky
1997    Frank C. Castronova    Ph. D.    Donald R. Marcotte
1997    Thilak W. Gunasekera    Ph. D.    Shlomo S. Sawilowsky
1997    Todd C. Headrick    Ph. D.    Shlomo S. Sawilowsky
1997    Michael J. Nanna    Ph. D.    Shlomo S. Sawilowsky
1997    David C. Odett    Ph. D.    Donald R. Marcotte
1997    Ronald L. Thomas    Ph. D.    Donald R. Marcotte
1998    Anil N. F. Aranha    Ph. D.    Shlomo S. Sawilowsky
1998    William Cade    Ph. D.    Shlomo S. Sawilowsky
1999    Michael Wolf-Branigin    Ph. D.    Shlomo S. Sawilowsky
1999    Cynthia L. Creighton    Ph. D.    Shlomo S. Sawilowsky
1999    Gail F. Fahoome    Ph. D.    Shlomo S. Sawilowsky
1999    Joseph L. Musial III    Ph. D.    Shlomo S. Sawilowsky
2000    Kathleen Cross    Ph. D.    Donald R. Marcotte
2000    Edna E. Jackson-Gray    Ph. D.    Donald R. Marcotte
2000    James A. Gullen    Ph. D.    Shlomo S. Sawilowsky
2000    Juanita M. Lyons    Ph. D.    Shlomo S. Sawilowsky
2000    Frederick F. Strale Jr.    Ph. D.    Shlomo S. Sawilowsky
2001    Scott Compton    Ph. D.    Shlomo S. Sawilowsky
2001    Karen Crawforth    Ph. D.    Shlomo S. Sawilowsky
2001    Kathleen R. Peterson    Ph. D.    Shlomo S. Sawilowsky
2002    Jacqueline Drouin    Ph. D.    Donald R. Marcotte
2002    Lina M. J. Enayah    Ed. D.    Donald R. Marcotte
2002    Mary A. Golinski    Ph. D.    Karen L. Tonso
2002    Rimma Novojenova    Ed. D.    Shlomo S. Sawilowsky
2002    Hamid M. W. Siddiqui    Ph. D.    Donald R. Marcotte
2003    Jennifer M. Bunner    Ph. D.    Shlomo S. Sawilowsky
2003    Aloha A. Van Camp    Ph. D.    Donald R. Marcotte
2003    Bruce R. Fay    Ph. D.    Shlomo S. Sawilowsky
2003    Karen Lee    Ph. D.    Shlomo S. Sawilowsky
2003    Regina Pierce    Ph. D.    Donald R. Marcotte
2004    Jelani Jabari    Ph. D.    Donald R. Marcotte
2004    Stephanie Krol-Jersevic    Ed. D.    Shlomo S. Sawilowsky
2005    Marquita L. Betts-Field    Ed. D.    Donald R. Marcotte
2005    Hesham F. Gadelrab    Ph. D.    Donald R. Marcotte
2005    Werner D. Gottwald    Ph. D.    Donald R. Marcotte
2005    Jack Hill    Ph. D.    Shlomo S. Sawilowsky
2005    Irwin Jopps    Ph. D.    Donald R. Marcotte
2005    Boris Shulkin    Ph. D.    Shlomo S. Sawilowsky
2006    Amittai Benami    Ed. D.    Shlomo S. Sawilowsky
2006    Saydee J. Mends-Cole    Ed. D.    Shlomo S. Sawilowsky
2006    Kalvin Holt    Ed. D.    Shlomo S. Sawilowsky
2006    Jude Inweregbu    Ph. D.    Donald R. Marcotte
2006    Kevin D. Lawson    Ph. D.    Shlomo S. Sawilowsky
2006    Patricia A. Pelavin    Ph. D.    Shlomo S. Sawilowsky
2006    Andrew J. Tierman    Ph. D.    Shlomo S. Sawilowsky
2006    Michele Weber    Ph. D.    Shlomo S. Sawilowsky
2007    Tana J. Bridge    Ph. D.    Shlomo S. Sawilowsky
2007    John L. Cuzzocrea    Ph. D.    Shlomo S. Sawilowsky
2007    David A. Fluharty    Ph. D.    Shlomo S. Sawilowsky
2007    Candice L. Pickens    Ed. D.    Shlomo S. Sawilowsky
2007    Lori Roy    Ph. D.    Shlomo S. Sawilowsky
2007    Andreé A. Sampson    Ed. D.    Shlomo S. Sawilowsky
2007    Reza Ziaee    Ph. D.    Shlomo S. Sawilowsky
2008    Roberta A. Foust    Ed. D.    Shlomo S. Sawilowsky
2008    Gregory K. Karapetian    Ph. D.    Shlomo S. Sawilowsky
2008    Bulent Ozkan    Ph. D.    Shlomo S. Sawilowsky
2008    Shira R. Solomon    Ph. D.    Shlomo S. Sawilowsky
2010    Frances Dolley    Ed. D.    Shlomo S. Sawilowsky
2010    Piper A. Farrell-Singleton    Ph. D.    Shlomo S. Sawilowsky
2010    Stephanie D. Wren    Ed. D.    Shlomo S. Sawilowsky
2011    Linda F. Ellington    Ph. D.    Shlomo S. Sawilowsky
2011    Michael W. Lance    Ph. D.    Shlomo S. Sawilowsky
2011    Julie M. Smith    Ph. D.    Shlomo S. Sawilowsky
2011    Sibyl Y. St. Clair    Ph. D.    Karen L. Tonso
2011    Barbara S. Wisniewski    Ph. D.    Karen L. Tonso
2012    Jeff A. Capobianco    Ph. D.    Shlomo S. Sawilowsky
2012    Marvin Gibbs    Ph. D.    Shlomo S. Sawilowsky
2012    Norman N. Haidous    Ph. D.    Shlomo S. Sawilowsky
2012    Saverpierre Maggio    Ph. D.    Shlomo S. Sawilowsky
2012    Jason Parrott    Ph. D.    Shlomo S. Sawilowsky
2013    Donna M. Coulter    Ph. D.    Karen L. Tonso
2013    Jamie H. Gleason    Ph. D.    Shlomo S. Sawilowsky
2013    Valerie Felder    Ph. D.    Shlomo S. Sawilowsky
2013    Deborah N. Mills    Ph. D.    Karen L. Tonso
2013
Marie L. Montie
Ph. D.
Karen L. Tonso
2013
Daryle A. Olson
Ph. D.
Shlomo S. Sawilowsky
2013
Rosalind Reaves
Ph. D.
Karen L. Tonso
2014
Rasha E. Elhage
Ph. D.
Shlomo S. Sawilowsky
2014
Tanina S. Foster
Ph. D.
Shlomo S. Sawilowsky
2014
Anna C. Gersh
Ph. D.
Shlomo S. Sawilowsky
2014
Carla Howe
Ph. D.
Shlomo S. Sawilowsky
2014
Akiva J. Lorenz
Ph. D.
Barry S. Markman
2014
Jack Sawilowsky
Ph. D.
Barry S. Markman
2014
Joe L. Smith
Ed. D.
Shlomo S. Sawilowsky
2014
Joyce C.-Y. Wu
Ph. D.
Karen L. Tonso
2015
Linda C. Lowenstein
Ph. D.
Shlomo S. Sawilowsky
2015
Willie L. White II
Ed. D.
Shlomo S. Sawilowsky
2016
Yolanda E. Bloodsaw
PH. D.
Karen L. Tonso
2016
Zsa-Zsa Booker
Ph. D.
Shlomo S. Sawilowsky
2016
Holly A. Child
Ph. D.
Shlomo S. Sawilowsky
2016
Natosha N. Floyd
Ed. D.
Shlomo S. Sawilowsky
2016
Tammy A. Grace
Ph. D.
Shlomo S. Sawilowsky
2016
Zora Injic
Ph. D.
Shlomo S. Sawilowsky
2016
Andrea King-Jimenez
Ed. D.
Shlomo S. Sawilowsky
2016
Dong Li
Ph. D.
Shlomo S. Sawilowsky
2016
Elizabeth (Moen) McQuillian
Ph. D.
Shlomo S. Sawilowsky
2016
Timberly Robinson
Ed. D.
Shlomo S. Sawilowsky
2016
Sarah (Rose) Raphan
Ph. D.
Barry S. Markman
2016
Heatherlun S. Uphold
Ph. D.
Shlomo S. Sawilowsky
2016
Rachelle Warren
Ed. D.
Shlomo S. Sawilowsky
2016
Hong Ye
Ph. D.
Shlomo S. Sawilowsky
2017
Mona Alamri
Ph. D.
Shlomo S. Sawilowsky
2017
Ayed Almoid
Ph. D.
Shlomo S. Sawilowsky
2018
Mohammed E. Al Almohazi
Ph. D.
Shlomo S. Sawilowsky
2018
Susan Harden
Ph. D.
Shlomo S. Sawilowsky
2018
Maurice Kavanaugh
Ed. D.
Shlomo S. Sawilowsky
2018
Mona King
Ed. D.
Shlomo S. Sawilowsky
2018
Kimberly Kleinhans
Ph. D.
Shlomo S. Sawilowsky
2018
Ingrid Macon
Ph. D.
Shlomo S. Sawilowsky
2019
Tia Bosley
Ph.D.
Shlomo S. Sawilowsky
2019
Kevin Carroll
Ph. D.
Shlomo S. Sawilowsky
2019
Ahmad Farooqi
Ph. D.
Shlomo S. Sawilowsky
97
2019
Darryl Gardner
Ph. D.
Shlomo S. Sawilowsky
2019
Christine Lewis
Ph. D.
Shlomo S. Sawilowsky
2019
Dustin Saalman*
Ph. D.
Shlomo S. Sawilowsky
2019
Susan Talley
Ph. D.
Shlomo S. Sawilowsky
*Note. The final defense was conducted in December 2019, although the commencement date is May 2020.
There were N = 231 doctoral degrees awarded from 1949 through December 2019. The breakdown by advisor is compiled in Table 53 below.
Table 53. EER Advisor by Number of Dissertations

Advisor        N        %
Boye           3     1.30
Hogan          2     0.87
Irwin         12     5.19
Klaas          2     0.87
Marcotte      56    24.24
Markman        3     1.30
Menge          2     0.87
Posch          2     0.87
Reitz         43    18.61
Sawilowsky    95    41.13
Sie            2     0.87
Tonso          9     3.90
From 1949 to 2019, n = 71 (30.74%) of the EER dissertations were Ed. D.'s, and n = 160 (69.26%) were Ph. D.'s. The breakdown of Ed. D. and Ph. D. degrees by dissertation advisor is compiled in Table 54 below.
Table 54. EER Advisor by Doctoral Degree Type

Advisor        Ed. D.   Ph. D.   Total
Boye                3        0       3
Hogan               0        2       2
Irwin               4        8      12
Klaas               0        2       2
Marcotte            9       47      56
Markman             0        3       3
Menge               2        0       2
Posch               0        2       2
Reitz              34        9      43
Sawilowsky         19       76      95
Sie                 0        2       2
Tonso               0        9       9
Total              71      160     231
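Tables 53 and 54 are simple tallies of the Appendix A records. The sketch below is illustrative rather than the author's procedure: the records list is a placeholder standing in for the full 231-entry appendix, and the (year, name, degree, advisor) field order is an assumption about how the records would be stored.

```python
# Illustrative sketch only: tallies advisor totals (as in Table 53) and the
# advisor-by-degree-type cross-tabulation (as in Table 54) from appendix-style
# records. The records below are placeholders, not the full appendix.
from collections import Counter

records = [
    (1972, "Violet A. Sanders", "Ph. D.", "Wilhelm Reitz"),
    (1994, "Deborah L. Kelley", "Ph. D.", "Shlomo S. Sawilowsky"),
    (2015, "Willie L. White II", "Ed. D.", "Shlomo S. Sawilowsky"),
]

total = len(records)

# Table 53 analogue: dissertations per advisor, with percentage of the grand total.
by_advisor = Counter(advisor for _, _, _, advisor in records)
for advisor, n in sorted(by_advisor.items()):
    print(f"{advisor:25s} {n:3d} {100 * n / total:6.2f}%")

# Table 54 analogue: Ed. D. / Ph. D. split per advisor (Counter returns 0 for missing keys).
by_advisor_degree = Counter((advisor, degree) for _, _, degree, advisor in records)
for advisor in sorted(by_advisor):
    ed = by_advisor_degree[(advisor, "Ed. D.")]
    phd = by_advisor_degree[(advisor, "Ph. D.")]
    print(f"{advisor:25s} Ed. D.: {ed:3d}  Ph. D.: {phd:3d}  Total: {ed + phd:3d}")
```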
The mean (median) number of EER dissertations over the 71-year period from 1949 to 2019 was 3.25 (3) per year (standard deviation = 2.3), and the maximum was n = 14. The number of dissertations by year, with a linear trend line superimposed, is presented in Figure 2 below. The increasing trend is statistically significant (Mann-Kendall tau = x, two-sided p = .x).
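The trend statistics above could be reproduced directly from the yearly counts. The following is a minimal sketch, not the author's code: the counts dictionary is placeholder data (the real values come from Appendix A), and tie corrections to the Mann-Kendall variance are ignored for brevity.

```python
# Sketch of the yearly summary and a Mann-Kendall trend test on placeholder data.
import math
from statistics import mean, median, stdev

# Hypothetical yearly dissertation counts (year -> number of degrees awarded).
counts = {1949: 1, 1950: 2, 1972: 5, 2016: 14, 2019: 6}  # placeholder data only

years = sorted(counts)
y = [counts[yr] for yr in years]

print(f"mean = {mean(y):.2f}, median = {median(y)}, sd = {stdev(y):.2f}, max = {max(y)}")

# Mann-Kendall S statistic: concordant minus discordant pairs over time order.
n = len(y)
s = sum(
    (y[j] > y[i]) - (y[j] < y[i])
    for i in range(n - 1)
    for j in range(i + 1, n)
)

# Normal approximation to the two-sided p-value (no tie correction), with
# the usual continuity correction on S.
var_s = n * (n - 1) * (2 * n + 5) / 18
z = 0.0 if s == 0 else (s - math.copysign(1, s)) / math.sqrt(var_s)
p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

tau = s / (0.5 * n * (n - 1))
print(f"Mann-Kendall tau = {tau:.3f}, two-sided p = {p_two_sided:.3f}")
```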
Figure 2. Number of EER Dissertations from 1949 to 2019
[Chart: EER Dissertations by Year and Trend Line; vertical axis 0 to 14 dissertations, horizontal axis 1949 to 2019.]
REFERENCES
Carroll, K. C. (2019). Analysis of software usage by an R1 university's education faculty, administrators, and academic staff [Unpublished doctoral dissertation]. Wayne State University.
Harzing, A. W. (2020, November 12). Publish or perish. Research in International Management.
https://harzing.com/resources/publish-or-perish
Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. PNAS,
102(46), 16569-16572.
Hogg, R. V. (1991). Statistical education: Improvements are badly needed. The American
Statistician, 45, 342-343.
Irwin, C. C. (1960). The doctor of education program at Wayne State University: An appraisal of
institutional aims, recipient satisfaction, faculty evaluation, and dissertation
quality [Doctoral dissertation, Wayne State University].
White, W. L., II. (2015). An evaluation of Wayne State University's educational evaluation and research program [Doctoral dissertation, Wayne State University].
Ioannidis, J. P., Patsopoulos, N. A., Kavvoura, F. K., Tatsioni, A., Evangelou, E., Kouri, I., ... & Liberopoulos, G. (2007). International ranking systems for universities and institutions: A critical appraisal. BMC Medicine, 5(1), 30.
Lomax, M. L. (2015, January 1). A proposed federal college rating system could hurt disadvantaged students. The Washington Post. https://www.washingtonpost.com/opinions/a-proposed-federal-college-rating-system-could-hurt-disadvantaged-students/2015/01/01/572b50a8-9112-11e4-a900-9960214d4cd7_story.html (Retrieved January 19, 2020).
McGaghie, W. C., & Thompson, J. A. (2001). America's best medical schools: A critique of the U.S. News & World Report rankings. Academic Medicine, 76(10), 985-992.
Nelson, A. T. (1953). The selection of doctoral candidates: An investigation of the relative
predictive value of the factors involved in the doctor of education certification policies in
the department of educational administration at Teachers College, Columbia
University [Doctoral dissertation, Columbia University].
Ozkan, B. (2008). Comparison of university researchers' and statistical consultants' diagnoses
and applications on research problems [Doctoral dissertation, Wayne State University].
Sawilowsky, S. S. (2012). S-index: A comprehensive scholar impact index. International Review
of Social Sciences and Humanities, 3(1), 85-95.
Sawilowsky, S. S. (2018). EER student handbook. coe.wayne.edu/tbf/eer/eer_student_handbook_v_8b_revisedmay232018.pdf (Retrieved December 24, 2019).
Solomon, S. R., & Sawilowsky, S. S. (2009). Impact of rank-based normalizing transformations
on the accuracy of test scores. Journal of Modern Applied Statistical Methods, 8(2), 448-
462.
Van Raan, A. F. (2005, June). Challenges in ranking of universities. In Invited paper for the First International Conference on World Class Universities, Shanghai Jiao Tong University, Shanghai (pp. 133-143).
ABSTRACT
RANKING SELECTED R1 UNIVERSITY DOCTORAL
QUANTITATIVE METHODOLOGY PROGRAMS
by
Gabriel Attar
May 2021
Advisor: Dr. Shlomo Sawilowsky
Major: Educational Evaluation and Research
Degree: Doctor of Philosophy
The historical development of the EER program, from its inception as WSU's first Ed. D. program to its growth through the addition of the Ph. D. and master's programs, has been well documented (Irwin, 1960). The importance of the EER doctoral program was established in terms of its role in the COE, within WSU, and in the outside business and industry communities (Ozkan, 2008). The adoption and support of statistical and related computing software by non-EER administrators and faculty was investigated (Carroll, 2019). A program evaluation established the high level of satisfaction of graduates regarding their EER doctoral experience (White, 2015). The purpose of the current study is to expand beyond WSU and to rank WSU's EER program against twenty-two universities designated by a former COE dean as being comparable to WSU.
Hence, the first step of this dissertation is to compare the curriculum of the WSU EER Department with the curricula of the other twenty-two universities. The second step is to compare the research and publications of the faculties at these universities with those of the EER faculty. The third step is to develop a statistically sound ranking formula regarding the scholarship of the faculties of these universities.
AUTOBIOGRAPHICAL STATEMENT
Gabriel Attar
Professional Experience
1999-2020   Math and Science Teacher, Detroit, MI
Education
2015-2021   Ph. D. Candidate, Educational Evaluation and Research, Wayne State University, Detroit, MI
2000-2004   M.A., Master of Arts in Teaching, College of Education, Wayne State University, Detroit, MI
1998-1999   Teaching Certificate, College of Education, Wayne State University, Detroit, MI
1995-1998   B.S., Bachelor of Science in Biological Science, College of Science, Wayne State University, Detroit, MI
1992-1995   Associate in Science, Oakland Community College, Royal Oak, MI