[AISWorld] H-Index for MIS - 2017

mmora at securenym.net
Tue Feb 21 22:29:55 EST 2017


Dear colleagues in AIS:
This is an interesting and relevant issue for measuring productivity in
IS research. But, from another viewpoint, what about the value a
scholar provides, and what about the claims made by top senior
researchers about the excessive rigor required to publish in our
discipline (and, for colleagues at U.S. and Canadian universities, to
be tenured)? The claims were published in:


Dennis, A. R., Valacich, J. S., Fuller, M. A., & Schneider, C. (2006).
Research standards for promotion and tenure in information systems. MIS
Quarterly, 30(1), 1-12.

The conclusions were:

"We believe that the IS discipline currently is on a dangerous course,
at least at U.S. and Canadian universities. Less than three
individuals (2.1 percent) out of every graduating Ph.D. class of
about 135 individuals who seek employment at U.S. or Canadian
universities will produce enough elite journal articles to receive
promotion and tenure at a typical research extensive university using
current standards. Therefore, we call on promotion and tenure
committees to rethink what research standards are appropriate for IS
faculty members at their institutions, given our data. We believe that
we need to use new, lower, research standards for promotion and
tenure, standards more akin to the standards used in Accounting"
(Dennis et al., 2006, p. 8).

Manuel Mora

--------------------------------------------------------------------------
Manuel Mora, EngD.
Full-time Professor and Researcher Level C
ACM Senior Member / SNI Level I
Department of Information Systems
Autonomous University of Aguascalientes
Ave. Universidad 940
Aguascalientes, AGS
Mexico, 20131
http://x3620a-labdc.uaa.mx:8080/web/drmora
--------------------------------------------------------------------------

On Tue, February 21, 2017 8:32 pm, Michael Cuellar wrote:
> It is important that we continue to develop improved measures and
> methodologies for assessing scholarly impact. I therefore think it's
> great that the University of Arizona sought to create an automated
> method of computing the h-index. It partially automates the work of
> disambiguating scholars' names. Until the use of ORCID (orcid.org) or
> an equivalent identifier is common in publication, this is a needed
> contribution. I hope that they publish this and/or include it in an R
> package. They also use Google Scholar as the database, which, as they
> indicate, has broader coverage than other databases. While its data
> are somewhat dirty, the greater coverage makes Google Scholar a good
> source for this type of analysis.
>
> That said, we need to be careful in publishing numbers such as this.
> The use of single-number evaluations is highly problematic. This is
> especially the case with the h-index. Using the one number by itself
> can be highly misleading when evaluating scholars. The h-index,
> while a good omnibus measure of ideational influence, is biased in
> several ways. First, it is insensitive to citations beyond the h
> number: once a paper passes h, it doesn't matter whether it has 10 or
> 2000 citations. Second, it is biased in favor of older papers: since
> the h-index never declines over time, the longer a paper has been out,
> the more time it has to pick up citations and thereby increase the
> value of the index (Truex III et al. 2011). Third, it ignores the
> contribution of co-authors; a solo-authored paper counts as much
> toward each author's index as a paper with 500 authors. Finally, it is
> biased in favor of publication sources that produce more papers: the
> more papers a source produces, the more opportunity it has to pick up
> citations.
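>
> To make the first bias concrete, here is a minimal sketch (in Python;
> my illustration, not the Arizona tool) of how the h-index is computed
> from a list of per-paper citation counts. Citations beyond position h
> never enter the result:
>
> def h_index(citations):
>     """h = the largest h such that at least h papers have
>     at least h citations each (Hirsch's definition)."""
>     counts = sorted(citations, reverse=True)
>     h = 0
>     for rank, cites in enumerate(counts, start=1):
>         if cites >= rank:
>             h = rank
>     return h
>
> # Two very different records receive the same h-index:
> print(h_index([10, 5, 3, 1]))      # 3
> print(h_index([2000, 900, 3, 1]))  # 3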
>
> For these reasons, when we developed the Scholarly Capital Model
> (Cuellar et al. 2016), we proposed using three h-indices: the standard
> h, the g-index (Egghe 2006), which corrects for highly cited papers,
> and the hc-index (Sidiropoulos et al. 2006), which corrects for
> recency. We believe this provides a more rounded perspective on
> ideational influence than the h-index alone. To account for
> co-authors, you might consider the hI,norm index (Harzing 2016), which
> takes the co-author count into consideration. For publication sources
> with large output, such as journals, using the hm-index might be
> warranted (Molinari and Molinari 2008).
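>
> As illustrations only (my sketches under the definitions cited above,
> not the SCM or Publish or Perish implementations), the g-index and
> hI,norm could be computed as follows. Note how the g-index separates
> the two records that the h-index above could not:
>
> def g_index(citations):
>     """g = the largest g such that the g most-cited papers together
>     have at least g^2 citations. I assume the zero-padding
>     convention, which lets g exceed the number of papers."""
>     counts = sorted(citations, reverse=True)
>     total, g, rank = 0, 0, 0
>     while True:
>         rank += 1
>         total += counts[rank - 1] if rank <= len(counts) else 0
>         if total >= rank * rank:
>             g = rank
>         else:
>             return g
>
> def hi_norm_index(papers):
>     """hI,norm: divide each paper's citations by its author count,
>     then take the h-index of the adjusted counts."""
>     adjusted = sorted((c / n for c, n in papers), reverse=True)
>     return sum(1 for rank, c in enumerate(adjusted, start=1)
>                if c >= rank)
>
> print(g_index([10, 5, 3, 1]))      # 4
> print(g_index([2000, 900, 3, 1]))  # 53
> # (citations, authors) pairs; heavy co-authorship lowers hI,norm:
> print(hi_norm_index([(10, 5), (5, 5), (3, 1), (1, 1)]))  # 2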
>
> And again, the h-family of indices captures only ideational
> influence, the uptake of a scholar's ideas by the field. It doesn't
> consider influence spread by other means, such as connectedness or
> venue representation. To get a real idea of what a scholar brings to
> the table, you need to consider these other dimensions using social
> network analysis tools, as we did in developing the SCM.
>
> Also, the idea of using lists such as those of the senior scholars,
> while useful as a demonstration, probably needs to be upgraded to an
> automated method of identifying the field. We proposed using the
> methodology described in Mingers and Leydesdorff (2014). That would
> likely be an upgrade to the methodology described in this paper.
>
> I hope they continue to advance their efforts by developing more
> measures. It would be great to see an SCM calculator created using the
> Mingers and Leydesdorff methodology, with additional statistics such
> as eigenvector centrality and hI,norm.
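>
> Since eigenvector centrality may be unfamiliar, here is a minimal
> sketch (mine, on an invented toy network, not SCM code) of computing
> it by power iteration over a co-authorship adjacency matrix:
>
> import numpy as np
>
> # Toy symmetric co-authorship matrix (invented for illustration):
> # entry [i][j] is the number of papers scholars i and j co-wrote.
> A = np.array([[0, 2, 1, 0],
>               [2, 0, 1, 1],
>               [1, 1, 0, 0],
>               [0, 1, 0, 0]], dtype=float)
>
> # Power iteration: repeated multiplication converges to the dominant
> # eigenvector, whose entries score each scholar's connectedness.
> v = np.ones(A.shape[0])
> for _ in range(100):
>     v = A @ v
>     v = v / np.linalg.norm(v)
>
> print(np.round(v, 3))  # higher score = better-connected scholar
>
> In practice one would, of course, build the matrix from co-authorship
> or citation data for the whole field rather than a toy example.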
>
> Cuellar, M. J., Takeda, H., Vidgen, R., and Truex III, D. P. 2016.
> "Ideational Influence, Connectedness, and Venue Representation: Making an
> Assessment of Scholarly Capital," Journal of the Association for
> Information Systems (17:1), pp. 1-28.
>
>
> Egghe, L. 2006. "Theory and Practice of the G-Index," Scientometrics
> (69:1), pp. 131-152.
>
>
> Harzing, A.-W. 2016. "Publish or Perish," software that extracts data
> from Google Scholar and produces a list of citing articles with a
> number of citation indices.
>
> Mingers, J., and Leydesdorff, L. 2014. "Identifying Research Fields
> within Business and Management: A Journal Cross-Citation Analysis,"
> Journal of the Operational Research Society (66), pp. 1370-1384.
>
>
> Molinari, J.-F., and Molinari, A. 2008. "A New Methodology for Ranking
> Scientific Institutions," Scientometrics (75:1), pp. 163-174.
>
>
> Sidiropoulos, A., Katsaros, D., and Manolopoulos, Y. 2006. "Generalized
> H-Index for Disclosing Latent Facts in Citation Networks,"
> arXiv:cs.DL/0606066.
>
>
> Truex III, D. P., Cuellar, M. J., Takeda, H., and Vidgen, R. 2011. "The
> Scholarly Influence of Heinz Klein: Ideational and Social Measures of His
> Impact on Is Research and Is Scholars," European Journal of Information
> Systems (20:4).
>
>
>> On Feb 21, 2017, at 2:02 PM, sandeep suntwal
>> <sandeepsuntwal at email.arizona.edu> wrote:
>>
>>
>> Dear All,
>>
>>
>> Please find attached the H-Index list for MIS for 2017. Please feel
>> free to reach out to me, per the instructions in the document, with
>> any comments, queries, or suggestions.
>>
>>
>> Thank You and Regards,
>> Sandeep Suntwal
>> AI Lab - Eller College of Management
>> University of Arizona
>> <h-index_mis_2017.pdf>
>
> _______________________________________________
> AISWorld mailing list
> AISWorld at lists.aisnet.org