Abstract
In psychometric science, a common problem is the choice of a good response scale. Every scale has, by its nature, a propensity to lead a respondent toward mainly positive (or negative) ratings. This paper investigates possible causes of the discordance between two ordinal scales evaluating the same goods or services. In the psychometric literature, Cohen's Kappa, particularly in its weighted version, is one of the most important indices for evaluating the strength of agreement, or disagreement, between two nominal variables. In this paper, a new index is proposed. A procedure to determine the lower and upper triangles in a non-square table is also implemented, so as to generalize the index and allow the comparison of two scales with different numbers of categories. A test is set up to verify the tendency of one scale to yield different ratings than another. A study with real data is conducted.
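For orientation, the weighted Cohen's Kappa that the abstract takes as its benchmark can be computed from the contingency table of two raters. The sketch below is a generic illustration of linear- and quadratic-weighted Kappa only; the function name and the `weights` parameter are chosen for exposition, and this is not the new index proposed in the paper.

```python
import numpy as np

def weighted_kappa(r1, r2, k, weights="linear"):
    """Cohen's weighted kappa for two ratings on a k-category ordinal scale.

    r1, r2 : sequences of integer category labels in {0, ..., k-1}
    weights: "linear" uses |i - j| disagreement weights, anything else
             uses quadratic (i - j)**2 weights.
    """
    r1, r2 = np.asarray(r1), np.asarray(r2)
    # Observed contingency table, normalized to proportions
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    # Expected table under independence of the two raters
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    # Disagreement weight matrix over all category pairs (i, j)
    i, j = np.indices((k, k))
    w = np.abs(i - j) if weights == "linear" else (i - j) ** 2
    # 1 - (weighted observed disagreement) / (weighted expected disagreement)
    return 1.0 - (w * obs).sum() / (w * exp).sum()
```

Perfect agreement gives a value of 1, and heavier penalties fall on ratings that are far apart on the ordinal scale, which is what distinguishes the weighted version from the plain Kappa on nominal categories.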
Original language | English |
---|---|
Host publication title | Studies in Classification, Data Analysis, and Knowledge Organization |
Publisher | Springer |
Pages | 55-63 |
Number of pages | 9 |
ISBN (Print) | 978-3-319-06691-2 |
DOI | |
Publication status | Published - 2014 |
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Information Systems
- Information Systems and Management
- Analysis
Keywords
- Agreement Index
- Measurement Ordinal Scale
- Weighted Kappa