Abstract
Understanding Statistics in Psychology with SPSS, 7th edition, offers students a trusted, straightforward, and engaging way of learning how to carry out statistical analyses and use SPSS with confidence.
Comprehensive and practical, the text is organised into short, accessible chapters, making it ideal for undergraduate psychology students who need to get to grips with statistics in class or independently.
Clear diagrams and full-colour screenshots from SPSS make the text suitable for beginners, while the broad coverage of topics ensures that students can continue to use it as they progress to more advanced techniques.
Key features
· Now combines coverage of statistics with full guidance on how to use SPSS to analyse data
· Suitable for use with all versions of SPSS
· Examples from a wide range of real psychological studies illustrate how statistical techniques are used in practice
· Includes clear and detailed guidance on choosing tests, interpreting findings and reporting and writing up research
· Student-focused pedagogical approach, including:
o Key concept boxes detailing important terms
o ‘Focus on’ sections exploring complex topics in greater depth
o ‘Explaining statistics’ sections clarifying important statistical concepts.
Table of Contents
Section Title | Page |
---|---|
Cover | Cover |
Brief Contents | v |
Contents | vii |
Guided tour | xx |
Introduction | xxv |
Acknowledgements | xxvii |
1 Why statistics? | 1 |
Overview | 1 |
1.1 Introduction | 2 |
1.2 Research on learning statistics | 3 |
1.3 What makes learning statistics difficult? | 4 |
1.4 Positive about statistics | 6 |
1.5 What statistics doesn’t do | 9 |
1.6 Easing the way | 10 |
1.7 What do I need to know to be an effective user of statistics? | 12 |
1.8 A few words about SPSS | 14 |
1.9 Quick guide to the book’s procedures and statistical tests | 14 |
Key points | 17 |
Computer analysis: SPSS Analyze, Graphs and Transform drop-down menus | 18 |
Part 1 Descriptive statistics | 21 |
2 Some basics: Variability and measurement | 23 |
Overview | 23 |
2.1 Introduction | 24 |
2.2 Variables and measurement | 25 |
2.3 Major types of measurement | 26 |
Key points | 30 |
Computer analysis: Some basics of data entry using SPSS | 31 |
3 Describing variables: Tables and diagrams | 33 |
Overview | 33 |
3.1 Introduction | 34 |
3.2 Choosing tables and diagrams | 35 |
3.3 Errors to avoid | 43 |
Key points | 44 |
Computer analysis: Tables, diagrams and recoding using SPSS | 45 |
4 Describing variables numerically: Averages, variation and spread | 48 |
Overview | 48 |
4.1 Introduction | 49 |
4.2 Typical scores: mean, median and mode | 50 |
4.3 Comparison of mean, median and mode | 53 |
4.4 Spread of scores: range and interquartile range | 53 |
4.5 Spread of scores: variance | 56 |
Key points | 61 |
Computer analysis: Descriptive statistics using SPSS | 62 |
5 Shapes of distributions of scores | 64 |
Overview | 64 |
5.1 Introduction | 65 |
5.2 Histograms and frequency curves | 65 |
5.3 Normal curve | 66 |
5.4 Distorted curves | 68 |
5.5 Other frequency curves | 70 |
Key points | 75 |
Computer analysis: Frequencies using SPSS | 75 |
6 Standard deviation and z-scores: Standard unit of measurement in statistics | 77 |
Overview | 77 |
6.1 Introduction | 78 |
6.2 Theoretical background | 78 |
6.3 Measuring the number of standard deviations – the z-score | 82 |
6.4 Use of z-scores | 84 |
6.5 Standard normal distribution | 85 |
6.6 Important feature of z-scores | 88 |
Key points | 90 |
Computer analysis: Standard deviation and z-scores using SPSS | 90 |
7 Relationships between two or more variables: Diagrams and tables | 93 |
Overview | 93 |
7.1 Introduction | 94 |
7.2 Principles of diagrammatic and tabular presentation | 95 |
7.3 Type A: both variables numerical scores | 96 |
7.4 Type B: both variables nominal categories | 98 |
7.5 Type C: one variable nominal categories, the other numerical scores | 100 |
Key points | 102 |
Computer analysis: Crosstabulation and compound bar charts using SPSS | 103 |
8 Correlation coefficients: Pearson’s correlation and Spearman’s rho | 105 |
Overview | 105 |
8.1 Introduction | 106 |
8.2 Principles of the correlation coefficient | 107 |
8.3 Some rules to check out | 114 |
8.4 Coefficient of determination | 115 |
8.5 Significance testing | 116 |
8.6 Spearman’s rho – another correlation coefficient | 116 |
8.7 Example from the literature | 119 |
Key points | 121 |
Computer analysis: Correlation coefficients using SPSS | 122 |
Computer analysis: Scattergram using SPSS | 124 |
9 Regression: Prediction with precision | 126 |
Overview | 126 |
9.1 Introduction | 127 |
9.2 Theoretical background and regression equations | 129 |
9.3 Confidence intervals and standard error: how accurate are the predicted score and the regression equations? | 134 |
Key points | 137 |
Computer analysis: Simple regression using SPSS | 137 |
Part 2 Significance testing | 141 |
10 Samples from populations | 143 |
Overview | 143 |
10.1 Introduction | 144 |
10.2 Theoretical considerations | 144 |
10.3 Characteristics of random samples | 146 |
10.4 Confidence intervals | 147 |
Key points | 148 |
Computer analysis: Selecting a random sample using SPSS | 148 |
11 Statistical significance for the correlation coefficient: Practical introduction to statistical inference | 150 |
Overview | 150 |
11.1 Introduction | 151 |
11.2 Theoretical considerations | 151 |
11.3 Back to the real world: null hypothesis | 153 |
11.4 Pearson’s correlation coefficient again | 155 |
11.5 Spearman’s rho correlation coefficient | 159 |
Key points | 161 |
Computer analysis: Correlation coefficients using SPSS | 162 |
12 Standard error: Standard deviation of the means of samples | 164 |
Overview | 164 |
12.1 Introduction | 165 |
12.2 Theoretical considerations | 165 |
12.3 Estimated standard deviation and standard error | 167 |
Key points | 169 |
Computer analysis: Standard error using SPSS | 170 |
13 Related t-test: Comparing two samples of related/correlated/paired scores | 172 |
Overview | 172 |
13.1 Introduction | 173 |
13.2 Dependent and independent variables | 175 |
13.3 Some basic revision | 175 |
13.4 Theoretical considerations underlying the computer analysis | 176 |
13.5 Cautionary note | 181 |
Key points | 183 |
Computer analysis: Related/correlated/paired t-test using SPSS | 184 |
14 Unrelated t-test: Comparing two samples of unrelated/uncorrelated/independent scores | 186 |
Overview | 186 |
14.1 Introduction | 187 |
14.2 Theoretical considerations | 188 |
14.3 Standard deviation and standard error | 193 |
14.4 Cautionary note | 199 |
Key points | 200 |
Computer analysis: Unrelated/uncorrelated/independent t-test using SPSS | 201 |
15 What you need to write about your statistical analysis | 203 |
Overview | 203 |
15.1 Introduction | 204 |
15.2 Reporting statistical significance | 205 |
15.3 Shortened forms | 205 |
15.4 APA (American Psychological Association) style | 206 |
Key points | 209 |
16 Confidence intervals | 210 |
Overview | 210 |
16.1 Introduction | 211 |
16.2 Relationship between significance and confidence intervals | 213 |
16.3 Regression | 217 |
16.4 Writing up a confidence interval using APA style | 219 |
16.5 Other confidence intervals | 219 |
Key points | 220 |
Computer analysis: Examples of SPSS output containing confidence intervals | 220 |
17 Effect size in statistical analysis: Do my findings matter? | 221 |
Overview | 221 |
17.1 Introduction | 222 |
17.2 Statistical significance and effect size | 222 |
17.3 Size of the effect in studies | 223 |
17.4 Approximation for nonparametric tests | 225 |
17.5 Analysis of variance (ANOVA) | 225 |
17.6 Writing up effect sizes using APA style | 227 |
17.7 Have I got a large, medium or small effect size? | 227 |
17.8 Method and statistical efficiency | 228 |
Key points | 230 |
18 Chi-square: Differences between samples of frequency data | 231 |
Overview | 231 |
18.1 Introduction | 232 |
18.2 Theoretical issues | 233 |
18.3 Partitioning chi-square | 239 |
18.4 Important warnings | 240 |
18.5 Alternatives to chi-square | 241 |
18.6 Chi-square and known populations | 243 |
18.7 Chi-square for related samples – the McNemar test | 245 |
18.8 Example from the literature | 245 |
Key points | 247 |
Computer analysis: Chi-square using SPSS | 248 |
Recommended further reading | 250 |
19 Probability | 251 |
Overview | 251 |
19.1 Introduction | 252 |
19.2 Principles of probability | 252 |
19.3 Implications | 254 |
Key points | 256 |
20 One-tailed versus two-tailed significance testing | 257 |
Overview | 257 |
20.1 Introduction | 258 |
20.2 Theoretical considerations | 258 |
20.3 Further requirements | 260 |
Key points | 261 |
Computer analysis: One- and two-tailed statistical significance using SPSS | 262 |
21 Ranking tests: Nonparametric statistics | 263 |
Overview | 263 |
21.1 Introduction | 264 |
21.2 Theoretical considerations | 264 |
21.3 Nonparametric statistical tests | 266 |
21.4 Three or more groups of scores | 274 |
Key points | 275 |
Computer analysis: Two-group ranking tests using SPSS | 275 |
Recommended further reading | 277 |
Part 3 Introduction to analysis of variance | 279 |
22 Variance ratio test: F-ratio to compare two variances | 281 |
Overview | 281 |
22.1 Introduction | 282 |
22.2 Theoretical issues and application | 283 |
Key points | 287 |
Computer analysis: F-ratio test using SPSS | 288 |
23 Analysis of variance (ANOVA): One-way unrelated or uncorrelated ANOVA | 290 |
Overview | 290 |
23.1 Introduction | 291 |
23.2 Some revision and some new material | 292 |
23.3 Theoretical considerations | 292 |
23.4 Degrees of freedom | 296 |
23.5 Analysis of variance summary table | 302 |
Key points | 305 |
Computer analysis: Unrelated one-way analysis of variance using SPSS | 306 |
24 ANOVA for correlated scores or repeated measures | 308 |
Overview | 308 |
24.1 Introduction | 309 |
24.2 Theoretical considerations underlying the computer analysis | 311 |
24.3 Examples | 312 |
Key points | 321 |
Computer analysis: Related analysis of variance using SPSS | 322 |
25 Two-way or factorial ANOVA for unrelated/uncorrelated scores: Two studies for the price of one? | 324 |
Overview | 324 |
25.1 Introduction | 325 |
25.2 Theoretical considerations | 326 |
25.3 Steps in the analysis | 327 |
25.4 More on interactions | 340 |
25.5 Three or more independent variables | 343 |
Key points | 347 |
Computer analysis: Unrelated two-way analysis of variance using SPSS | 348 |
26 Multiple comparisons within ANOVA: A priori and post hoc tests | 351 |
Overview | 351 |
26.1 Introduction | 352 |
26.2 Planned (a priori) versus unplanned (post hoc) comparisons | 353 |
26.3 Methods of multiple comparisons testing | 354 |
26.4 Multiple comparisons for multifactorial ANOVA | 354 |
26.5 Contrasts | 355 |
26.6 Trends | 357 |
Key points | 358 |
Computer analysis: Multiple comparison tests using SPSS | 359 |
Recommended further reading | 361 |
27 Mixed-design ANOVA: Related and unrelated variables together | 362 |
Overview | 362 |
27.1 Introduction | 363 |
27.2 Mixed designs and repeated measures | 363 |
Key points | 376 |
Computer analysis: Mixed design analysis of variance using SPSS | 376 |
Recommended further reading | 378 |
28 Analysis of covariance (ANCOVA): Controlling for additional variables | 379 |
Overview | 379 |
28.1 Introduction | 380 |
28.2 Analysis of covariance | 381 |
Key points | 391 |
Computer analysis: Analysis of covariance using SPSS | 392 |
Recommended further reading | 394 |
29 Multivariate analysis of variance (MANOVA) | 395 |
Overview | 395 |
29.1 Introduction | 396 |
29.2 MANOVA’s two stages | 399 |
29.3 Doing MANOVA | 401 |
29.4 Reporting your findings | 406 |
Key points | 407 |
Computer analysis: Multivariate analysis of variance using SPSS | 408 |
Recommended further reading | 410 |
30 Discriminant (function) analysis – especially in MANOVA | 411 |
Overview | 411 |
30.1 Introduction | 412 |
30.2 Doing the discriminant function analysis | 414 |
30.3 Reporting your findings | 420 |
Key points | 421 |
Computer analysis: Discriminant function analysis using SPSS | 422 |
Recommended further reading | 424 |
31 Statistics and analysis of experiments | 425 |
Overview | 425 |
31.1 Introduction | 426 |
31.2 The Patent Stats Pack | 426 |
31.3 Checklist | 427 |
31.4 Special cases | 431 |
Key points | 431 |
Computer analysis: Selecting subsamples of your data using SPSS | 433 |
Computer analysis: Recoding groups for multiple comparison tests using SPSS | 435 |
Part 4 More advanced correlational statistics | 437 |
32 Partial correlation: Spurious correlation, third or confounding variables, suppressor variables | 439 |
Overview | 439 |
32.1 Introduction | 440 |
32.2 Theoretical considerations | 441 |
32.3 Doing partial correlation | 443 |
32.4 Interpretation | 444 |
32.5 Multiple control variables | 445 |
32.6 Suppressor variables | 445 |
32.7 Example from the research literature | 446 |
32.8 Example from a student’s work | 447 |
Key points | 448 |
Computer analysis: Partial correlation using SPSS | 449 |
33 Factor analysis: Simplifying complex data | 451 |
Overview | 451 |
33.1 Introduction | 452 |
33.2 A bit of history | 453 |
33.3 Concepts in factor analysis | 454 |
33.4 Decisions, decisions, decisions | 456 |
33.5 Exploratory and confirmatory factor analysis | 464 |
33.6 Example of factor analysis from the literature | 466 |
33.7 Reporting the results | 468 |
Key points | 470 |
Computer analysis: Principal components analysis using SPSS | 471 |
Recommended further reading | 473 |
34 Multiple regression and multiple correlation | 474 |
Overview | 474 |
34.1 Introduction | 475 |
34.2 Theoretical considerations | 476 |
34.3 Assumptions of multiple regression | 481 |
34.4 Stepwise multiple regression example | 482 |
34.5 Reporting the results | 485 |
34.6 Example from the published literature | 486 |
Key points | 488 |
Computer analysis: Stepwise multiple regression using SPSS | 489 |
Recommended further reading | 491 |
35 Path analysis | 492 |
Overview | 492 |
35.1 Introduction | 493 |
35.2 Theoretical considerations | 493 |
35.3 Example from published research | 500 |
35.4 Reporting the results | 503 |
Key points | 504 |
Computer analysis: Hierarchical multiple regression using SPSS | 505 |
Recommended further reading | 507 |
36 Analysis of a questionnaire/survey project | 508 |
Overview | 508 |
36.1 Introduction | 509 |
36.2 Research project | 509 |
36.3 Research hypothesis | 511 |
36.4 Initial variable classification | 512 |
36.5 Further coding of data | 513 |
36.6 Data cleaning | 514 |
36.7 Data analysis | 514 |
Key points | 516 |
Computer analysis: Adding and averaging components of a measure using SPSS | 516 |
Part 5 Assorted advanced techniques | 519 |
37 Meta-analysis: Combining and exploring statistical findings from previous research | 521 |
Overview | 521 |
37.1 Introduction | 522 |
37.2 Pearson correlation coefficient as the effect size | 524 |
37.3 Other measures of effect size | 524 |
37.4 Effects of different characteristics of studies | 525 |
37.5 First steps in meta-analysis | 526 |
37.6 Illustrative example | 532 |
37.7 Comparing a study with a previous study | 535 |
37.8 Reporting the results | 536 |
Key points | 538 |
Computer analysis: Some meta-analysis software | 538 |
Recommended further reading | 539 |
38 Reliability in scales and measurement: Consistency and agreement | 540 |
Overview | 540 |
38.1 Introduction | 541 |
38.2 Item-analysis using item–total correlation | 541 |
38.3 Split-half reliability | 543 |
38.4 Alpha reliability | 544 |
38.5 Agreement among raters | 547 |
Key points | 551 |
Computer analysis: Cronbach’s alpha and kappa using SPSS | 552 |
Recommended further reading | 553 |
39 Influence of moderator variables on relationships between two variables | 554 |
Overview | 554 |
39.1 Introduction | 555 |
39.2 Statistical approaches to finding moderator effects | 559 |
39.3 Hierarchical multiple regression approach to identifying moderator effects (or interactions) | 559 |
39.4 ANOVA approach to identifying moderator effects (i.e. interactions) | 569 |
Key points | 573 |
Computer analysis: Regression moderator analysis using SPSS | 574 |
Recommended further reading | 575 |
40 Statistical power analysis: Getting the sample size right | 576 |
Overview | 576 |
40.1 Introduction | 577 |
40.2 Types of statistical power analysis and their limitations | 587 |
40.3 Doing power analysis | 589 |
40.4 Calculating power | 591 |
40.5 Reporting the results | 595 |
Key points | 596 |
Computer analysis: Power analysis with G*Power | 597 |
Part 6 Advanced qualitative or nominal techniques | 601 |
41 Log-linear methods: Analysis of complex contingency tables | 603 |
Overview | 603 |
41.1 Introduction | 604 |
41.2 Two-variable example | 606 |
41.3 Three-variable example | 613 |
41.4 Reporting the results | 624 |
Key points | 625 |
Computer analysis: Log-linear analysis using SPSS | 626 |
Recommended further reading | 627 |
42 Multinomial logistic regression: Distinguishing between several different categories or groups | 628 |
Overview | 628 |
42.1 Introduction | 629 |
42.2 Dummy variables | 631 |
42.3 What can multinomial logistic regression do? | 632 |
42.4 Worked example | 634 |
42.5 Accuracy of the prediction | 635 |
42.6 How good are the predictors? | 636 |
42.7 Prediction | 639 |
42.8 Interpreting the results | 641 |
42.9 Reporting the results | 641 |
Key points | 643 |
Computer analysis: Multinomial logistic regression using SPSS | 644 |
43 Binomial logistic regression | 646 |
Overview | 646 |
43.1 Introduction | 647 |
43.2 Typical example | 651 |
43.3 Applying the logistic regression procedure | 654 |
43.4 Regression formula | 658 |
43.5 Reporting the results | 659 |
Key points | 660 |
Computer analysis: Binomial logistic regression using SPSS | 661 |
Appendices | 663 |
Appendix A Testing for excessively skewed distributions | 663 |
Appendix B1 Large-sample formulae for the nonparametric tests | 666 |
Appendix B2 Nonparametric tests for three or more groups | 668 |
Computer analysis: Kruskal–Wallis and Friedman nonparametric tests using SPSS | 672 |
Appendix C Extended table of significance for the Pearson correlation coefficient | 674 |
Appendix D Table of significance for the Spearman correlation coefficient | 677 |
Appendix E Extended table of significance for the t-test | 680 |
Appendix F Table of significance for chi-square | 683 |
Appendix G Extended table of significance for the sign test | 684 |
Appendix H Table of significance for the Wilcoxon matched pairs test | 687 |
Appendix I Tables of significance for the Mann–Whitney U-test | 690 |
Appendix J Tables of significance values for the F-distribution | 693 |
Appendix K Table of significance values for t when making multiple t-tests | 696 |
Glossary | 699 |
References | 707 |
Index | 713 |