Accurate and consistent statistical reporting is a central part of APA style. Your audience should be able to understand what you measured, how strong the results were, and how precise the estimates are without needing access to your original data.
The APA’s 7th edition provides specific rules for writing numbers, using decimal places, choosing symbols, and presenting major tests such as t-tests, F-tests, χ² tests, regression results, and confidence intervals.
This article explains the key rules step by step and provides practical, ready-to-use examples that you can adjust for your own academic writing.
In APA style, the primary purpose of the results section is to provide enough statistical information so that the findings can be understood and reproduced, without adding unnecessary detail.
This generally includes descriptive statistics for each group, the inferential tests you ran, and the precision of your estimates.
At the very least, for every inferential analysis, APA requires you to include the test statistic, its degrees of freedom, the exact p value, and, where available, an effect size.
APA style includes many rules about numbers, but when it comes to statistics and measurements, the main rule is straightforward: use numerals.
Use numerals for statistics and mathematical functions, units of measurement, percentages, ratios, percentiles, times and dates, ages, scores, and sample or subsample sizes.
Write out numbers as words mainly in non-statistical sentences, for example:
At the start of a sentence: “Two additional checks were conducted…”
When describing a count as an idea rather than a specific measurement: “A response of one indicated strong disagreement.”
Moreover, avoid beginning a sentence with lengthy or complicated numbers by rephrasing the sentence:
Not preferred: “Twenty-seven participants had missing data.”
Improved: “Missing data occurred for 27 participants.”
Follow these rules for punctuation and plural forms: use commas in numbers of 1,000 or more, and add s alone (no apostrophe) to form plurals of numbers, for example, 6s and the 1990s.
The goal is to present numbers with clarity and useful accuracy: round enough to keep the text clean, but not so much that key details disappear.
The following are the standard guidelines:
Report exact p values to two or three decimal places (e.g., p = .032); when a value falls below .001, report p < .001 rather than an exact figure.
Be sure to keep the same level of rounding within the same table, figure, or group of related numbers.
For instance, if one correlation is shown as r = .25, do not report another as r = .247 in that same table unless there is a specific justification.
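These rounding rules are mechanical enough to script. As an illustration only (the helper name `format_p` is my own invention, not an APA or library function), a short Python sketch:

```python
def format_p(p: float) -> str:
    """Format a p value in APA style: exact to three decimals, no
    leading zero, and "p < .001" for very small values."""
    if p < 0.001:
        return "p < .001"
    # Drop the leading zero because p can never exceed 1.
    return f"p = {p:.3f}".replace("0.", ".", 1)

print(format_p(0.032))    # p = .032
print(format_p(0.00004))  # p < .001
```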
APA style distinguishes between statistics that can exceed one and those that cannot.
When a statistic can be above 1 (such as means, SDs, t, F, χ², z), include a leading zero:
M = 0.75
SD = 0.62
t(28) = 2.45
F(2, 58) = 3.17
When a statistic cannot exceed 1 (such as proportions, correlations, p values, and some effect sizes like r), do not use a leading zero:
p = .032
r = .46
Proportion correct = .84
This distinction is one of the clearest indicators that numerical reporting aligns with APA formatting expectations.
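The leading-zero rule can also be captured in code. A minimal Python sketch (`apa_number` is a hypothetical helper, not a standard function):

```python
def apa_number(value: float, bounded: bool, decimals: int = 2) -> str:
    """Round to the given number of decimals; strip the leading zero
    only for statistics that cannot exceed 1 (p, r, proportions)."""
    s = f"{value:.{decimals}f}"
    # "0.46" -> ".46" and "-0.46" -> "-.46"; M, SD, t, F keep the zero.
    return s.replace("0.", ".", 1) if bounded else s

print(f"M = {apa_number(0.75, False)}")  # M = 0.75
print(f"r = {apa_number(0.46, True)}")   # r = .46
```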
Not all research papers require displayed equations, but when they are included, APA guidelines require them to be presented clearly, easily readable, and consistently styled throughout the document. This helps readers follow your logic without confusion.
Use inline math for short and simple mathematical expressions that fit smoothly within a sentence:
“The mean difference was computed as X̄₁ − X̄₂.”
Use a displayed equation (centred on a separate line) for longer or more complicated formulas. These should be numbered only when the equation is mentioned more than once in the text, which allows readers to locate it easily:
t = (X̄₁ − X̄₂) / √(s₁²/n₁ + s₂²/n₂)
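For concreteness, the displayed formula can be evaluated directly. Here is a Python sketch (the function name and sample values are my own; this assumes the unequal-variance form shown above):

```python
import math

def welch_t(m1: float, s1: float, n1: int,
            m2: float, s2: float, n2: int) -> float:
    """t = (x̄1 − x̄2) / sqrt(s1²/n1 + s2²/n2), term by term."""
    return (m1 - m2) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)

# Hypothetical groups of 50, reusing the means and SDs from the text.
print(round(welch_t(3.66, 0.40, 50, 3.21, 0.35, 50), 2))  # 5.99
```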
APA makes a clear distinction between how statistical terms should appear in regular sentences and how symbols should be written when paired with numerical values.
Means and standard deviations are fundamental descriptive statistics used in most APA-style research papers. The standard inline format is:
M = value, SD = value
Place these in parentheses immediately after the group or condition being described:
“Women (M = 3.66, SD = 0.40) reported higher happiness levels than men (M = 3.21, SD = 0.35).”
Multiple groups and conditions
When comparing factor levels, name the factor first and then report the means and SDs for each level:
“Participants in the mindfulness condition reported lower stress (M = 2.41, SD = 0.62) than those in the control condition (M = 3.18, SD = 0.74).”
For studies with many groups or multiple outcomes, place descriptive statistics in a table and summarise the main patterns and contrasts in the text.
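To illustrate how such parentheticals can be generated from raw scores, here is a small standard-library Python sketch (`describe` and the scores are my own invention):

```python
import statistics

def describe(label: str, scores: list[float]) -> str:
    """Build an APA-style parenthetical such as "men (M = 3.21, SD = 0.35)"."""
    m = statistics.mean(scores)
    sd = statistics.stdev(scores)  # sample SD (n − 1 denominator)
    return f"{label} (M = {m:.2f}, SD = {sd:.2f})"

print(describe("control", [3.1, 2.9, 3.4, 3.0, 3.6]))
# control (M = 3.20, SD = 0.29)
```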
The chi-square test examines whether observed frequencies deviate from expected frequencies. APA style emphasises reporting the chi-square statistic (χ²), degrees of freedom, sample size, p-value, and effect size where applicable.
The standard presentation is:
χ²(df, N = sample size) = value, p = value
“The relationship between gender and voting preference was significant, χ²(1, N = 120) = 4.36, p = .037.”
For chi-square analyses, typical effect sizes are phi (φ) for 2×2 tables and Cramer’s V for larger tables.
“There was a moderate association between experimental condition and response type, χ²(2, N = 210) = 12.54, p = .002, V = .24.”
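The χ² statistic and Cramér's V can be computed by hand from a contingency table. A self-contained Python sketch (the function name is hypothetical, and an exact p value would still come from your statistics software):

```python
import math

def chi_square_report(table: list[list[int]]) -> str:
    """Pearson χ² for a contingency table (rows of observed counts)
    plus Cramér's V, which equals phi for a 2×2 table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    # Sum of (observed − expected)² / expected over every cell.
    chi2 = sum((obs - rt * ct / n) ** 2 / (rt * ct / n)
               for rt, row in zip(row_totals, table)
               for ct, obs in zip(col_totals, row))
    df = (len(row_totals) - 1) * (len(col_totals) - 1)
    v = math.sqrt(chi2 / (n * (min(len(row_totals), len(col_totals)) - 1)))
    v_str = f"{v:.2f}".replace("0.", ".", 1)  # no leading zero: V ≤ 1
    return f"χ²({df}, N = {n}) = {chi2:.2f}, V = {v_str}"

print(chi_square_report([[10, 20], [20, 10]]))  # χ²(1, N = 60) = 6.67, V = .33
```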
z tests are less commonly reported explicitly because most software reports t tests, but when used, the pattern is simple:
z = value, p = value
“Participants scored higher than the normative mean, z = 2.47, p = .014.”
For t-tests, APA requires the t-value, degrees of freedom, p-value, and descriptive statistics for each group.
t(df) = value, p = value
“Women (M = 3.66, SD = 0.40) reported significantly higher happiness than men (M = 3.21, SD = 0.35), t(98) = 2.33, p = .022.”
For t tests, report Cohen’s d or another effect size: “…, t(98) = 2.33, p = .022, d = 0.47.”
“A one‑sample t test indicated that United fans reported higher stress (M = 83.00, SD = 5.00) than the population norm of 80, t(48) = 2.30, p = .026.”
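The one-sample case is simple enough to compute by hand. A Python sketch with made-up scores (it returns t and Cohen's d only; the p value would come from software or a t table):

```python
import statistics

def one_sample_t(scores: list[float], mu: float):
    """One-sample t, its df, and Cohen's d against a population norm mu."""
    m = statistics.mean(scores)
    sd = statistics.stdev(scores)
    n = len(scores)
    t = (m - mu) / (sd / n ** 0.5)  # standard error in the denominator
    d = (m - mu) / sd               # standardised mean difference
    return t, n - 1, d

t_val, df, d = one_sample_t([82, 85, 79, 88, 84, 81], 80)
print(f"t({df}) = {t_val:.2f}, d = {d:.2f}")  # t(5) = 2.43, d = 0.99
```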
ANOVA examines differences across three or more means. In APA style, report the F statistic, its degrees of freedom, the p-value, and an effect size, such as eta squared (η²) or partial eta squared (ηp²).
Format:
F(dfbetween, dfwithin) = value, p = value, η² = value
Example:
“There was a significant effect of the year in college on stress scores, F(3,98) = 4.21, p = .008, η² = .11.”
Interpretation should indicate which groups differ (using post‑hoc comparisons) and the direction of differences:
“Post‑hoc Tukey tests showed that seniors reported higher stress than first‑year students, while differences between first‑ and second‑year students were not significant.”
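A one-way ANOVA's F and η² follow directly from the between- and within-group sums of squares. A standard-library Python sketch (illustrative data; the p value and post-hoc tests would come from software):

```python
import statistics

def one_way_anova(groups: list[list[float]]):
    """F(df_between, df_within) and eta squared for a one-way design."""
    scores = [x for g in groups for x in g]
    grand = statistics.mean(scores)
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = len(groups) - 1, len(scores) - len(groups)
    f = (ss_between / df_b) / (ss_within / df_w)    # ratio of mean squares
    eta_sq = ss_between / (ss_between + ss_within)  # proportion of variance
    return f, df_b, df_w, eta_sq

f_val, df_b, df_w, eta = one_way_anova([[1, 2, 3], [2, 3, 4], [4, 5, 6]])
eta_str = f"{eta:.2f}".replace("0.", ".", 1)  # η² cannot exceed 1
print(f"F({df_b}, {df_w}) = {f_val:.2f}, η² = {eta_str}")  # F(2, 6) = 7.00, η² = .70
```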
Regression results include a large amount of numerical output, so presenting them in tables is usually the most effective approach. In the written text, APA guidelines suggest briefly showing the main findings:
Standard format:
R² = value, F(dfregression, dfresidual) = value, p = value
Confidence intervals (CIs) indicate how accurate an estimate is and are considered a basic requirement in APA-style results, just like effect sizes. They give readers an idea of the range in which the true value is likely to fall and how stable your findings are.
APA style presents CIs using square brackets, and a comma separates the lower and upper limits:
95% CI [LL, UL]
This format helps readers instantly recognise the interval’s range.
“The mean stress score was 3.21 (SD = 0.54), 95% CI [3.10, 3.32].”
This means the researcher is 95% confident that the true mean stress score lies somewhere between 3.10 and 3.32.
Report confidence intervals alongside:
Means
Mean differences
Regression coefficients
Effect sizes such as d, r, and η²
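As a rough illustration, a normal-approximation CI for a mean can be computed with the standard library (`ci95` and the data are my own; for small samples a t-based critical value would give a slightly wider interval):

```python
from statistics import NormalDist, mean, stdev

def ci95(scores: list[float]) -> tuple[float, float]:
    """Approximate 95% CI for a mean using the normal critical value."""
    m = mean(scores)
    se = stdev(scores) / len(scores) ** 0.5  # standard error of the mean
    z = NormalDist().inv_cdf(0.975)          # ≈ 1.96
    return m - z * se, m + z * se

lo, hi = ci95([3.1, 3.3, 3.2, 3.4, 3.0, 3.2, 3.3, 3.1])
print(f"95% CI [{lo:.2f}, {hi:.2f}]")  # 95% CI [3.11, 3.29]
```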
For every inferential test, APA suggests providing enough detail so that another researcher can follow your procedure and duplicate the analysis. This usually includes the test type, the relevant statistic, its degrees of freedom, the precise p-value, descriptive summaries for groups, and, when available, an effect size with a confidence interval.
Present non-significant findings using the same structure as significant ones, but clearly note that the effect was not statistically meaningful while giving the supporting values. For instance, a non-significant ANOVA or correlation should still list its statistic and p-value. Avoid broad claims like “no effect.”
Yes, unless the value is below .001. APA guidance states that researchers should provide exact p values, except when they fall below .001, in which case p < .001 is reported. Writing p = .000 is incorrect because probabilities never equal zero. When using exact values, mention the significance level in your Methods section or early in the Results.