This dissertation describes the foundation for maintaining TIMSS’ 20-year trend measurements through the introduction of a new computer- and tablet-based mode of assessment delivery, eTIMSS. Because of the potential for mode effects on the psychometric behavior of the trend items that TIMSS relies on to maintain comparable scores across assessment cycles, development efforts for TIMSS 2019 began more than three years in advance. This dissertation documents the development of eTIMSS over this period and presents the methodology and results of the eTIMSS Pilot/Item Equivalence Study. The study was conducted in 25 countries and employed a within-subjects, counterbalanced design to determine the effect of mode of administration on the trend items. Further analysis examined score-level mode effects in relation to students’ socioeconomic status, gender, and self-efficacy for using digital devices. Strategies are discussed for mitigating threats of construct-irrelevant variance to students’ eTIMSS performance. The subgroup analyses, together with similar item discriminations, high cross-mode correlations, and equivalent rankings of country means, support the equivalence of the mathematics and science constructs between paperTIMSS and eTIMSS. However, the results revealed an overall mode effect on the TIMSS trend items: items were more difficult for students in digital formats than on paper, and the effect was larger in mathematics than in science. An approach is therefore needed to account for the mode effects when carrying trend measurements forward from previous cycles to TIMSS 2019. Each eTIMSS 2019 trend country will administer the paper trend booklets to an additional nationally representative bridge sample of students, and a common population equating approach will ensure the link between paperTIMSS and eTIMSS scores.