The dependence of superlattice thermal conductivity on period length is investigated by molecular dynamics simulation. For perfectly lattice-matched superlattices, a minimum is observed when the period length is of the order of the effective phonon mean free path. As temperature decreases and interatomic potential strength increases, the position of the minimum shifts to larger period lengths. The depth of the minimum is strongly enhanced as the mass and interatomic potential ratios of the constituent materials increase. The simulation results are consistent with phonon transmission coefficient calculations, which indicate increased stop bandwidth, and thus strongly enhanced Bragg scattering, under the same conditions for which strong reductions in thermal conductivity are found. When nonideal interfaces are created by introducing a 4% lattice mismatch, the minimum disappears and thermal conductivity increases monotonically with period length. This result may explain why a thermal conductivity minimum has not been observed in a large number of experimental studies.
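The link between mass ratio and stop bandwidth can be illustrated with a minimal one-dimensional harmonic-chain transfer-matrix sketch. This is not the calculation used in the paper; the uniform spring constant, the layer thickness of four atoms, and the unit masses are illustrative assumptions. Frequencies for which the magnitude of the trace of the period transfer matrix exceeds 2 admit no propagating Bloch wave, i.e. they lie in a stop band:

```python
import numpy as np

def period_matrix(omega, masses, k=1.0):
    """Transfer matrix for one superlattice period of a 1D harmonic chain
    with uniform spring constant k: u_{n+1} = (2 - m_n*omega^2/k)*u_n - u_{n-1}."""
    M = np.eye(2)
    for m in masses:
        T = np.array([[2.0 - m * omega**2 / k, -1.0],
                      [1.0, 0.0]])
        M = T @ M
    return M

def stopband_fraction(mass_ratio, atoms_per_layer=4, n_freq=2000, k=1.0):
    """Fraction of frequencies below the light-chain cutoff that fall in
    stop bands; |trace(M)| > 2 forbids a propagating Bloch solution."""
    # One period: a layer of unit-mass atoms, then a layer of heavier atoms.
    masses = [1.0] * atoms_per_layer + [mass_ratio] * atoms_per_layer
    omega_max = 2.0 * np.sqrt(k / 1.0)  # cutoff frequency of the lighter chain
    omegas = np.linspace(1e-4, omega_max, n_freq)
    blocked = sum(abs(np.trace(period_matrix(w, masses, k))) > 2.0
                  for w in omegas)
    return blocked / n_freq

if __name__ == "__main__":
    for r in (1.0, 2.0, 4.0):
        print(f"mass ratio {r}: stop-band fraction = {stopband_fraction(r):.3f}")
```

In this toy model a homogeneous chain (mass ratio 1) has no stop bands, while increasing the mass ratio widens the forbidden gaps, consistent with the stronger Bragg scattering, and the deeper conductivity minimum, reported above for larger mass ratios.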
Date Posted: 27 February 2006
This document has been peer reviewed.