MT/s vs MHz – what’s the difference?

MT/s (megatransfers per second) and MHz (megahertz) are both units of measurement, describing data transfer rates and signal frequencies respectively, but they represent different aspects of electronic systems.

Megahertz (MHz)

This unit measures the frequency of a signal, specifically in multiples of one million cycles per second (or Hertz). In numbers this is 1,000,000 = 1×10⁶ cycles every second. For example, 5 MHz is equivalent to 5,000,000 Hz.

In the context of processors or oscillators, MHz refers to the clock speed, indicating how many cycles a processor can perform in a second. It’s a measure of the operational frequency of the device, giving an idea of how many operations per second the device can perform.

Essentially, the higher the MHz value, the faster the processor, all else (architecture, core count, work done per cycle) being equal.

MegaTransfers per second (MT/s)

This unit measures the number of data transfers or operations that can occur in one second, specifically one million transfers per second.

It is commonly used to describe the data transfer rate in memory (RAM) and storage interfaces. For instance, DDR (Double Data Rate) memory can transfer data twice per clock cycle, making its transfer rate in MT/s double its clock frequency in MHz. This means if a DDR memory module has a clock frequency of 1600 MHz, its data transfer rate is 3200 MT/s.

Reference: QDR vs DDR vs SDR
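The clock-to-transfer-rate relationship above can be sketched in a few lines of Python. This is an illustrative calculation, not a real API; the transfers-per-cycle factors follow the SDR/DDR/QDR convention (one, two, or four transfers per clock cycle):

```python
# Transfers per clock cycle for common signaling schemes:
# SDR = one transfer per cycle, DDR = two (both clock edges), QDR = four.
TRANSFERS_PER_CYCLE = {"SDR": 1, "DDR": 2, "QDR": 4}

def transfer_rate_mts(clock_mhz, signaling="DDR"):
    """Effective data rate in MT/s for a given clock frequency in MHz."""
    return clock_mhz * TRANSFERS_PER_CYCLE[signaling]

# The example from the text: a 1600 MHz DDR clock yields 3200 MT/s.
print(transfer_rate_mts(1600, "DDR"))  # → 3200
```

The same clock therefore yields very different effective rates depending on how many transfers the interface performs per cycle.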

👉 Take this calculation one step further to calculate the number of bits transferred every second.
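As a sketch of that next step: multiplying the transfer rate by the width of the data bus gives the raw throughput in bits per second. The 64-bit bus width below is an assumption, chosen because it is typical of a single DDR memory channel:

```python
def bandwidth_bits_per_s(rate_mts, bus_width_bits=64):
    """Bits transferred per second: (transfers/s) × (bits per transfer).

    rate_mts is in megatransfers per second; bus_width_bits is assumed
    to be 64, the usual width of one DDR memory channel.
    """
    return rate_mts * 1_000_000 * bus_width_bits

# 3200 MT/s over a 64-bit bus:
bits = bandwidth_bits_per_s(3200)
print(bits)            # → 204800000000 (204.8 Gbit/s)
print(bits // 8)       # → 25600000000 bytes/s, i.e. 25.6 GB/s
```

Dividing by eight converts to bytes per second, which is how memory bandwidth is usually quoted (e.g. 25.6 GB/s for DDR4-3200).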


In short: MHz refers to the number of cycles per second, a measure of frequency, whereas MT/s refers to the number of data transfer operations that occur in a second.

When comparing the two, it’s important to remember that MT/s is often used in contexts where the number of operations per cycle can exceed one, such as in Double Data Rate memory, where the effective rate of data transfers is double the clock speed in MHz.