Stride of an array

In computer programming, the stride of an array (also increment or step size) refers to the number of locations in memory between the beginnings of successive array elements, measured in bytes or in units of the size of the array's elements. The stride cannot be smaller than the element size, but it can be larger, indicating extra space between elements.

An array with stride 1 (measured in element units) has elements that are contiguous in memory. Such arrays are sometimes said to have unit stride. Unit-stride arrays are generally more efficient to traverse than non-unit-stride arrays because caches fetch memory in contiguous blocks (cache lines): sequential unit-stride accesses use every byte of each line fetched, whereas a large stride may touch a new cache line on every access.
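
As a minimal sketch (the matrix and its dimensions here are invented for illustration, not taken from the article), the two loops below traverse the same row-major C array. The first advances through memory with unit stride; the second jumps COLS elements between accesses, the pattern that tends to use caches poorly once the array is large:

#include <stdio.h>

#define ROWS 4
#define COLS 4

int main(void)
{
    int matrix[ROWS][COLS];
    int i, j;

    /* Row-major fill: consecutive accesses are adjacent in memory
       (unit stride), so each cache line fetched is fully used. */
    for (i = 0; i < ROWS; ++i)
        for (j = 0; j < COLS; ++j)
            matrix[i][j] = i * COLS + j;

    /* Column-wise read of the same array: consecutive accesses are
       COLS elements apart (non-unit stride), so with large enough
       dimensions each access can land on a different cache line. */
    for (j = 0; j < COLS; ++j)
        for (i = 0; i < ROWS; ++i)
            printf("%d ", matrix[i][j]);
    printf("\n");

    return 0;
}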

Reasons for non-unit stride

There are at least two reasons arrays may have a stride larger than their elements' width in bytes. First, many languages (including C and C++) allow structures to be padded to better take advantage of the word length of the machine. For example:

struct ThreeBytesWide {
    char a[3];
};

struct ThreeBytesWide myArray[100];

In the above code snippet, myArray might well turn out to have a stride of four bytes, rather than three, if the C code were compiled for a 32-bit architecture.
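
Whether such padding occurs depends on the compiler and the target platform. One way to observe the stride the compiler actually chose is to print the size of the element type, as in this minimal sketch reusing the structure from the snippet above:

#include <stdio.h>

struct ThreeBytesWide {
    char a[3];
};

int main(void)
{
    struct ThreeBytesWide myArray[100];

    /* sizeof reports the per-element stride the compiler chose,
       including any padding bytes it inserted. */
    printf("stride: %zu bytes\n", sizeof myArray[0]);
    return 0;
}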

Second, some languages allow arrays of structures to be treated as overlapping parallel arrays with non-unit stride:

#include <stdio.h>

struct MyRecord {
    int value;
    char *text;
};

/* Print "length" ints spaced "stride" bytes apart in memory. */
void print_some_ints(const int *arr, int length, size_t stride)
{
    int i;
    for (i = 0; i < length; ++i) {
        printf("%d\n", arr[0]);
        /* Pointer arithmetic on int pointers moves in whole ints,
           so convert to a character pointer to advance by bytes. */
        arr = (const int *)((const unsigned char *)arr + stride);
    }
}

int main(void)
{
    int ints[100] = {0};
    struct MyRecord records[100] = {0};

    /* Unit stride: the ints are contiguous, so successive elements
       are sizeof(int) bytes apart. */
    print_some_ints(&ints[0], 100, sizeof ints[0]);

    /* Non-unit stride: each int is embedded in a struct MyRecord,
       so successive value members are sizeof(struct MyRecord)
       bytes apart. */
    print_some_ints(&records[0].value, 100, sizeof records[0]);
    return 0;
}

This idiom is a form of type punning. It works because C guarantees that a structure's first member begins at offset zero, so &records[0].value also points to the start of the records array, and each subsequent value member lies exactly sizeof(struct MyRecord) bytes later.
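
For comparison, a variant not in the original article: the same values can be printed without type punning by indexing the structure array directly. The strided form is mainly useful when a function receives only a base pointer and a byte stride; when the structure type is visible at the loop site, direct indexing needs no casts:

#include <stdio.h>

struct MyRecord {
    int value;
    char *text;
};

int main(void)
{
    struct MyRecord records[100] = {0};
    int i;

    /* Direct indexing: the compiler applies the structure's
       stride implicitly, with no byte arithmetic required. */
    for (i = 0; i < 100; ++i)
        printf("%d\n", records[i].value);
    return 0;
}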