When is a sort not a sort...
I've been expanding on the classical image processing in Damocles lately, specifically the Rank Operator. It works like this: take all the pixel values around a central pixel, sort them by value, then replace the central pixel with the minimum / maximum / median / r-th pixel from the sorted list.
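For reference, the classical sort-based approach (which the rest of this post tries to beat) might look like this in C. This is just an illustrative sketch: `rank_by_sort` is a hypothetical name, it assumes 8-bit pixel data, and the fixed scratch buffer assumes windows no larger than 8x8.

```c
#include <stdlib.h>
#include <string.h>

/* Comparison callback for qsort on unsigned char pixels. */
static int cmp_pixel(const void *a, const void *b) {
    return (int)*(const unsigned char *)a - (int)*(const unsigned char *)b;
}

/* Classical rank operator for one neighbourhood: copy the window,
   sort it, and return the r-th value (r = 0 gives the minimum,
   r = n-1 the maximum, r = n/2 the median). */
unsigned char rank_by_sort(const unsigned char *window, int n, int r) {
    unsigned char tmp[64];          /* assumes windows up to 8x8 */
    memcpy(tmp, window, (size_t)n);
    qsort(tmp, (size_t)n, sizeof(unsigned char), cmp_pixel);
    return tmp[r];
}
```

Note how the full O(n log n) sort is done even though only one of the n sorted values is kept.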
Now, it seems obvious that you don't need to actually sort data to find the min or max value in a list; finding the min/max of a list is an O(n) operation. Any other position in the list, however, is usually computed by doing a full sort and then returning the r-th ranked item. A quick glance shows that you are performing an O(n log n) or O(n^2) sort in order to generate one number, then throwing away most of the data. Yet this is how most software performs median filtering. It's really no problem when your neighbourhood is small (a 3x3 cross or box), but as your kernel size grows, the sort becomes the biggest cost.
So how do we avoid it? By using a slightly different definition of median. Traditional processing says that for any N items, the median is the (N/2)th value after sorting. We can also use the following definition: the median is the value that has the same number of values greater than it as smaller than it. Formally,
Median when either
1) smaller == larger
2) smaller < larger AND smaller + equal >= larger
The second test allows for duplicate values in the list and for even values of N. This is a generalization of the min and max tests: the min test is smaller == 0, while the max test is larger == 0.
You can generalize this to any position in the list as follows:
(zero based) r'th in list when either
1) smaller == r
2) smaller < r AND smaller + equal > r
(The values equal to the candidate occupy sorted positions smaller through smaller + equal - 1, so they cover position r exactly when smaller <= r < smaller + equal.)
In this manner, we only need to keep track of how many items are smaller and how many are equal.
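The counting test above can be sketched as a small C function. `is_rank` is a hypothetical helper name, assuming 8-bit pixel data; it decides whether one candidate value holds a given rank, with no sorting at all.

```c
/* Hypothetical helper: report whether v is the (zero-based) r-th
   ranked value of list, by counting rather than sorting. */
int is_rank(const unsigned char *list, int length, int r, unsigned char v) {
    int smaller = 0, equal = 0, j;
    for (j = 0; j < length; j++) {
        if (list[j] < v) smaller++;
        else if (list[j] == v) equal++;
    }
    /* v would occupy sorted positions [smaller, smaller+equal-1];
       it holds rank r exactly when that interval contains r. */
    return smaller == r || (smaller < r && smaller + equal > r);
}
```

For example, in {1, 2, 2, 3} the value 2 holds ranks 1 and 2, while the value 1 holds only rank 0.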
Also, every time we test a value, we can use the result to fine-tune the next iteration. For example, if a value of, say, 63 is determined to be too small to be the median, we can ignore any values in the list <= 63 from then on. Likewise for values that are too big. By doing this, you progressively narrow the allowable range and avoid testing many values in the list.
Finally, when largest_possible == smallest_possible, you don't have to do any more testing at all; just return smallest_possible.
// ----------------------------------------------
// Function: Rank
// Parameters:
//   list:   an array of pixel values
//   length: number of entries in list
//   rank:   rank within list [0..length-1]
// (floating point data needs code changes)
// ----------------------------------------------
typedef unsigned char pixel_t;  // 8-bit image data; change for other types

pixel_t Rank(const pixel_t *list, int length, int rank) {
    pixel_t temp;
    int smallest = 0, largest = 255;  // current allowable range; kept as int
                                      // so temp-1 / temp+1 cannot wrap around
    int i, j, lower, same;
    for (i = 0; i < length; i++) {
        temp = list[i];
        if (temp <= largest && temp >= smallest) {
            // count how many entries sort below temp, and how many tie it
            for (lower = 0, same = 0, j = 0; j < length; j++) {
                if (list[j] < temp) lower++;
                else if (list[j] == temp) same++;
            }
            if (lower == rank) return temp;
            if (lower > rank) {
                // temp is too big: nothing >= temp can hold the rank
                largest = temp - 1;
            } else {
                // the run of values equal to temp may cover position rank
                if (lower + same > rank) return temp;
                // temp is too small: nothing <= temp can hold the rank
                smallest = temp + 1;
            }
            // range narrowed to a single value: no more testing needed
            if (smallest == largest) return (pixel_t)smallest;
        }
    }
    return (pixel_t)((smallest + largest) / 2);
}
A 3x3 rank is usually done with a modified bubble sort (for simplicity, code size, and its partial-sorting capability) that stops after r iterations of the outer loop. This routine compares well even with that.
This code performs well on random data, usually beating O(n log n) sorts (quicksort et al.). On identical data, it is O(1). The pathological worst case is when the numbers are arranged as max, min, max-1, min-1, max-2, min-2, etc.; then the operation is O(n^2) or O(n * range(type)), whichever is smaller. This is not a naturally occurring pattern in image data, however. On large lists, range(type) clamps the log(n) term to log(range(type)), so performing extremely large ranking operations becomes possible with no additional computation (free!)
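As a side note on that range(type) bound: for 8-bit data the same counting idea can be pushed to its extreme with a histogram, giving O(n + 256) work per window no matter how large the window gets. This is not the routine above, just a related sketch; `rank_by_histogram` is an illustrative name.

```c
/* Counting-based rank for 8-bit data: build a 256-bin histogram of the
   neighbourhood, then walk it until rank+1 samples have been seen.
   Work is O(n + 256) per window, which is the sense in which
   range(type) caps the cost of very large windows. */
unsigned char rank_by_histogram(const unsigned char *list, int length, int rank) {
    int hist[256] = {0};
    int i, seen = 0;
    for (i = 0; i < length; i++) hist[list[i]]++;   /* O(n) tally */
    for (i = 0; i < 256; i++) {                     /* O(256) walk */
        seen += hist[i];
        if (seen > rank) return (unsigned char)i;
    }
    return 255;  /* unreachable for valid rank in [0, length-1] */
}
```

The histogram also updates incrementally as the window slides across an image, which is how very large median filters are often made cheap in practice.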
Oscar
2 Comments:
OMG, is this what you were talking to your wife about? Thank goodness for blogging.
By Dee, at 10:33 PM
It has saved my marriage!
lol
Oscar
By Oscar Papel, at 12:12 AM