[Figures: Example of census transform — a synthetic scene; grayscale conversion followed by census transform]

The census transform (CT) is an image operator that associates to each pixel of a grayscale image a binary string, encoding whether the pixel has smaller intensity than each of its neighbours, one bit per neighbour. It is a non-parametric transform that depends only on the relative ordering of intensities, and not on the actual intensity values, making it invariant with respect to monotonic variations of illumination, and it behaves well in the presence of multimodal distributions of intensity, e.g. along object boundaries. It has applications in computer vision, and it is commonly used in visual correspondence problems such as optical flow calculation and disparity estimation.

The census transform is related to the rank transform, which associates to each pixel the number of neighbouring pixels with higher intensity than the pixel itself, and was introduced in the same paper.
Algorithm

The most common version of the census transform uses a 3×3 window, comparing each pixel p with each of its 8-connected neighbours p′ using a function ξ defined as

\xi(p,p') = \begin{cases} 0 & \text{if } p > p' \\ 1 & \text{if } p \leq p' \end{cases}

The results of these comparisons are concatenated, and the value of the transform is an 8-bit value that can easily be encoded in a byte.
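A minimal sketch of this 3×3 census transform in Python (the function name, row-wise neighbour order, and zero-filled borders are illustrative assumptions; real implementations differ in scan order and border handling):

```python
import numpy as np

def census_transform_3x3(img):
    """Census transform with a 3x3 window.

    Each interior pixel gets an 8-bit code: a bit is 1 where the centre
    pixel is <= the neighbour. Neighbours are taken in row-wise order
    (an assumption); border pixels are left as 0 for simplicity.
    """
    img = np.asarray(img, dtype=np.int32)
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    # Row-wise order of the 8 neighbours around the centre pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1),
               (0, -1),           (0, 1),
               (1, -1),  (1, 0),  (1, 1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            code = 0
            for dy, dx in offsets:
                code = (code << 1) | (img[y, x] <= img[y + dy, x + dx])
            out[y, x] = code
    return out

# The worked example from the text: the centre pixel 64 maps to 11010111.
patch = [[124, 74, 32],
         [124, 64, 18],
         [157, 116, 84]]
print(format(int(census_transform_3x3(patch)[1, 1]), '08b'))  # 11010111
```

Using integer comparisons only, the transform is cheap to compute and, as noted above, insensitive to any monotonic rescaling of the intensities.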
894:
893:
879:
878:
867:
838:
830:
825:
821:
812:
810:
805:
804:
800:
795:
788:
783:
779:
774:
770:
766:
754:
729:
728:
723:
718:
712:
711:
706:
701:
695:
694:
689:
684:
677:
671:
670:
665:
660:
654:
653:
648:
643:
637:
636:
631:
626:
619:
617:
614:
613:
590:
589:
569:
564:
562:
556:
555:
544:
536:
525:
520:
518:
512:
511:
497:
486:
484:
474:
473:
459:
445:
442:
441:
422:
419:
418:
388:
387:
382:
377:
371:
370:
365:
360:
354:
353:
348:
343:
336:
330:
329:
324:
319:
313:
312:
307:
302:
296:
295:
290:
285:
278:
276:
273:
272:
242:
241:
233:
222:
220:
214:
213:
205:
194:
192:
182:
181:
167:
153:
150:
149:
130:
127:
126:
110:
107:
106:
103:
76:computer vision
51:
50:
49:
48:
47:
44:
35:
34:
33:
30:
21:
20:
12:
11:
5:
902:
892:
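The Hamming distance between two census codes is simply the number of set bits in their XOR; a minimal sketch (the function name is illustrative):

```python
def hamming_distance(a, b):
    """Number of bit positions in which two census codes differ."""
    return bin(a ^ b).count('1')

# Two 8-bit census codes differing in two bit positions.
print(hamming_distance(0b11010111, 0b11000110))  # 2
```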
An extension of the algorithm uses a three-way comparison that makes it possible to represent similar pixels, whose intensity difference is smaller than a tolerance parameter ε, defined as

\xi(p,p') = \begin{cases} 0 & \text{if } p - p' > \epsilon \\ 1 & \text{if } |p - p'| \leq \epsilon \\ 2 & \text{if } p' - p > \epsilon \end{cases}

whose result can be encoded with two bits for each neighbour, thus doubling the size of the pattern for each pixel. With the same example as above and a tolerance of, for instance, ε = 10:

\begin{array}{|c|c|c|}\hline 124 & 74 & 32 \\\hline 124 & 64 & 18 \\\hline 157 & 116 & 84 \\\hline \end{array} \longrightarrow \begin{array}{|c|c|c|}\hline 2 & 1 & 0 \\\hline 2 & x & 0 \\\hline 2 & 2 & 2 \\\hline \end{array} \longrightarrow 21020222
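The three-way comparison can be sketched as follows for a single 3×3 patch (the function name and the string-valued output are illustrative; eps plays the role of the tolerance ε):

```python
def ternary_census_3x3(patch, eps):
    """Three-way census comparison of a 3x3 patch's centre pixel.

    Returns one digit per neighbour (row-wise order, an assumption):
    0 if the centre exceeds the neighbour by more than eps,
    1 if they differ by at most eps, 2 otherwise.
    """
    p = patch[1][1]
    neighbours = [patch[y][x] for y in range(3) for x in range(3)
                  if (y, x) != (1, 1)]
    digits = []
    for q in neighbours:
        if p - q > eps:
            digits.append('0')
        elif abs(p - q) <= eps:
            digits.append('1')
        else:  # q - p > eps
            digits.append('2')
    return ''.join(digits)

# The worked example from the text, with eps = 10.
patch = [[124, 74, 32],
         [124, 64, 18],
         [157, 116, 84]]
print(ternary_census_3x3(patch, eps=10))  # 21020222
```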
See also

Local binary patterns

References

1. Zabih and Woodfill (1994), p. 152.
2. Hafner et al. (2013).
3. Zabih and Woodfill (1994), p. 153.
4. "Census Transform Algorithm Overview". Intel. Retrieved 2019-06-05.
5. Stein (2004).

Hafner, David; Demetz, Oliver; Weickert, Joachim (2013). "Why is the census transform good for robust optic flow computation?" (PDF). International Conference on Scale Space and Variational Methods in Computer Vision. pp. 210–221.
Stein, Fridtjof (2004). "Efficient computation of optical flow using the census transform". Joint Pattern Recognition Symposium. pp. 79–86.
Zabih, Ramin; Woodfill, John (1994). "Non-parametric local transforms for computing visual correspondence" (PDF). European Conference on Computer Vision. pp. 151–158.