This work presents an FPGA implementation of a highly parallel architecture for motion and disparity estimation from color images. Our system implements the well-known Lucas & Kanade algorithm with a multi-scale extension for the computation of large displacements using color cues. We empirically meet real-time requirements, computing up to 32 and 36 frames per second for optical flow and disparity, respectively, at 640 × 480 resolution. In this paper, we present our design technique based on fine-grained pipelines, our architecture, and benchmarks of the different color-based alternatives, analyzing the trade-off between accuracy and resource utilization. We finally include qualitative results and the resource utilization of our platform, concluding that our system achieves a good trade-off between the increase in resources and the improvement in the precision and density of the results compared with other approaches described in the literature.
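For readers unfamiliar with the estimator underlying the hardware design, the following is a minimal single-scale software sketch of the Lucas & Kanade least-squares flow computation, not the paper's FPGA architecture: it omits the multi-scale (pyramidal) warping extension and the color cues discussed above, and the function name `lucas_kanade` and window size `win` are illustrative choices.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def lucas_kanade(frame1, frame2, win=7):
    """Single-scale Lucas & Kanade dense optical flow (per-pixel 2x2 solve).

    frame1, frame2 -- grayscale images of the same shape (floats).
    Returns (u, v), the horizontal and vertical displacement estimates;
    pixels with an ill-conditioned structure tensor are set to NaN.
    """
    I1 = frame1.astype(np.float64)
    I2 = frame2.astype(np.float64)

    # Spatial derivatives of the first frame and the temporal derivative.
    Iy, Ix = np.gradient(I1)
    It = I2 - I1

    # Windowed (local) averages of the structure-tensor terms.
    Ixx = uniform_filter(Ix * Ix, size=win)
    Iyy = uniform_filter(Iy * Iy, size=win)
    Ixy = uniform_filter(Ix * Iy, size=win)
    Ixt = uniform_filter(Ix * It, size=win)
    Iyt = uniform_filter(Iy * It, size=win)

    # Closed-form solution of the 2x2 system  A [u, v]^T = -b  per pixel.
    det = Ixx * Iyy - Ixy ** 2
    det = np.where(np.abs(det) < 1e-9, np.nan, det)
    u = (-Iyy * Ixt + Ixy * Iyt) / det
    v = (Ixy * Ixt - Ixx * Iyt) / det
    return u, v


if __name__ == "__main__":
    # Toy check: a pattern shifted one pixel to the right should yield u ~ 1.
    rng = np.random.default_rng(0)
    a = uniform_filter(rng.random((64, 64)), size=3)   # smooth random texture
    b = np.roll(a, 1, axis=1)
    u, v = lucas_kanade(a, b)
    print(np.nanmedian(u), np.nanmedian(v))
```

A multi-scale implementation, such as the one realized in hardware in this work, applies this estimator on a coarse-to-fine image pyramid, warping each level by the upsampled flow from the coarser level so that only small residual displacements remain to be solved.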