The SARAFun project goes beyond traditional industrial assembly applications, where unimanual robotic setups are the norm, by employing dual-arm human-sized robots to perform assembly tasks typically executed by humans, in dynamic environments originally designed for human use. In this talk I will present a dual-arm robot control strategy for assembly that can easily accommodate the geometric uncertainty that typically arises in environments where humans and robots coexist. The framework incorporates coordinated task-space control, a simple 2-DoF articulated-object model that captures a folding assembly task, and a kinesthetic perception module based on external force/torque measurements and proprioception. The latter identifies the object's degrees of freedom in real time while estimating the positions of its joints. Experiments with a "toy" articulated object, as well as with its analogous assembly parts, will demonstrate how the dual-arm control strategy has been used within the SARAFun project to successfully assemble components via folding.
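As a rough illustration of the kind of 2-DoF articulated-object model mentioned above, the sketch below gives planar forward kinematics for two rigid links joined by revolute joints, as in a folding part. This is a minimal sketch under my own assumptions (planar geometry, link lengths, and function names are illustrative, not taken from the talk):

```python
import math

def folding_model_fk(theta1, theta2, l1=0.10, l2=0.10):
    """Planar forward kinematics for a hypothetical 2-DoF folding model:
    two links of lengths l1, l2 (metres) connected by revolute joints
    with angles theta1, theta2 (radians), base fixed at the origin.
    Returns the positions of the hinge and of the free end (tip)."""
    hinge = (l1 * math.cos(theta1), l1 * math.sin(theta1))
    tip = (hinge[0] + l2 * math.cos(theta1 + theta2),
           hinge[1] + l2 * math.sin(theta1 + theta2))
    return hinge, tip

# First link along x, second link folded 90 degrees upward:
hinge, tip = folding_model_fk(0.0, math.pi / 2)
```

In a setup like the one described, a kinesthetic perception module would fit such a model to force/torque and proprioceptive data in order to recover the joint angles and hinge location online.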