Integration of Pixy2 Camera Sensor and Coordinate Transformation for Automatic Color-Based Implementation of a Pick-and-Place Arm Robot
DOI:
https://doi.org/10.26555/jiteki.v11i1.30717

Keywords:
Arm Robot, Pick-and-Place, Object Position Detection, Object Color Recognition, Coordinate Transformation

Abstract
Technology related to robotics has developed rapidly in recent years. In manufacturing production lines, industrial pick-and-place robots are used to move objects efficiently from one location to another. In most approaches, such a robot automates a repetitive task from one exact start position. However, collecting objects from various positions in the robot workspace still poses challenges in terms of object position detection and movement accuracy. In this paper, an arm robot system with automatic color-based object recognition and position control is proposed. The robot detects multiple target object positions automatically, without the need to plan a fixed movement beforehand. To construct the experimental platform, a Pixy2 camera sensor with color recognition capability was integrated into a 4-DoF Dobot Magician arm robot. Furthermore, a coordinate transformation was derived and implemented to achieve accurate positional robot movement. The transformation maps object positions from the Camera Coordinate System (CCS), expressed in image pixel values, to the Robot Coordinate System (RCS), from which the robot's actuator input signals are computed. Prior to the implementation, the robot underwent color calibration and position calibration. Thereafter, a set of color signatures was obtained, and any object position in the camera's field of view could be matched with an end-effector position in the robot's workspace. Three experimental setups were conducted to evaluate the proposed system. Under a single lighting condition, the robot was commanded to pick and place objects based on three criteria: all three colors, one specific color, and two specific colors. The robot performed flawlessly, achieving a 100% success rate in both object color detection and pick-and-place execution. These positive results encourage further investigation into different actuator actions and larger work areas.
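The abstract does not reproduce the derived transformation itself, but the kind of CCS-to-RCS mapping it describes can be illustrated with a minimal sketch. Assuming a planar workspace and an affine pixel-to-workspace relation fitted from calibration point pairs, one possible realization looks like the following; the point values, function names, and the least-squares fitting choice are illustrative assumptions, not the authors' exact method:

    import numpy as np

    # Hypothetical calibration pairs: object centers in Pixy2 image pixels
    # (u, v) and the matching end-effector positions (x, y) in millimeters,
    # e.g. recorded by jogging the arm to markers visible in the frame.
    pixel_pts = np.array([[30.0, 40.0], [280.0, 45.0], [150.0, 190.0]])
    robot_pts = np.array([[180.0, -60.0], [180.0, 60.0], [250.0, 0.0]])

    def fit_affine(src, dst):
        """Least-squares fit of a planar affine map dst ~ A @ src + t."""
        # Append a constant 1 to each source point so the translation t
        # is estimated jointly with the 2x2 matrix A.
        src_h = np.hstack([src, np.ones((len(src), 1))])
        coeffs, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
        return coeffs[:2].T, coeffs[2]  # A (2x2), t (2,)

    A, t = fit_affine(pixel_pts, robot_pts)

    # Map a newly detected object center from camera to robot coordinates.
    u, v = 200.0, 120.0
    x, y = A @ np.array([u, v]) + t
    print(f"pixel ({u:.0f}, {v:.0f}) -> robot ({x:.1f}, {y:.1f}) mm")

With more than three non-collinear calibration points, such a least-squares fit would also average out small measurement errors; the paper itself obtains its mapping through the color and position calibration procedure summarized above.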
License
Copyright (c) 2025 Erwin Sitompul, Muhammad Teguh Ilham Yaqin, Hendra Jaya Tarigan, George Michael Tampubolon, Faisal Samsuri, Mia Galina

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with JITEKI agree to the following terms:
- Authors retain copyright and grant the journal the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike License (CC BY-SA 4.0) that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.