
Journal of Construction Automation and Robotics pISSN: 2800-0552, eISSN: 2951-116X

Vol. 3 No. 3, pp. 6-13 / September, 2024 DOI: https://doi.org/10.55785/JCAR.3.3.6

Interpreting Spatial Instructions for Effective Human-Robot Communication in
Construction Environments

Lee, Chaeeun¹ ․ Yoon, Sungboo² ․ Park, Moonseo³ ․ Ahn, Changbum⁴

Received September 11, 2024 ․ Revised September 13, 2024 ․ Accepted September 19, 2024

ABSTRACT
When utilizing robots to perform tasks on construction sites, it is important to ensure efficient communication between humans and robots based on
spatial instructions and the intuitive decision-making of the operator, in order to adapt flexibly to changes in the environment or alterations in plans.
However, it is challenging to accurately convey the intent of the operator to the robot in complex construction environments. This becomes
particularly difficult for tasks involving objects with curvature or geometric patterns, which increase the complexity of robot trajectory
generation. Therefore, this study proposes a method to estimate the robot trajectory from spatial instructions that include deviations, specifically for
targets of various forms. It focuses on interpreting spatial instructions in the circular, rectangular, and linear shapes commonly used in construction, and
the proposed approach involves two steps: classifying the trajectory shape from the spatial instruction and fitting the shape through regression analysis.
The accuracy of this method was evaluated in a scenario where 8 participants collaborated with a robot to cut ceiling-mounted components using
spatial instructions provided through a laser pointer. The proposed method achieved an average Root Mean Squared Error (RMSE) of 2.398 mm,
compared to 7.274 mm for the conventional B-Spline method. These results suggest that the regression-based approach to interpreting spatial
instructions has the potential to improve safety and productivity in construction tasks that rely on design plans and layouts.

Keywords: Human-robot collaboration, Spatial instruction, Regression analysis, Least squares method

1. Introduction

The construction industry has been adopting robots for a growing range of on-site tasks, and numerous single-task construction robots have been developed (Bock and Linner, 2016). Construction robots have demonstrated competitive performance against traditional methods in tasks such as drilling, drywall, and layout (Brosque et al., 2020). Construction sites, however, are dynamic: environments and plans change frequently, so a robot cannot rely on predefined task specifications alone. Flexible human-robot communication grounded in the operator's intuitive, on-site decision-making is therefore needed, and spatial instructions such as deictic gestures are an effective means of conveying the operator's intent to a robot (Yoon et al., 2023a).

¹ M.S. Student, Department of Architecture and Architectural Engineering, Seoul National University (celee1052@snu.ac.kr)
² Ph.D. Student, Department of Architecture and Architectural Engineering, Seoul National University (yoonsb24@snu.ac.kr)
³ Professor, Department of Architecture and Architectural Engineering, Seoul National University ([email protected])
⁴ Corresponding Author, Professor, Department of Architecture and Architectural Engineering, Seoul National University ([email protected])

Copyright © 2024 Korean Society of Automation and Robotics in Construction. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

6| Korean Society of Automation and Robotics in Construction |



Laser pointers are a particularly intuitive channel for such spatial instructions: they have been used to specify tasks in situ (Yoon et al., 2023b) and to define virtual borders for mobile robots (Sprute et al., 2019). However, a hand-drawn instruction inevitably deviates from the intended geometry, and this deviation grows on targets with curvature or geometric patterns (Yoon et al., 2023a). This study therefore proposes a method that classifies the shape of a spatial instruction as circular, rectangular, or linear and fits the shape through regression analysis to estimate the intended robot trajectory. The method was evaluated in a scenario in which 8 participants instructed a robot to cut ceiling-mounted components with a laser pointer, using the Root Mean Squared Error (RMSE) against the ground-truth trajectories as the accuracy measure.

2. Related Work

Robots have been applied to a variety of construction tasks, including on-site mobile robotic construction with online synchronization of the building model (Ercan Jenny et al., 2020), wall cutting of circular holes with a humanoid robot (Park et al., 2015), and printing of construction site layouts with a mobile robot (Lee et al., 2022). Work on curved or patterned surfaces is particularly demanding, and specialized systems have been developed for curvature-adaptive wall climbing (Gui et al., 2021), welding of 3D curved seams (Zhou et al., 2021), and normal trajectory generation on curved surfaces (Shah et al., 2023). Falls from height also remain a leading cause of accidents in the construction industry (Nadhim et al., 2016), which further motivates delegating overhead tasks such as ceiling work to robots.

For communicating spatial intent to such robots, laser pointers offer an intuitive interface: they have been used to instruct robot motion (Shibata et al., 2011), to define virtual borders for mobile robots (Sprute et al., 2019), to recognize pointer trajectories for human-computer interaction (Li et al., 2022), and to specify spatial tasks through deictic gestures in construction (Yoon et al., 2023b; Yoon et al., 2024). The raw instruction data captured by RGB-D sensors are noisy, however, and must be processed before trajectory generation. Dagioglou et al. (2021) smoothed human movements recorded by a single RGB-D camera for robot demonstrations using Bezier curves, building on depth-space tracking techniques such as Kalman filtering of the end effector position (Park et al., 2012). Braglia et al. (2023) generated safe cooperative motions using B-Splines, and a broad range of path-smoothing techniques has been surveyed by Ravankar et al. (2018). Such smoothing approaches follow the raw input closely, so they preserve the unintended deviations of a hand-drawn instruction rather than recovering the intended geometric shape. Shape fitting offers an alternative: Chao et al. (2014), for example, fitted writing trajectories for robotic free writing of Chinese characters. Building on this idea, the present study classifies a spatial instruction into a geometric shape class and estimates the shape parameters by regression, instead of smoothing the raw trajectory.

3. Methodology for Interpreting Spatial Instructions

The proposed method estimates the intended robot trajectory from a spatial instruction in two steps (Fig. 1): (1) trajectory shape classification, which identifies which geometric shape the operator instructed, and (2) trajectory shape fitting, which estimates the parameters of that shape.

3.1 Trajectory Shape Classification

The operator draws the intended work shape on the target surface with a laser pointer. An RGB-D camera detects the laser spot in each frame, and the accumulated spot positions form a 3D point cloud of the instructed trajectory. Because a hand-drawn instruction inevitably deviates from the intended geometry, the point cloud is first classified into one of the three shape classes commonly used in construction layouts: circle, rectangle, and line. The 3D points are projected onto the target plane to obtain a 2D point set, and the shape class is predicted with PointNet++ (Qi et al., 2017).
Figure 1. Process of spatial instruction-based human-robot collaboration and trajectory shape estimation

Figure 2. Shape fitting of 2D data points projected onto the ceiling

3.2 Trajectory Shape Fitting


Table 1. Residuals for data points (x_i, y_i), i = 1, ..., n

  Shape      Parameters (β)       Residuals (r_i)                                   Remarks
  Circle     (x_c, y_c, r)        r_i = d_i − r                                     d_i: distance between (x_i, y_i) and (x_c, y_c)
  Rectangle  (x_c, y_c, w, h, θ)  r_i = min(| |x'_i| − w/2 |, | |y'_i| − h/2 |),    R_θ: 2D rotation matrix for θ
                                  (x'_i, y'_i) = R_θ⁻¹ (x_i − x_c, y_i − y_c)
  Line       (θ, ρ)               r_i = x_i sin θ − y_i cos θ + ρ                   Linear equation: x sin θ − y cos θ + ρ = 0
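As an illustration of the fitting step, the circle row of Table 1 can be solved with an algebraic (Kåsa) fit for initialization followed by Levenberg-Marquardt refinement of the geometric residuals r_i = d_i − r. This is a minimal pure-Python sketch, not the study's implementation (the paper attributes the minimization to Moré's Levenberg-Marquardt algorithm):

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in range(2, -1, -1):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_circle(points, iters=20, lam=1e-3):
    """Geometric circle fit: Kåsa algebraic fit as initial guess, then
    Levenberg-Marquardt on the residuals r_i = d_i - r (circle row of Table 1)."""
    # Kåsa initialization: x^2 + y^2 + D*x + E*y + F = 0, solved by linear
    # least squares via the normal equations.
    ATA = [[0.0] * 3 for _ in range(3)]
    ATb = [0.0] * 3
    for x, y in points:
        row = (x, y, 1.0)
        rhs = -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                ATA[i][j] += row[i] * row[j]
            ATb[i] += row[i] * rhs
    D, E, F = solve3(ATA, ATb)
    a, b = -D / 2.0, -E / 2.0
    r = math.sqrt(a * a + b * b - F)
    # Levenberg-Marquardt refinement with a fixed damping factor (a real
    # implementation adapts lam between iterations).
    for _ in range(iters):
        JTJ = [[0.0] * 3 for _ in range(3)]
        JTr = [0.0] * 3
        for x, y in points:
            d = math.hypot(x - a, y - b)
            res = d - r                          # geometric residual d_i - r
            J = (-(x - a) / d, -(y - b) / d, -1.0)
            for i in range(3):
                for j in range(3):
                    JTJ[i][j] += J[i] * J[j]
                JTr[i] += J[i] * res
        for i in range(3):
            JTJ[i][i] *= (1.0 + lam)             # LM damping of the diagonal
        da, db, dr = solve3(JTJ, [-g for g in JTr])
        a, b, r = a + da, b + db, r + dr
    return a, b, r
```

The rectangle and line rows of Table 1 follow the same pattern with their own residual and Jacobian definitions.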

Figure 3. Shape fitting process: (a) 2D shape fitting (b) matching with depth information (c) conversion to a 3D trajectory

Figure 4. Shapes of target objects: (a) planar (b) curved (c) patterned

The fitting step estimates the parameters of the classified shape from the 2D data points obtained by projecting the 3D instruction points onto the ceiling plane (Fig. 2). For each shape, the residuals defined in Table 1 are minimized with the Levenberg-Marquardt algorithm (Moré, 1978). The fitted 2D shape is then matched with the depth information of the target surface and converted into a 3D robot trajectory (Fig. 3).

4. Experiments and Results

4.1 Experimental Environment and Settings

The accuracy of the proposed method was evaluated in a scenario in which participants instructed a robot to cut ceiling-mounted components with a laser pointer, following prior laser-pointer interface studies (Sprute et al., 2019; Yoon et al., 2024). Eight participants drew circular, rectangular, and linear instructions on three types of ceiling-mounted targets: planar, curved, and patterned (Fig. 4). The RGB-D camera was positioned 2.0 m from the targets (Fig. 5).

Figure 5. Settings for data collection

An Intel RealSense D435 RGB-D camera and a KUKA KR 6 R 900 six-DoF manipulator were used. The camera streams 640 × 480 RGB and depth frames at 30 FPS (Fig. 6). The laser spot is detected by converting each RGB frame to HSV (Hue-Saturation-Value) and thresholding it against the two red ranges [0, 50, 70]–[10, 255, 255] and [170, 50, 70]–[180, 255, 255]. Spatial instruction data were collected from the participants for the three shapes on the three target types (Fig. 7).

Table 2. Hyperparameter settings

  Hyperparameter   Setting
  # of points      64
  Learning rate    0.001
  Optimizer        Adam
  Decay rate       1e-4
  Epochs           200

Figure 6. Hardware setup and the robot's field of view

Figure 7. Dataset collection

Figure 8. Confusion matrix
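The red-spot detection described above reduces, per pixel, to a membership test against the two HSV bands (on OpenCV's 8-bit scale, H is in [0, 180]). A minimal sketch of that test; a full pipeline would also need per-frame conversion and blob extraction, which are omitted here:

```python
import colorsys

def rgb_to_cv_hsv(r, g, b):
    """Convert 8-bit RGB to OpenCV-style HSV: H in [0, 180], S and V in [0, 255]."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 180.0, s * 255.0, v * 255.0

def is_laser_red(h, s, v):
    """Test whether an HSV pixel falls in either red band used for laser-spot
    detection: [0,50,70]-[10,255,255] or [170,50,70]-[180,255,255]. Two bands
    are needed because red hue wraps around both ends of the H axis."""
    in_low_band = 0.0 <= h <= 10.0
    in_high_band = 170.0 <= h <= 180.0
    return (in_low_band or in_high_band) and 50.0 <= s <= 255.0 and 70.0 <= v <= 255.0
```

In practice the same thresholds would be applied to the whole frame at once (e.g. with an `inRange`-style mask) and the spot taken as the centroid of the largest connected component.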
,
4.2 Results

4.2.1 Trajectory Shape Classification

The PointNet++ classifier was trained with the hyperparameter settings in Table 2, following the configuration used for ModelNet10 classification (Wu et al., 2015). On the collected instruction data, the classifier attained 100% accuracy across the circle, rectangle, and line classes; the confusion matrix is shown in Fig. 8.

4.2.2 Trajectory Shape Fitting

Fitting accuracy was measured with RMSE: 100 points were sampled from each estimated trajectory and their deviations from the ground-truth shape were computed. The conventional B-Spline approach, widely used for path smoothing (Ravankar et al., 2018), served as the baseline on the same instruction data. Fig. 9 compares the RMSE of the two methods for the 8 participants, and paired t-tests were conducted on the per-participant results.
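The RMSE evaluation samples 100 points on the estimated trajectory and measures their distance to the ground-truth shape. A sketch for the circle case; `rmse_circle` is a hypothetical helper for illustration, not the study's evaluation code:

```python
import math

def rmse_circle(fit, truth, n=100):
    """Sample n points on the fitted circle (center a, b; radius r) and take
    the RMSE of their distances to the ground-truth circle."""
    a, b, r = fit
    ta, tb, tr = truth
    total = 0.0
    for k in range(n):
        t = 2.0 * math.pi * k / n
        x, y = a + r * math.cos(t), b + r * math.sin(t)
        d = abs(math.hypot(x - ta, y - tb) - tr)   # distance to the truth circle
        total += d * d
    return math.sqrt(total / n)
```

For rectangles and lines the per-sample distance would instead be the point-to-edge or point-to-line distance, but the sampling-and-RMSE structure is the same.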


Figure 10. Sample outcomes of shape fitting (Subject 1) on (a) the planar target (b) the curved target (c) the patterned target

The proposed method produced a lower RMSE than the B-Spline baseline for every shape on every target, and the paired t-tests summarized in Table 3 show that the differences are statistically significant in all nine shape-target conditions. Fig. 10 shows sample fitting outcomes of the three shapes on the three targets.

Figure 9. Results of RMSE on (a) the planar target (b) the curved target (c) the patterned target

Table 3. Results of paired t-test

  Target     Shape      Ours (M ± SD, mm)   Baseline (M ± SD, mm)   t       p
  Planar     Circle     2.205 ± 0.723       5.840 ± 2.802           -4.75   <.001
             Rectangle  3.558 ± 1.072       9.541 ± 4.302           -5.10   <.001
             Line       0.688 ± 0.252       5.900 ± 4.005           -4.92   <.001
  Curved     Circle     2.230 ± 0.583       4.874 ± 2.377           -4.49   <.001
             Rectangle  3.860 ± 0.724       9.521 ± 4.905           -4.74   <.001
             Line       0.770 ± 0.155       4.597 ± 3.100           -4.71   <.001
  Patterned  Circle     2.248 ± 0.599       6.073 ± 2.686           -6.07   <.0001
             Rectangle  4.952 ± 0.825       12.568 ± 5.931          -5.06   <.001
             Line       1.069 ± 0.440       6.555 ± 3.483           -6.22   <.0001
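The paired t-test behind Table 3 compares per-participant RMSE values between the two methods. The statistic can be computed as follows; the numbers in the test are illustrative only, not the study's data:

```python
import math

def paired_t(ours, baseline):
    """Paired t statistic: mean of the per-pair differences divided by its
    standard error. A negative t indicates ours < baseline, as in Table 3."""
    d = [a - b for a, b in zip(ours, baseline)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)
```

The p-value then comes from the t distribution with n − 1 degrees of freedom (7 here, for 8 participants), which in practice one would look up via a statistics library rather than compute by hand.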


5. Conclusion

This study proposed a method for interpreting laser-pointer spatial instructions as robot trajectories for construction targets of various forms. The instructed trajectory is first classified as a circle, rectangle, or line with a point-cloud classifier, and the parameters of the classified shape are then estimated by regression analysis. In a ceiling-component cutting experiment with eight participants, the method achieved an average RMSE of 2.398 mm on planar, curved, and patterned targets, compared with 7.274 mm for the conventional B-Spline approach, with statistically significant improvements in all conditions. These results suggest that regression-based interpretation of spatial instructions can accurately convey operator intent even for geometrically complex targets, and has the potential to improve safety and productivity in construction tasks that rely on design plans and layouts.

Acknowledgments

This research was supported by a 2024 grant (No. RS-2024-00395512).

References

Bock, T., and Linner, T. (Eds.). (2016). Single-Task Construction Robots by Category. In Construction Robots: Elementary Technologies and Single-Task Construction Robots. Cambridge University Press, 3, pp. 14–290.
Braglia, G., Tagliavini, M., Pini, F., and Biagiotti, L. (2023). Online Motion Planning for Safe Human–Robot Cooperation Using B-Splines and Hidden Markov Models. Robotics, 12(4), p. 118.
Brosque, C., Skeie, G., Orn, J., Jacobson, J., Lau, T., and Fischer, M. (2020). Comparison of Construction Robots and Traditional Methods for Drilling, Drywall, and Layout Tasks. 2020 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA), pp. 1–14.
Chao, F., Chen, F., Shen, Y., He, W., Sun, Y., Wang, Z., Zhou, C., and Jiang, M. (2014). Robotic Free Writing of Chinese Characters via Human–Robot Interactions. International Journal of Humanoid Robotics, 11(01), 1450007.
Dagioglou, M., Tsitos, A. C., Smarnakis, A., and Karkaletsis, V. (2021). Smoothing of human movements recorded by a single RGB-D camera for robot demonstrations. Proceedings of the 14th PErvasive Technologies Related to Assistive Environments Conference, pp. 496–501.
Ercan Jenny, S., Blum, H., Gawel, A., Siegwart, R., Gramazio, F., and Kohler, M. (2020). Online Synchronization of Building Model for On-Site Mobile Robotic Construction. 37th International Symposium on Automation and Robotics in Construction, Kitakyushu, Japan.
Gui, C., Li, J., Tu, C. L., Wang, X.-S., and Zheng, K. (2021). Design and Analysis of Curvature Adaptive Wall Climbing Robot. 2021 IEEE Far East NDT New Technology & Application Forum (FENDT), pp. 177–184.
Lee, A. Y., Seo, H. C., and Park, E. S. (2022). Development of a Manually Operated Mobile Robot That Prints Construction Site Layouts. Machines, 10(12), Article 12.
Li, C., Xin, B., and Wang, Q. (2022). A Laser Pointer Trajectory Recognition Method for Human-Computer Interaction.
Moré, J. J. (1978). The Levenberg-Marquardt algorithm: Implementation and theory. In G. A. Watson (Ed.), Numerical Analysis, Springer Berlin Heidelberg, 630, pp. 105–116.
Nadhim, E. A., Hon, C., Xia, B., Stewart, I., and Fang, D. (2016). Falls from Height in the Construction Industry: A Critical Review of the Scientific Literature. International Journal of Environmental Research and Public Health, 13(7), Article 7.
Park, B., Cho, H., Choi, W., and Park, J. (2015). Wall cutting strategy for circular hole using humanoid robot. 2015 12th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), pp. 564–569.
Park, S., Yu, S., Kim, J., Kim, S., and Lee, S. (2012). 3D hand tracking using Kalman filter in depth space. EURASIP Journal on Advances in Signal Processing, 2012(1), 36.
Qi, C. R., Yi, L., Su, H., and Guibas, L. J. (2017). PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space. https://doi.org/10.48550/arXiv.1706.02413
Ravankar, A., Ravankar, A. A., Kobayashi, Y., Hoshino, Y., and Peng, C. C. (2018). Path Smoothing Techniques in Robot Navigation: State-of-the-Art, Current and Future Challenges. Sensors, 18(9), p. 3170.
Shah, S. H., Lin, C. Y., Tran, C. C., and Ahmad, A. R. (2023). Robot Pose Estimation and Normal Trajectory Generation on Curved Surface Using an Enhanced Non-Contact Approach. Sensors, 23(8), p. 3816.
Shibata, S., Yamamoto, T., and Jindai, M. (2011). Human-robot interface with instruction of neck movement using laser pointer. 2011 IEEE/SICE International Symposium on System Integration (SII), pp. 1226–1231.
Sprute, D., Tonnies, K., and Konig, M. (2019). This Far, No Further: Introducing Virtual Borders to Mobile Robots Using a Laser Pointer. 2019 Third IEEE International Conference on Robotic Computing (IRC), pp. 403–408.
Wu, Z., Song, S., Khosla, A., Yu, F., Zhang, L., Tang, X., and Xiao, J. (2015). 3D ShapeNets: A deep representation for volumetric shapes. 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1912–1920.
Yoon, S., Kim, Y., Park, M., and Ahn, C. R. (2023a). Effects of Spatial Characteristics on the Human–Robot Communication Using Deictic Gesture in Construction. Journal of Construction Engineering and Management, 149(7), 04023049.
Yoon, S., Park, J., and Park, M. (2023b). A Deictic Gesture-Based Human-Robot Interface for In Situ Task Specification in Construction. Computing in Civil Engineering. https://doi.org/10.1061/9780784485224.054
Yoon, S., Park, M., and Ahn, C. R. (2024). LaserDex: Improvising Spatial Tasks Using Deictic Gestures and Laser Pointing for Human–Robot Collaboration in Construction. Journal of Computing in Civil Engineering, 38(3), 04024012.
Zhou, B., Liu, Y., Xiao, Y., Zhou, R., Gan, Y., and Fang, F. (2021). Intelligent Guidance Programming of Welding Robot for 3D Curved Welding Seam. IEEE Access, 9, pp. 42345–42357.

