 

  Robot

    1. Introduction

    Understanding a scene from image information in a large-scale and uncertain real-world environment is a very difficult problem. To overcome this, we propose a method that divides the problem into Bayesian network (BN) modules and manages them to solve the complex problem. We also suggest a BN structure design method based on activity features, which categorizes the domain knowledge and builds the BN structure from it. We then define several inference processes as a behavior selection network (BSN) and combine the BNs and the BSN into a single inference model. In addition, we present a learning method for the inference model, based on the BN and a logic network model, which aims to update and adapt the model using interactive data. We have performed several simulations to evaluate the performance of the proposed methods.
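    As an illustration of the basic inference step, below is a minimal sketch of querying a tiny activity-object BN by enumeration: an activity node with three object children, where two objects are observed and the presence of a third, occluded object is inferred. The network, probabilities, and function names are hypothetical, not the lab's actual implementation.

      # Minimal sketch (illustrative numbers): an activity node with three
      # object children. Observing two objects, infer the probability that
      # a third, occluded object is also present.

      P_ACTIVITY = {"presentation": 0.3, "meeting": 0.7}   # prior P(activity)
      P_OBJ = {                                            # CPTs: P(object present | activity)
          "projector": {"presentation": 0.9, "meeting": 0.3},
          "screen":    {"presentation": 0.8, "meeting": 0.4},
          "laptop":    {"presentation": 0.7, "meeting": 0.5},
      }

      def p_occluded(target, evidence):
          """P(target present | observed objects), summing out the activity."""
          num = den = 0.0
          for act, p_act in P_ACTIVITY.items():
              like = p_act
              for obj, present in evidence.items():
                  p = P_OBJ[obj][act]
                  like *= p if present else 1.0 - p
              den += like                       # P(evidence, activity)
              num += like * P_OBJ[target][act]  # joint with target present
          return num / den

      # A projector and a screen were detected; how likely is the occluded laptop?
      print(p_occluded("laptop", {"projector": True, "screen": True}))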

    2. Methods

    - Modular approach and combination for recognizing uncertain situations


    Figure 1. The concept of our research
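
    One simple way to combine the outputs of several BN modules is a weighted opinion pool over their posteriors. The sketch below uses this rule for illustration; the rule, weights, and probabilities are assumptions, not necessarily the combination used in our system.

      # Sketch: weighted linear opinion pool over module posteriors.
      # Weights and probabilities are illustrative assumptions.

      def combine_posteriors(posteriors, weights):
          """posteriors: list of dicts mapping label -> probability."""
          total = sum(weights)
          return {
              label: sum(w * p[label] for w, p in zip(weights, posteriors)) / total
              for label in posteriors[0]
          }

      # Two place-recognition modules disagree; the more reliable one
      # carries more weight in the combined posterior.
      m1 = {"office": 0.7, "corridor": 0.3}
      m2 = {"office": 0.4, "corridor": 0.6}
      print(combine_posteriors([m1, m2], weights=[0.8, 0.2]))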

    - Hierarchical object recognition BN design based on domain knowledge

      
      Figure 2. Design process of BN
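
      The design process in Figure 2 derives a BN structure from domain knowledge. Below is a minimal sketch of that idea, assuming the knowledge is tabulated as activity -> object category -> object; the table contents and node names are hypothetical.

      # Sketch: derive hierarchical BN edges from a domain-knowledge table
      # (activity -> object category -> concrete object). Table contents
      # are illustrative assumptions.

      DOMAIN_KNOWLEDGE = {
          "presentation": {"display devices": ["projector", "screen"],
                           "computers": ["laptop"]},
          "dining":       {"furniture": ["table", "chair"]},
      }

      def build_structure(knowledge):
          """Return BN edges (parent, child) for the three-level hierarchy."""
          edges = []
          for activity, categories in knowledge.items():
              for category, objects in categories.items():
                  edges.append((activity, category))                # level 1 -> 2
                  edges.extend((category, obj) for obj in objects)  # level 2 -> 3
          return edges

      for parent, child in build_structure(DOMAIN_KNOWLEDGE):
          print(parent, "->", child)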

     

    - Behavior network based BN ensemble combination

       
      Figure 3. General concept of the BSN-based BN ensemble
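
      The behavior selection network decides which inference process (here, which BN module) to run in the current situation. Below is a much-simplified sketch of the selection step, with preconditions checked against the world state and the most activated eligible behavior chosen; the behavior names, preconditions, and weights are hypothetical.

      # Sketch: simplified behavior selection. Each behavior has a set of
      # preconditions and an activation level; the most activated behavior
      # whose preconditions hold (and exceed a threshold) is selected.
      # Names, preconditions, and numbers are illustrative assumptions.

      BEHAVIORS = {
          "run_place_BN":  ({"robot_moving"}, 0.55),
          "run_object_BN": ({"camera_stable", "place_known"}, 0.60),
      }
      THRESHOLD = 0.5

      def select_behavior(state):
          """Return the eligible behavior with the highest activation."""
          best, best_act = None, THRESHOLD
          for name, (preconditions, activation) in BEHAVIORS.items():
              if preconditions <= state and activation > best_act:
                  best, best_act = name, activation
          return best

      print(select_behavior({"camera_stable", "place_known"}))  # run_object_BN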

     

    - Interactive learning based on logic network

     
     Figure 4. Interactive learning process
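
     Interactive learning adapts the model from user feedback. One standard way to realize the update step is to accumulate confirmation/correction counts and re-estimate the affected conditional probability entry (a Dirichlet-style update); the sketch below assumes this mechanism for illustration, not as our exact rule.

      # Sketch: adapt P(object present | activity) from interactive feedback
      # using accumulated counts (Laplace/Dirichlet-style smoothing). The
      # update rule is an assumption for illustration.

      from collections import defaultdict

      counts = defaultdict(lambda: [1, 1])  # pseudo-counts [present, absent]

      def feedback(activity, obj, present):
          """Record one user-confirmed (or corrected) observation."""
          counts[(activity, obj)][0 if present else 1] += 1

      def cpt(activity, obj):
          """Current estimate of P(obj present | activity)."""
          present, absent = counts[(activity, obj)]
          return present / (present + absent)

      feedback("presentation", "laptop", True)
      feedback("presentation", "laptop", True)
      feedback("presentation", "laptop", False)
      print(cpt("presentation", "laptop"))  # 0.6 after three observations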

     

    3. Experimental Results

    - Hierarchical object recognition Bayesian network based on activity


    Figure 5. Designed Presentation Activity BN

     

    - BSN based place and object perception

     
     Figure 6. Designed BSN for Combination

     

    - Interactive learning based on logic network

     
     Figure 7. Learned Logic Network

    - Performance Test

     
     Figure 8. Webots Simulation
     

     
      Figure 9. Comparison of place recognition probability in six places (corridor, dining room, toilet, seminar room, elevator, office). The correct answer is colored black.
     

     

    4. Publications

    [1] Y.-S. Song, S.-B. Cho, and I.-H. Suh, "Activity-object Bayesian networks for detecting occluded objects in uncertain indoor environment," Int. Conf. on Knowledge-Based Intelligent Information & Engineering Systems, Sep 2005. IF02 = 0.515

    [2] Y.-S. Song and S.-B. Cho, "Hierarchical Bayesian networks based on activity for localizing hidden target objects in indoor environment," Proc. of Korea Computer Congress, vol. 32, no. 1, pp. 616-618, Jul 2005.

    [3] K.-S. Hwang, H.-S. Park, and S.-B. Cho, "Bayesian probability and evidence combination for improving scene recognition performance," Proc. of Korea Computer Congress, vol. 32, no. 1, pp. 634-636, Jul 2005.

    [4] S.-B. Im and S.-B. Cho, "Place and object recognition in uncertain indoor environments using SIFT and Bayesian networks," Proc. of Korea Computer Congress, vol. 32, no. 1, pp. 637-639, Jul 2005.

    [5] J.-O. Yoo and S.-B. Cho, "Fuzzy Bayesian network for fusion of multimodal context information," Proc. of Korea Computer Congress, vol. 32, no. 1, pp. 631-633, Jul 2005.

    [6] K.-S. Hwang and S.-B. Cho, "Constrained learning method of Bayesian network structure for efficient context classification," Proc. of Korea Information Science Society, vol. 31, no. 2, pp. 112-114, Oct 2004.

    [7] J.-O. Yoo, K.-J. Kim, and S.-B. Cho, "Speciated evolution of Bayesian networks ensembles for robust inference," Proc. of Korea Information Science Society, vol. 31, no. 2, pp. 226-228, Oct 2004.

    [8] J.-O. Yoo, K.-J. Kim, and S.-B. Cho, "Bayesian inference with fuzzy variables for customized high level context extraction," Proc. of Korea Information Science Society, vol. 31, no. 2, pp. 115-117, Oct 2004.





