마르코프 연쇄

Noun

mareukopeu yeonswae

Markov chain

chaîne de Markov

Examples

  • 마르코프 연쇄는 확률론에서 중요한 개념입니다.

    mareukopeu yeonswae-neun hwakryulron-eseo jungyohan gaenyeom-imnida.

    Markov chains are an important concept in probability theory.
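Since the example sentence names the concept without defining it: a Markov chain is a sequence of random states in which the next state depends only on the current one. A minimal Python sketch (the two-state weather model and its transition probabilities are illustrative assumptions, not part of the entry):

```python
import random

# Illustrative two-state Markov chain: each state maps to a list of
# (next_state, probability) pairs. Probabilities are assumed values.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(state):
    """Sample the next state from the current state's distribution."""
    states, weights = zip(*transitions[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    """Run the chain for `steps` transitions and return the trajectory."""
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path

print(simulate("sunny", 5))
```

The defining "Markov" property is visible in `next_state`: it looks only at the current state, never at earlier history.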
