A Comparison of Time Series Forecasts Using Neural Nets and GMDH*

Hansik Song**








I. Purpose of the Study

This study experimentally compares the neural network method with the GMDH (group method of data handling) method in time series forecasting, and aims to show that GMDH is also useful for forecasting time series. Hill et al. (1996) reported that when time series data are relatively short-term, i.e., quarterly or monthly rather than annual, neural network forecasts outperform conventional statistical time series methods. Like neural networks, GMDH is a self-organizing modelling method. It can therefore be conjectured that GMDH will also be useful in time series forecasting.

In this study, forecasts produced by the GMDH method and the neural network method were compared using time series data from the "M-competition". The results show that GMDH performs as well as the neural network method. While similar to neural networks, GMDH has the advantage that the fitted forecasting model can be written out as an explicit function. Exploiting this advantage, GMDH can offer a usefulness in time series forecasting that neural networks cannot provide.

Section II summarizes the reported superiority of neural networks in time series forecasting; Section III introduces GMDH; Section IV presents an example comparing GMDH and neural networks on noise-free data; Section V compares the GMDH and neural network methods on time series from the "M-competition"; and Section VI summarizes the conclusions.

______________________

** µ¿¾Æ´ëÇб³ °æ¿µÇкΠ±³¼ö

* This study was supported by a 1999 Dong-A University general research grant (competitive project).

II. The Superiority of Neural Network Models in Time Series Forecasting

A neural network model can approximate any function (Funahashi 1989). Neural networks can also approximate linear regression, nonlinear regression, and nonparametric regression (White 1992a, 1992b). Therefore, if a time series has a particular pattern or characteristic that can be expressed in functional form, that series can be modelled by a neural network.

On this theoretical basis, many comparisons have been made between neural network models and traditional time series forecasting methods. These comparisons were experiments using the real time series data of the "M-competition" (Makridakis et al. 1982). The "M-competition" separates each time series from the data of its forecast period, so one can forecast the values of the forecast period and compare the results with the actual values to evaluate the effectiveness of a forecasting model. (The time series and the actual values of the forecast periods are all publicly available at http://forecasting.cwru.edu/.)

Using the M-competition data, the experiments of Sharda & Patil (1990) reported that neural network models were no worse than Box-Jenkins (Autobox). Foster et al. (1992) found the neural network method inferior to Holt's and Brown's methods and the least squares method on annual data, and comparable on quarterly data; monthly data were not compared.

Building on these earlier comparisons, Hill et al. (1996) systematically compared a back-propagation neural network model with conventional time series forecasting methods. The results are shown in <Table 1>: the neural network performs better in most cases. MAPE (mean absolute percentage error) was used as the comparison criterion.

Hill et al. (1996) used 111 time series, divided into annual, quarterly, and monthly series, and compared the neural network against six conventional methods. For annual data the neural network model had 3 input nodes, 2 hidden nodes, and 1 output node (a 3-2-1 structure). For quarterly data the corresponding numbers were 4, 2, and 1 (4-2-1), and for monthly data 9, 4, and 1 (9-4-1). For example, to forecast the value of month t from monthly data, the nine preceding values for months t-1, t-2, ..., t-9 were fed through four hidden nodes to predict the value for month t. The fitted neural network model was then used to forecast the next 18 months of each monthly series, and the forecasts were compared with the actual values by the mean absolute percentage error (MAPE). The forecast horizon was 8 quarters for quarterly data and 6 years for annual data.

As the table shows, the neural network method is superior. However, although the neural network appears better than exponential smoothing on monthly data, the difference was not statistically significant. In applying the neural network, Hill et al. (1996) first deseasonalized each time series, rescaled it using its maximum and minimum values, and then fed it into neural network training.

Forecasting method            Annual    Quarterly   Monthly
Neural net                    14.2      15.3        13.6
Exponential smoothing         15.9*     18.7*       15.2
Box-Jenkins                   15.7      20.6*       16.4*
Holt's                        12.1      26.9*       19.2*
Graphical & human judgment    12.5*     20.5*       16.3*
Naive                         16.4*     20.0*       27.0*
Average of compared methods   15.0      22.6*       17.1*

<Table 1> Comparison of neural networks and conventional methods

(MAPE values; Hill et al. 1996)

* The neural network was superior in a t-test.

III. An Introduction to GMDH

GMDH was proposed in 1969 by the Russian mathematician Alex G. Ivakhnenko and is presented in the books of Madala & Ivakhnenko (1994) and Farlow (1984).

As shown in <Figure 1>, GMDH is a method that searches, from the prepared data, for a function that approximately expresses the dependent variable Y in terms of the independent variables X1, X2, ..., Xm. The figure shows N prepared data records of the form (X1, X2, ..., Xm : Y) divided into two parts, a training set and a test set. The training set is used to find a suitable function, and the test set is used to evaluate the function found.

In the GMDH method, the independent variables are first combined in pairs to build function models that approximate the value of the dependent variable. Survival of the fittest is then applied: a pair (that is, its function model) that estimates the dependent variable closely is kept, and a pair that does not is eliminated. The survival criterion is defined separately. The functions represented by the surviving pairs form the next generation (or, simply, the offspring). In the next iteration these offspring are again combined in pairs to build functions that estimate the dependent variable. Generation after generation, survival and elimination continue and the estimates of the dependent variable improve; when no further improvement is achieved, generation replacement stops. This process is explained below in terms of generation replacement, the survival criterion and stopping rule, and the Ivakhnenko polynomial.

3.1. Generation Replacement

To advance from the current generation to the next, the dependent variable Y is estimated with the following "primitive equation". Here Xg and Xh are any two variables paired in the current generation, and the coefficients A, B, C, D, E, F (called the "Ivakhnenko coefficients") are obtained by the least squares method, minimizing the sum of squared errors. The data of the training set are used when applying least squares.

Ŷ = A + B·Xg + C·Xh + D·Xg^2 + E·Xh^2 + F·Xg·Xh

The values of Ŷ estimated by this equation can be expected to lie closer to Y than the parent generation's Xg and Xh values. Accordingly, by searching for a function that estimates Y once more from the Ŷ values, an even closer approximation can be expected. The Ŷ values therefore form a new generation (the second generation), and the generation replacement that produces the next generation (the third) is repeated. In <Figure 2> the Ŷ of the next generation are denoted Zi.
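The fitting step above can be sketched in a few lines. The following Python fragment is an illustrative re-implementation (the study itself used a Macro Basic program); it fits the Ivakhnenko coefficients A through F of the primitive equation by ordinary least squares on noise-free data:

```python
import numpy as np

def fit_primitive(xg, xh, y):
    """Fit Y ~ A + B*Xg + C*Xh + D*Xg^2 + E*Xh^2 + F*Xg*Xh
    by ordinary least squares (the 'Ivakhnenko coefficients')."""
    X = np.column_stack([np.ones_like(xg), xg, xh, xg**2, xh**2, xg*xh])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs                      # [A, B, C, D, E, F]

def predict_primitive(coeffs, xg, xh):
    A, B, C, D, E, F = coeffs
    return A + B*xg + C*xh + D*xg**2 + E*xh**2 + F*xg*xh

# Example: a known quadratic without noise is recovered exactly
rng = np.random.default_rng(0)
xg = rng.uniform(0, 1, 50)
xh = rng.uniform(0, 1, 50)
y  = 1.0 + 2.0*xg - 3.0*xh + 0.5*xg**2 + 0.25*xh**2 - 1.5*xg*xh
c  = fit_primitive(xg, xh, y)
print(np.round(c, 6))                  # ≈ [1.0, 2.0, -3.0, 0.5, 0.25, -1.5]
```

In GMDH this fit is repeated for every pair of variables in the current generation, and the fitted Ŷ columns become the candidate offspring of the next generation.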

If the first generation has m variables, the second generation contains at most mC2 = m(m-1)/2 offspring. Offspring therefore multiply geometrically as generations succeed one another; GMDH controls this through survival of the fittest and elimination of the unfit. When repeated generation replacement brings no further improvement, replacement stops and the best offspring found so far is selected.

3.2. The Survival Criterion and Stopping Rule

In the course of generation replacement, unfit offspring are eliminated so that they produce no further descendants. To judge whether an offspring survives or is eliminated (the test of goodness of fit), the following regularity criterion is used. A threshold R is fixed in advance; an offspring Zj survives if its computed rj^2 ≤ R and is eliminated if rj^2 > R.

rj^2 = Σ_{i=nt+1..N} (Yi − Zij)^2 / Σ_{i=nt+1..N} Yi^2

j = 1, 2, ... : index over the offspring of the newly created generation

i = nt+1, nt+2, ..., N : index over the test set

Zij : the i-th element value of the j-th offspring within the same generation

Meanwhile, the minimum RMINk of the criterion values computed for the offspring of the current generation k first decreases and then increases again, as shown in <Figure 3>. The offspring with the smallest criterion value, in the generation where RMINk reaches its minimum, is the optimal offspring.

Because the data were divided into a training set and a test set, RMINk does not keep decreasing but at some point turns upward again, which prevents over-fitting (over-specification) of the model (Madala & Ivakhnenko, 1994, p. 11).
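As a sketch of how the criterion drives selection, the following Python fragment (illustrative only; symbols follow the definitions above) computes rj^2 over the test set and keeps the offspring at or below the threshold R, while tracking the generation minimum RMINk:

```python
import numpy as np

def regularity(y_test, z_test):
    """Regularity criterion r_j^2 on the test set: the offspring's
    squared error relative to the squared magnitude of Y."""
    return np.sum((y_test - z_test)**2) / np.sum(y_test**2)

def select_survivors(y_test, offspring, R):
    """Keep offspring with r_j^2 <= R; discard the rest.
    `offspring` maps an offspring id to its predicted test-set values."""
    scores = {j: regularity(y_test, z) for j, z in offspring.items()}
    survivors = {j for j, r in scores.items() if r <= R}
    return survivors, min(scores.values())   # survivors and RMIN_k

# Toy check: a perfect offspring survives, a bad one is eliminated
y = np.array([1.0, 2.0, 3.0])
survivors, rmin = select_survivors(y, {0: y, 1: y + 10.0}, R=0.5)
print(survivors, rmin)   # {0} 0.0
```

Generation replacement would call this once per generation, stopping when RMINk turns upward.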



3.3. The Ivakhnenko Polynomial

The optimal offspring contains the primitive equation nested many times over. As shown in <Figure 4>, suppose Xi and Xj produce an offspring U, Xk and Xl produce another offspring V, and U and V then pair to produce an offspring W, which turns out to be the optimal offspring. Then W is expressed in terms of U and V, but in fact it is a polynomial in Xi, Xj, Xk, and Xl. Such a repeated nesting of the primitive equation, written out as a polynomial in the independent variables, is called the Ivakhnenko polynomial.










<Figure 4> Nesting of the primitive equation and the Ivakhnenko polynomial

GMDH is also very similar to a neural network. The primitive equation described above can be expressed as a neural network with two input nodes and one output node (with any number of hidden nodes). A model built by GMDH is thus a compressed form of a neural network model; conversely, any function expressed by a neural network can also be constructed with GMDH.

Applications of GMDH are numerous. Ravindra et al. (1994) used GMDH to estimate the life of machine tools, and GMDH has been used to find optimal machining methods (Nagasaka et al. 1980, Yashida et al. 1986). Among the case studies in Farlow's (1984) book introducing GMDH are nonlinear prediction of typhoon tracks and river flows (Ikeda, 1984), forecasting U.S. interest rates (Ohashi, 1984), and economic modelling (Scott and Hutchinson, 1984), showing that GMDH has been applied in a wide variety of fields.


IV. Comparison of GMDH and Neural Networks on Noise-Free Data

GMDH and neural networks were compared on data containing no noise at all (exact data). The example used is the chi-square distribution, whose shape depends on its degrees of freedom. The degrees of freedom can therefore be expressed as an inverse function of the cumulative probability and the chi-square value (χ), but it is difficult to write that inverse as a concrete functional formula. Estimating the degrees of freedom, with the chi-square values at given cumulative probabilities as independent variables, is thus a good example for comparing the GMDH and neural network methods. Let χ(n, 1-p) denote the chi-square value with cumulative probability 1-p for n degrees of freedom; the task is to infer the degrees of freedom n from the values χ(n, 1-p) for p = 0.025, 0.05, 0.5, 0.90, 0.95, and 0.975.

Below is the Excel table used to apply GMDH and the neural network. For GMDH, the number of independent variables was six; 40 data records were prepared, of which the first 20 formed the training set, the next 10 the test set, and the last 10 (degrees of freedom 31 through 40) the forecast target.

For the neural network, the academic version of NeuroShell 2.0 from Ward Systems was used, with a learning rate of 0.6, momentum of 0.9, 6 input nodes, and 8 hidden-layer nodes. Data for degrees of freedom 1 through 40 were prepared; the first 20 formed the training set and 21-30 the test set, after which the degrees of freedom 31-40 were predicted.
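The experimental setup just described can be reproduced with standard numerical tools. The sketch below is illustrative (it uses SciPy's chi-square quantile function in place of the original Excel table): it builds the six chi-square columns for degrees of freedom 1 through 40 and splits the records 20/10/10 as in the experiment:

```python
import numpy as np
from scipy.stats import chi2

# Independent variables: chi-square values at six cumulative levels 1-p;
# dependent variable: the degrees of freedom n (an exact, noise-free mapping).
p_levels = [0.025, 0.05, 0.5, 0.90, 0.95, 0.975]
dfs = np.arange(1, 41)                          # degrees of freedom 1..40
X = np.column_stack([chi2.ppf(1 - p, dfs) for p in p_levels])
y = dfs.astype(float)

# Same partition as the experiment: 20 training, 10 test, 10 forecast
X_train, y_train = X[:20],   y[:20]
X_test,  y_test  = X[20:30], y[20:30]
X_fore,  y_fore  = X[30:],   y[30:]             # df 31..40, the forecast target
print(X.shape)                                  # (40, 6)
```

Either method is then trained on the first two partitions and judged on how well it extrapolates to the last ten degrees of freedom.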

<Figure 5> shows the degrees of freedom obtained by the two methods, plotted as a graph. GMDH reproduces the degrees of freedom up to 30 exactly and predicts 31-40 very accurately as well. The neural network, however, begins to show growing errors in the test set (degrees of freedom 21-30), and the errors grow larger still in the forecast range (31-40). Although a single example cannot be conclusive, it indicates that in the noise-free case GMDH is superior to neural networks for extrapolation (time series forecasting being a kind of extrapolation).

For reference, <Table 3> lists the "Ivakhnenko coefficients" produced by GMDH. The "2, 2, 6" in the middle row indicates that the 2nd and 6th independent variables (the 0.05 and 0.975 chi-square values) are used as the two variables, and that the result of applying that row's coefficients to the Ivakhnenko primitive equation becomes offspring no. 15 of the second generation; the value thus obtained serves as one "parent (father)" when generating the third generation. The "2, 5, 6" in the third row indicates that the 5th and 6th independent variables produce offspring no. 21 of the second generation, with that row giving the Ivakhnenko coefficients; the resulting value serves as the other "parent (mother)" for the third generation. The "3, 15, 21" in the first row indicates that offspring no. 15 and no. 21 are used to produce offspring no. 108 of the next generation, with that row's coefficients again being Ivakhnenko coefficients.

In effect, an equation was found that estimates the degrees of freedom from the three independent variables χ(n, 0.05), χ(n, 0.95), and χ(n, 0.975).


V. Comparison in Time Series Forecasting

The neural network and GMDH methods were compared in time series forecasting. Because applying a neural network involves many technical choices (options), it was difficult to reproduce exactly the procedure of Hill et al. (1996); instead, the options of NeuroShell 2 were used as fully as possible. The purpose of the comparison is not to show that one method is superior to the other, but to show that the GMDH method is as usable as the neural network. Accordingly, rather than hunting for the forecasting model that comes closest to the forecast-period values, emphasis was placed on fixing the experimental rules below and following them strictly.

5.1. Comparison Method

Using time series from the M-competition, forecasts were produced by the GMDH and neural network methods and compared with the actual values. The M-competition contains 111 series in all, from which 15 monthly and 9 quarterly series were drawn at random; series with too few observations (30 or fewer) were avoided. For example, the M-competition contains a series named MNM6 with 75 monthly observations (monthly data of a French company beginning in 1978). From these 75 values the next 18 months are forecast, and the forecasts are compared against the 18 actual values that are provided, to measure the error. As in Hill et al. (1996), the error was computed and compared as the mean absolute percentage error (MAPE).

MAPE = (100/H) · Σ_{h=1..H} |Yh − Fh| / Yh

where Yh and Fh are the actual and forecast values during the forecast period, and H is 18 for monthly data and 8 for quarterly data. One point deserves mention: in forecasting the values of the forecast period, there is a difference between using the value forecast for the previous period and using the actual value of the previous period. In the neural network experiments, the actual value of the immediately preceding period was used to forecast the next period's value (c.MAPE(%) in <Table 5>); this was done for convenience when applying NeuroShell 2. In the GMDH experiments, however, both cases, feeding back the previous forecast and using the previous actual value, were evaluated (a.MAPE(%) and b.MAPE(%) in <Table 5>).
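For concreteness, the error measure can be computed as follows (a minimal sketch; the function and variable names are illustrative):

```python
def mape(actual, forecast):
    """Mean absolute percentage error over a forecast horizon of H periods."""
    assert len(actual) == len(forecast)
    H = len(actual)
    return 100.0 / H * sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast))

print(mape([100, 200, 400], [110, 180, 400]))   # (10% + 10% + 0%) / 3 ≈ 6.67
```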

5.2. Applying GMDH

Because the GMDH program was written directly by the author, there was considerable flexibility in applying the method. Some of the choices made are as follows.

5.2.1 Number of Independent Variables

Past values of the time series were used as independent variables. For monthly data, y(t-1), y(t-2), ..., y(t-9) and t were the independent variables, and y(t) was treated as the dependent variable. Here t denotes the time axis added as an extra variable, taken simply as the sequence 1, 2, 3, 4, .... For monthly series with few observations, the number of independent variables other than the time axis was reduced to 6 or 7 (MNI22, MNB20, MNB29, MNB65, etc.). For quarterly data the number of independent variables was set to 6.
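Building the lagged independent variables can be sketched as follows (illustrative Python; the function name is hypothetical):

```python
def make_lagged(series, n_lags, add_time_axis=True):
    """Build (X, y) records from a series: each X row is
    [y(t-1), ..., y(t-n_lags)], optionally followed by the time index."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        row = [series[t - k] for k in range(1, n_lags + 1)]
        if add_time_axis:
            row.append(t + 1)        # simple 1, 2, 3, ... time axis
        X.append(row)
        y.append(series[t])
    return X, y

X, y = make_lagged(list(range(10, 30)), n_lags=9)
print(len(X), len(X[0]), y[0])   # 11 10 19
```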

5.2.2 Partitioning the Data

Each time series was divided into a training set and a test set. Three partitioning methods were considered: first, assigning the records with large variance to the training set and those with relatively small variance to the test set (here the variance means the variance of y(t-1), y(t-2), ..., y(t-k), where k is the number of independent variables); second, using the most recent data as the test set and the older data as the training set; and third, random assignment. For GMDH the first method was used as a rule (only MNM70 and MNI22 used random assignment). For the neural network the second method was used.

Partitioning also requires setting the ratio of the test set to the training set. For series with many observations (more than 70), 35% were used as the test set; for smaller series, 50% or 60%.
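The first partitioning rule can be sketched as follows (illustrative Python; a record's "variance" is computed over its lag values, as defined above):

```python
import numpy as np

def variance_split(X, y, test_frac=0.35):
    """Assign high-variance records to the training set and the
    lowest-variance records to the test set (the first rule above)."""
    variances = np.var(np.asarray(X, dtype=float), axis=1)
    order = np.argsort(variances)              # ascending record variance
    n_test = int(round(len(y) * test_frac))
    test_idx = order[:n_test]                  # lowest variance -> test set
    train_idx = order[n_test:]
    return train_idx, test_idx

X = [[1, 1, 1], [0, 5, 10], [2, 2, 3], [0, 9, 0]]
train_idx, test_idx = variance_split(X, [0, 1, 2, 3], test_frac=0.5)
print(sorted(test_idx.tolist()))   # the two lowest-variance records: [0, 2]
```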

5.2.3 Limiting Generations and the Elimination Threshold

As generation replacement proceeds, the number of offspring grows. For example, with 10 independent variables (including the time axis), the second generation produces 10·9/2 = 45 new offspring, and the third generation produces at most 45·44/2 = 990. Overfitting appeared frequently as generations accumulated, so the number of generations was limited in proportion to the number of observations: the search went up to the third generation for small series and up to the fourth generation for large ones.

Within a generation, offspring whose criterion value exceeded the threshold R were eliminated. The threshold R was set as a multiple of a quantity T computed from the data (N being the number of data records): 4.5T for the second generation, 3.5T for the third, and 2.5T and 1.5T for the fourth and fifth generations, respectively.


5.2.4 Preventing Overfitting

In theory, the division into training and test sets prevents overfitting. Nevertheless, when the future contains patterns absent from both the training and the test set, the fitted model sometimes reacted abnormally, producing extremely large absolute values, an overfitting phenomenon. <Figure 6> shows an example for the MNM6 series with 10 independent variables and a test-set share of 35%: the forecasts grow abnormally large over the forecast period (the rightmost 18 periods). With a 50% test-set share, however, the response was normal.

When overfitting occurred, model building was retried in several ways: reducing the number of generations searched, increasing the share of the test set, and switching the test-set assignment to random or most-recent data.

For reference, the MNM6 series has very pronounced seasonality. In the experiments, however, the data were used as given, without deseasonalizing; the neural network was likewise applied with the seasonality intact. Series with pronounced seasonality were also compared separately.

5.3. Applying the Neural Network

For the neural network method, the commercial package NeuroShell 2 was used. NeuroShell 2 leaves several choices to the user.

5.3.1 Choosing the Numbers of Input and Hidden Nodes

Applying a back-propagation neural network requires choosing the numbers of input and hidden nodes. The number of input nodes was set equal to the number of inputs used in the GMDH method, and the number of hidden nodes was set smaller than the number of input nodes. A 9-7-1 structure was chosen for monthly data (10-7-1, using the time axis, when seasonality was present), and a 7-6-1 or 6-5-1 structure for quarterly data.

5.3.2 Partitioning into Training and Test Sets

NeuroShell 2 recommends using 10-40% of the data as the test set. In these experiments the share was chosen within that range, and the most recent data were used as the test set.

5.3.3 Stopping Rule

The stopping rule is important when applying a neural network: stopping too late can overfit, and stopping too early leaves learning incomplete. To set an appropriate stopping point, NeuroShell 2 provides a feature called "NET-PERFECT". After every 200 learning events on the training set (or any specified number), the test set is applied, and if its error (mean squared error) has decreased, the current training result is saved. NeuroShell recommends stopping if the test-set mean squared error has not decreased after a further 20,000-40,000 learning events. In these experiments, training was stopped when no improvement appeared after waiting 100,000 or more events; two to three minutes on a Pentium III PC was sufficient.

5.3.4 Other Settings

<Figure 7> shows the NeuroShell training screen. The network's "Complexity" was set to "Very simple" (which sets learning rate = 0.6 and momentum = 0.9).

When the results were poor, "Complex" (learning rate = 0.1, momentum = 0.1) was tried instead. Pattern selection during training was set to "Rotational", training was saved whenever the test-set result improved, and the Net-Perfect interval was set to 200.

NeuroShell finds the maximum and minimum values in the data, assumes the values vary within +/-(5%-10%) of that range, and scales them into [0, 1] before use. This preprocessing served to prevent abnormal responses in the fitted neural network model. (Because no such feature was added during the GMDH programming, abnormal responses appeared frequently there.) The activation function of each neuron was left at the default sigmoid logistic function.
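The preprocessing described can be sketched as follows (an illustrative approximation of the behaviour described here, not NeuroShell's actual code):

```python
def scale_with_margin(values, margin=0.05):
    """Min-max scale to [0, 1], first widening the observed range by a
    margin on each side so that slightly out-of-range future values
    still map inside [0, 1]."""
    lo, hi = min(values), max(values)
    span = hi - lo
    lo -= span * margin
    hi += span * margin
    return [(v - lo) / (hi - lo) for v in values]

s = scale_with_margin([10.0, 20.0, 30.0], margin=0.05)
print([round(v, 4) for v in s])   # [0.0455, 0.5, 0.9545]
```

Widening the range is what keeps extrapolated inputs from saturating the sigmoid, which is the "abnormal response" protection noted above.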

5.4. An Example Experiment

MNM33 was chosen as an example. It consists of 80 observations plus data for an additional 18-month forecast period, and it shows strong seasonality. Applying GMDH and the neural network to the data as given, without any adjustment for seasonality, produced the results shown in the figure below.

As the figure shows, GMDH tracks the series more closely than the neural network. The MAPE of GMDH is 9.84% (see b.MAPE(%) in <Table 5>), while the MAPE of the neural network is 25.19% (see c.MAPE(%) in <Table 5>). However, when GMDH fed its own forecasts back in as inputs, the MAPE rose to 21.10%. Because MNM33 is strongly seasonal, GMDH and the neural network were also applied after deseasonalizing, giving MAPEs of 9.93% and 12.8% respectively, so GMDH still came out slightly ahead. For reference, <Table 4> lists the Ivakhnenko coefficients of the GMDH forecasting model for MNM33.

5.5 Comparison of MAPE

<Table 5> below summarizes how GMDH was applied, the NN structure, and the resulting MAPE for each series.


Forecasts were made over horizons of 18 periods for monthly series and 8 periods for quarterly series, and the MAPE of each was computed. The resulting errors (MAPE) were then tested statistically. A paired t-test showed that when the actual value of the previous period was used as input, the MAPE of the GMDH method was smaller than that of the neural network at the 10% significance level (right side of the table, p-value = 0.096). In this sense GMDH can be said to outperform the neural network. However, comparing the MAPE of GMDH with its own forecasts fed back in against the MAPE of the neural network fed actual values gave no conclusion of a difference (left side of the table, p-value = 0.708).

Separately, five strongly seasonal series were compared after removing the seasonality. Here too GMDH came out somewhat ahead: a paired t-test showed the GMDH method to be superior (p-value = 0.043).


VI. Conclusion

This paper has shown that the GMDH method can be applied to time series forecasting. Earlier research showed that neural networks can be applied to time series forecasting and perform well. Since anything built as a neural network model can also be built as a GMDH model, the theoretical conjecture that GMDH can also forecast time series was tested experimentally. The results show that GMDH is as usable as the neural network method. In particular, comparing MAPE errors when the actual data of the forecast period were used as inputs to predict the next value, a paired t-test showed the errors of the GMDH method to be smaller than those of the neural network method at the 10% significance level. This does not allow the conclusion that GMDH is always superior to neural networks in time series forecasting, but it does show that GMDH can be used in time series forecasting as capably as conventional statistical methods or neural networks.

The advantage of GMDH is that it provides an explicit functional expression, which a neural network cannot give. Thus, when a model has been built with a neural network, GMDH can be used to express it as a functional formula. Moreover, whereas a neural network requires the values of all input variables, GMDH needs only the values of the input variables it has selected, so it uses the data more economically. The ability to run a preliminary test with GMDH before applying a neural network is another substantial advantage.

Compared with a neural network, a model built by GMDH runs the risk of reacting sensitively to the given data and producing wildly wrong results. In short, GMDH builds a more refined forecasting model from fewer input variables than a neural network, but it is more fragile. Nevertheless, it can serve as an excellent benchmark before neural networks or conventional statistical methods are applied.

The neural network and GMDH methods need to be compared over more cases than were examined here. Although this study could not cover a sufficient number of cases, it confirmed that the GMDH method is as useful as the neural network method in time series forecasting, and it showed indirectly that GMDH is worth using alongside conventional statistical methods.

References

1. Chryssolouris, G., and M. Guillot, "A Comparison of Statistical and AI Approaches to the Selection of Process Parameters in Intelligent Machining," Transactions of the ASME, Journal of Engineering for Industry, 112(2), 1990, 122-131.

2. Farlow, S.J. (Ed.), Self-Organizing Methods in Modeling, Marcel Dekker, New York, 1984.

3. Foster, B., F. Collopy, and L. Ungar, "Neural Network Forecasting of Short, Noisy Time Series," Computers and Chemical Engineering, 16(12), 1992.

4. Funahashi, K., "On the Approximate Realization of Continuous Mappings by Neural Networks," Neural Networks, 2 (1989), 183-192.

5. Hill, T., M. O'Connor, and W. Remus, "Neural Network Models for Time Series Forecasts," Management Science, 42(7), July 1996.

6. Ikeda, Saburo, "Nonlinear Prediction Models for River Flows and Typhoon Prediction by Self-Organizing Methods," in Farlow, S.J. (Ed.), Self-Organizing Methods in Modeling, Marcel Dekker, New York, 1984.

7. Madala, Hema R., and Alex G. Ivakhnenko, Inductive Learning Algorithms for Complex Systems Modeling, CRC Press, 1994.

8. Makridakis, S., A. Andersen, et al., "The Accuracy of Extrapolation (Time Series) Methods: Results of a Forecasting Competition," Journal of Forecasting, 1, 1982.

9. Nagasaka, K., Y. Kita, and F. Hashimoto, "Identification of a Model of Grinding Wheel Life by Group Method of Data Handling," Wear, 58, 1980, 147-154.

10. Ravindra, H.V., M. Raghunandan, Y.G. Srinivasa, and R. Krishnamurthy, "Tool Wear Estimation by Group Method of Data Handling in Turning," International Journal of Production Research, 32(6), 1994, 1295-1312.

11. Scott, D.E., and C.E. Hutchinson, "An Application of the GMDH Algorithm to Economic Modeling," in Farlow, S.J. (Ed.), Self-Organizing Methods in Modeling, Marcel Dekker, New York, 1984.

12. Sharda, R., and R. Patil, "Neural Networks as Forecasting Experts: An Empirical Test," Proc. 1990 IJCNN Meeting, 2 (1990), 491-494.

13. White, H., "Connectionist Nonparametric Regression: Multilayer Feedforward Networks Can Learn Arbitrary Mappings," in H. White(Ed.), Artificial Neural Networks: Approximations and Learning Theory, Blackwell, Oxford, UK, 1992a

14. White, H., "Consequences and Detection of Misspecified Nonlinear Regression Models," in H. White (Ed.), Artificial Neural Networks: Approximations and Learning Theory, Blackwell, Oxford, UK, 1992b.

15. Yashida, T., K. Nagasaka, Y. Kita, and F. Hashimoto, "Identification of a Grinding Wheel Wear Equation of the Abrasive Cut-off by the Modified GMDH," International Journal of Machine Tool Design and Research, 26(3), 1986, 283-292.

16. Ohashi, Kenichi, "GMDH Forecasting of U.S. Interest Rates," in Farlow, S.J. (Ed.), Self-Organizing Methods in Modeling, Marcel Dekker, New York, 1984.

<Abstract>

A Comparison of Neural Networks and GMDH in Time Series Forecasts

Hansik Song

This study shows that GMDH (group method of data handling) is good enough to be used in time series forecasting.

Hill et al. (1996) reported that neural networks forecast as well as or better than statistical methods. Since neural networks and GMDH are both self-organizing methods in systems modelling, we can conjecture that GMDH is also as good as or better than statistical methods in time series forecasting. For this reason, instead of comparing GMDH with statistical methods, this study compared GMDH with the neural network method.

A Macro Basic program was developed for GMDH and used for the comparison. The NeuroShell 2.0 program by Ward Systems was used for the neural network method. Before the comparison, both methods were tested on an exact-data case: finding the degrees of freedom of the chi-square distribution from chi-square values. GMDH found the degrees of freedom very accurately over the whole range, but the neural network model did so with growing errors as the degrees of freedom became larger.

24 time series from the "M-competition" were used for the comparison. Statistical tests showed that the MAPEs of the GMDH forecasts were smaller than those of the neural network method when actual data were input to forecast the next period's value. And in the sampled cases having seasonal fluctuations, GMDH was better than the neural network method.

GMDH has the advantage that it can express the forecasting polynomial explicitly. This advantage is useful when GMDH is used in time series forecasting together with other methods.