セツ ギョクケツ


Affiliation

Faculty of Science and Engineering, Waseda Research Institute for Science and Engineering

Position

Junior Researcher (Assistant Professor)

 

Grants for Special Research Projects

  • Model Selection of Local Linear Regression with Dependent Disturbances for Large Datasets

    2019


    It is commonly believed that the more information we obtain, the better the decisions we can make. This, however, rests on the premise that our capacity to handle large datasets grows accordingly. This research provides an approach for finding an approximate relationship among variables in large datasets when that relationship varies over time and can be far more complicated than a linear one (i.e., highly nonlinear), so that governments and enterprises can obtain an accurate overview. The nonparametric method known as local linear regression is applied to high-dimensional problems together with the LASSO, which shrinks the coefficients and is widely used for model selection, and the case of dependent disturbances is considered. It is shown that, when the dimension of the parameter is fixed, the bias and variance of the estimator converge to 0 under certain conditions as the sample size increases and the bandwidth decreases. When the dimension of the parameter grows with the sample size, the probability of correct selection converges to 1 under certain conditions.
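
    As a purely illustrative sketch (not the procedure developed in this research), the following Python snippet fits a kernel-weighted LASSO around a single time point with scikit-learn; the synthetic data, the Gaussian kernel, the bandwidth h and the penalty lam are all assumptions chosen for illustration.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p = 500, 20                        # sample size and number of candidate variables
    t = np.linspace(0.0, 1.0, n)          # rescaled time index
    X = rng.standard_normal((n, p))

    # AR(1) noise as a simple stand-in for dependent disturbances.
    eps = np.zeros(n)
    for i in range(1, n):
        eps[i] = 0.5 * eps[i - 1] + 0.3 * rng.standard_normal()

    # Only the first two variables are relevant, with time-varying coefficients.
    y = np.sin(2 * np.pi * t) * X[:, 0] + (1.0 + t) * X[:, 1] + eps

    def local_lasso(t0, h=0.1, lam=0.05):
        """Kernel-weighted (local) LASSO fit centred at time t0."""
        w = np.exp(-0.5 * ((t - t0) / h) ** 2)    # Gaussian kernel weights
        model = Lasso(alpha=lam)
        model.fit(X, y, sample_weight=w)          # weighted L1-penalised least squares
        return model.coef_

    coef = local_lasso(t0=0.25)
    print("selected variables at t0 = 0.25:", np.flatnonzero(np.abs(coef) > 1e-8))

    Here the nonzero coefficients returned at each time point play the role of the locally selected model; repeating the fit over a grid of t0 values traces how the selected variables change over time.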

  • Model Selection for Quantile Regression and its Applications

    2018


    The least absolute shrinkage and selection operator (LASSO), proposed by Tibshirani (1996), is a popular technique for model selection and estimation in linear regression models; it has been shown to select the correct subset of relevant variables efficiently. So far, the literature on the LASSO has mainly focused on short-memory disturbances or variables, meaning that the covariances of a discrete-time stationary stochastic sequence decrease to zero as the lag tends to infinity and their absolute sum converges. In hydrology, economics and other sciences, however, long-memory sequences arise, for which the absolute sum of the covariances diverges. This research therefore applies the LASSO to the linear quantile regression model with long-memory disturbances. When the dimension of the parameter is fixed, the asymptotic distribution of the modified LASSO estimators is derived under certain natural regularity conditions. Furthermore, when the dimension of the parameter increases with the observation length, consistency of the selection of relevant variables is shown: under certain regularity conditions, the probability of correct selection converges to 1 as the observation length goes to infinity.
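
    As a purely illustrative sketch of LASSO-type variable selection in quantile regression (not the modified estimator studied here), the snippet below uses scikit-learn's QuantileRegressor, whose alpha parameter is an L1 penalty; the synthetic AR(1) disturbances are only a short-memory stand-in for the long-memory errors considered in the research.

    import numpy as np
    from sklearn.linear_model import QuantileRegressor

    rng = np.random.default_rng(1)
    n, p = 400, 15
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:3] = [1.5, -2.0, 0.8]           # only the first three variables are relevant

    # AR(1) disturbances: serially dependent, but short-memory.
    eps = np.zeros(n)
    for i in range(1, n):
        eps[i] = 0.7 * eps[i - 1] + rng.standard_normal()
    y = X @ beta + eps

    # quantile=0.5 gives median regression; alpha is the L1 (LASSO-type) penalty.
    model = QuantileRegressor(quantile=0.5, alpha=0.05, solver="highs")
    model.fit(X, y)
    print("estimated coefficients:", np.round(model.coef_, 2))
    print("selected variables:", np.flatnonzero(np.abs(model.coef_) > 1e-8))

    In this toy setting the penalised median fit should keep the three relevant variables and shrink the coefficients of the irrelevant ones towards zero, which is the selection behaviour whose consistency the research establishes under long-memory disturbances.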