Presentation SICEAMS 2024
Assistant Professor
Department of Mathematics
Brainware University, Barasat
West Bengal, India
SICEAMS 2024
Department of Mathematics
University of Gour Banga
Malda 732103, West Bengal, India
Outline
1 Introduction
2 Objective
3 Methodology
4 Results
5 Data Analysis
6 Conclusion
Introduction
\[
Y = \beta_1 X_1 + \cdots + \beta_p X_p + g(W_1, \ldots, W_q) + \epsilon
\]
The assumptions on $\epsilon$ are
(i) $E(\epsilon \mid X, W) = 0$, (ii) $E(\epsilon^2 \mid X, W) = \sigma^2(X, W) > 0$,
where $(X, W) = (X_1, \ldots, X_p, W_1, \ldots, W_q)$.
The $p$ parametric regressors $X_1, \ldots, X_p$ explain the response
variable $Y$ through the linear function $\beta_1 X_1 + \cdots + \beta_p X_p$.
The $q$ nonparametric regressors $W_1, \ldots, W_q$ explain $Y$
through an unknown regression function $g(W_1, \ldots, W_q)$,
which is assumed to be Lipschitz continuous.
The parameters $\beta_1, \ldots, \beta_p$ are usually estimated by Robinson's
(1988) method. The nonparametric regression function
$g(W_1, \ldots, W_q)$ is usually estimated by kernel smoothing
(Nadaraya--Watson-type) techniques.
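As a quick illustration, here is a minimal Python sketch that simulates data from this model; the dimensions, the coefficients $\beta$, the function $g$, and the error distribution are all choices made only for the example, not quantities from this work.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p, q = 500, 2, 1                  # sample size and dimensions (assumed for illustration)
beta = np.array([1.5, -0.7])         # "true" parametric coefficients (assumed)

def g(W):
    """An arbitrary Lipschitz-continuous choice of g (assumed for illustration)."""
    return np.sin(2 * np.pi * W[:, 0])

X = rng.normal(size=(n, p))          # parametric regressors X_1, ..., X_p
W = rng.uniform(size=(n, q))         # nonparametric regressors W_1, ..., W_q
eps = rng.normal(scale=0.5, size=n)  # errors satisfying E(eps | X, W) = 0
Y = X @ beta + g(W) + eps            # the partially linear model above
```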
Objective
We next perform a statistical test of independence between $(X, W)$ and $\epsilon$, i.e.,
\[
H_0: (X, W) \perp\!\!\!\perp \epsilon \quad \text{against} \quad H_1: (X, W) \not\perp\!\!\!\perp \epsilon.
\]
We shall use Spearman's $\rho_S$ and Kendall's $\tau$ to carry out this test of hypothesis.
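For a first, informal look at this hypothesis, the sketch below computes Spearman's $\rho_S$ and Kendall's $\tau$ between estimated residuals and each regressor with SciPy; this is only a pairwise baseline check, not the test statistic developed in this work. The residuals `eps_hat` are assumed to come from a fitted model (e.g. the Robinson-type estimator sketched in the Methodology section), and `X`, `W` from the simulation above.

```python
from scipy.stats import spearmanr, kendalltau

# eps_hat: estimated residuals from a fitted model (assumed available);
# X, W: regressors as in the simulated example above.
for name, column in [("X1", X[:, 0]), ("X2", X[:, 1]), ("W1", W[:, 0])]:
    rho_s, p_rho = spearmanr(column, eps_hat)
    tau, p_tau = kendalltau(column, eps_hat)
    print(f"{name}: Spearman rho_S = {rho_s:.3f} (p = {p_rho:.3f}), "
          f"Kendall tau = {tau:.3f} (p = {p_tau:.3f})")
```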
Methodology
Estimation of β and g
By Robinson's (1988) method, $\beta$ is estimated as
\[
\hat{\beta} = \big(\hat{\epsilon}_{XW}^{T} \hat{\epsilon}_{XW}\big)^{-1} \big(\hat{\epsilon}_{XW}^{T} \hat{\epsilon}_{YW}\big),
\]
where $\hat{\epsilon}_{YW} = Y - \hat{g}_Y(W)$, $\hat{\epsilon}_{XW} = X - \hat{g}_X(W)$, and
\[
\hat{g}_Y(\mathbf{w}) =
\frac{\dfrac{1}{n} \displaystyle\sum_{i=1}^{n} \Big\{ \prod_{j=1}^{q} \dfrac{1}{h_j}\, k_j\Big(\dfrac{w_j - W_{ij}}{h_j}\Big) \Big\} Y_i}
{\dfrac{1}{n} \displaystyle\sum_{i=1}^{n} \Big\{ \prod_{j=1}^{q} \dfrac{1}{h_j}\, k_j\Big(\dfrac{w_j - W_{ij}}{h_j}\Big) \Big\}},
\qquad
\hat{g}_X(\mathbf{w}) =
\frac{\dfrac{1}{n} \displaystyle\sum_{i=1}^{n} \Big\{ \prod_{m=1}^{p} X_{im} \Big\} \Big\{ \prod_{j=1}^{q} \dfrac{1}{h_j}\, k_j\Big(\dfrac{w_j - W_{ij}}{h_j}\Big) \Big\}}
{\dfrac{1}{n} \displaystyle\sum_{i=1}^{n} \Big\{ \prod_{j=1}^{q} \dfrac{1}{h_j}\, k_j\Big(\dfrac{w_j - W_{ij}}{h_j}\Big) \Big\}}.
\]
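A minimal NumPy sketch of this double-residual estimator is given below; the Gaussian product kernel, the rule-of-thumb bandwidths, and the column-wise smoothing of $X$ are assumptions made only for the illustration.

```python
import numpy as np

def nw_smooth(target, W, h):
    """Nadaraya-Watson fit of E(target | W) at the sample points, using a product
    Gaussian kernel with bandwidth vector h (kernel and bandwidths are assumed choices)."""
    U = (W[:, None, :] - W[None, :, :]) / h     # scaled differences, shape (n, n, q)
    K = np.exp(-0.5 * U ** 2).prod(axis=2)      # product kernel weights, shape (n, n)
    weights = K / K.sum(axis=1, keepdims=True)  # row-normalised weights
    return weights @ target                     # works for (n,) and (n, p) targets

def robinson_beta(Y, X, W, h):
    """Robinson-type double-residual estimate of beta, plus the final residuals."""
    eps_YW = Y - nw_smooth(Y, W, h)             # Y minus its kernel fit on W
    eps_XW = X - nw_smooth(X, W, h)             # each column of X minus its kernel fit on W
    beta_hat, *_ = np.linalg.lstsq(eps_XW, eps_YW, rcond=None)
    return beta_hat, eps_YW - eps_XW @ beta_hat

# Example usage with the simulated data above and a rule-of-thumb bandwidth (assumed):
h = 1.06 * W.std(axis=0) * len(Y) ** (-1 / (4 + W.shape[1]))
beta_hat, eps_hat = robinson_beta(Y, X, W, h)
```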
Construction of hypotheses
Finally, $H_0: \hat{Y}^*(r) \perp\!\!\!\perp Y^*(r)$ against $H_1: \hat{Y}^*(r) \not\perp\!\!\!\perp Y^*(r)$.
The contiguous sequence of alternative hypotheses $H_n$ is constructed using Le Cam's (1960) first lemma.
Test Statistics
Under $H_0$,
\[
\sqrt{n}\big(T_n^{<r>} - E_{H_0}(T_n^{<r>})\big) \xrightarrow{\;L\;} N\big(0,\, 4\xi_1(r)\big),
\]
provided $E\big[h^2\big((\hat{Y}_1^*(r), Y_1^*(r)), (\hat{Y}_2^*(r), Y_2^*(r))\big)\big] < \infty$, where
\[
\xi_1(r) = \operatorname{Var}\Big[E\Big(h\big((\hat{Y}_1^*(r), Y_1^*(r)), (\hat{Y}_2^*(r), Y_2^*(r))\big) \,\Big|\, (\hat{Y}_1^*(r), Y_1^*(r))\Big)\Big].
\]
Under $H_n$,
\[
\sqrt{n}\big(T_n^{<r>} - E_{H_0}(T_n^{<r>})\big) \xrightarrow{\;L\;} N\big(\Upsilon^{(r)},\, 4\xi_1(r)\big),
\]
where
\[
\Upsilon^{(r)} = \lim_{n \to \infty} \operatorname{Cov}_{H_0}\!\left(\sqrt{n}\big(T_n^{<r>} - E_{H_0}(T_n^{<r>})\big),\, \log \frac{d\tilde{G}_n}{dG_0}\right).
\]
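In practice, $\xi_1(r)$ can be estimated empirically by averaging the kernel $h$ over one argument and taking the sample variance of those averages; a generic sketch is below, where the kernel `h` and the rows of `Z` (standing for the pairs $(\hat{Y}_i^*(r), Y_i^*(r))$) are placeholders, since their exact form is specified in the full work.

```python
import numpy as np

def xi1_estimate(Z, h):
    """Empirical estimate of xi_1 = Var[ E( h(Z_1, Z_2) | Z_1 ) ] for a symmetric
    U-statistic kernel h; Z has one row per observation (placeholder pairs)."""
    n = len(Z)
    # proj[i] averages h(Z_i, Z_j) over j != i, estimating E(h(Z_i, Z_2) | Z_i)
    proj = np.array([
        np.mean([h(Z[i], Z[j]) for j in range(n) if j != i])
        for i in range(n)
    ])
    return np.var(proj, ddof=1)

# The asymptotic variance of sqrt(n) * (T_n - E_{H0}(T_n)) would then be
# approximated by 4 * xi1_estimate(Z, h).
```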
AARE$_\mu^{(r)}(V_n)$ (which involves a constant $D > 0$) increases as $\mu$ increases, for $n \to \infty$, and is independent of $r$.
[Figure: AARE(V_n) plotted against µ ∈ [0, 30] for D = 0.225, 0.575, 0.85, 1.]
Results
The asymptotic power curves of $S_n^{<r>}$ and $T_n^{<r>}$ against $\mu$ are depicted below.
[Figure: asymptotic power curves Power(ρ_S) (left panel) and Power(τ) (right panel) against µ ∈ [0, 30], for r = 2, 3, 4, 5, 10.]
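For a one-sided level-$\alpha$ test that rejects for large values of the statistic, the two normal limits above give asymptotic power $1 - \Phi\big(z_{1-\alpha} - \Upsilon^{(r)}/(2\sqrt{\xi_1(r)})\big)$; the sketch below shows how curves like those in the figure could be traced, with the drift `Upsilon(mu)` and the value of $\xi_1$ as illustrative placeholders rather than the quantities derived in this work.

```python
import numpy as np
from scipy.stats import norm

def asymptotic_power(mu, Upsilon, xi1, alpha=0.05):
    """Asymptotic power 1 - Phi( z_{1-alpha} - Upsilon(mu) / (2*sqrt(xi1)) ) of a
    one-sided level-alpha test based on the two normal limits above.
    Upsilon(mu) and xi1 are placeholders, not the quantities derived in this work."""
    z = norm.ppf(1 - alpha)
    return 1 - norm.cdf(z - Upsilon(mu) / (2 * np.sqrt(xi1)))

# Trace a power curve over mu in [0, 30] with an illustrative (assumed) linear drift.
mu_grid = np.linspace(0, 30, 121)
power_curve = asymptotic_power(mu_grid, Upsilon=lambda m: 0.1 * m, xi1=0.25)
```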
Data Analysis
[Figure: scatter plots of acidity against juiciness, ripeness, size, weight, sweetness, and crunchiness.]
References
Dhar, S. S., Dassios, A., and Bergsma, W. (2018). Testing independence of covariates and errors in nonparametric regression. Scandinavian Journal of Statistics, 45, 421-443.
Hamilton, S. A., and Truong, Y. K. (1997). Local linear estimation in partly linear models. Journal of Multivariate Analysis, 60(1), 1-19.
Hájek, J., Šidák, Z., and Sen, P. K. (1999). Theory of Rank Tests. Academic Press.
Lévy, P. (1939). Sur la division d'un segment par des points choisis au hasard. CR Acad. Sci. Paris, 208, 147-149.
Li, Q. (2000). Efficient estimation of additive partially linear models. International Economic Review, 41(4), 1073-1092.
Liu, Z., and Lu, X. (1997). Root-n-consistent semiparametric estimation of partially linear models based on k-nn method. Econometric Reviews, 16(4), 411-420.
Robinson, P. M. (1988). Root-N-consistent semiparametric regression. Econometrica, 56(4), 931-954.
Van der Vaart, A. (2002). The statistical work of Lucien Le Cam. The Annals of Statistics, 30(3), 631-682.
Wang, L., Brown, L. D., and Cai, T. T. (2011). A difference based approach to the semiparametric partial linear model.
Zhou, Z., Mentch, L., and Hooker, G. (2021). V-statistics and variance estimation. Journal of Machine Learning Research, 22(287), 1-48.
Thank You