@@ -125,7 +125,7 @@ To install from source, see [For Developers](#for-developers) section below.
 ### Estimation Methods
 
 <details>
-  <summary><a href="#references">Double Machine Learning</a> (click to expand)</summary>
+  <summary>Double Machine Learning (click to expand)</summary>
 
   * Linear final stage
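For context on the Double Machine Learning entry above, a minimal residual-on-residual sketch of the idea, using plain scikit-learn on simulated data; this illustrates the technique only and is not the EconML API:

```python
# Double ML sketch: residualize Y and T on X with flexible learners,
# then regress the Y-residuals on the T-residuals (the linear final stage).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))
T = X[:, 0] + rng.normal(size=n)            # treatment confounded by X
Y = 2.0 * T + X[:, 0] + rng.normal(size=n)  # true effect = 2.0

# Cross-fitted nuisance predictions to avoid overfitting bias
t_hat = cross_val_predict(GradientBoostingRegressor(), X, T, cv=5)
y_hat = cross_val_predict(GradientBoostingRegressor(), X, Y, cv=5)

final = LinearRegression().fit((T - t_hat).reshape(-1, 1), Y - y_hat)
print(final.coef_[0])  # close to the true effect of 2.0
```

The cross-fitting step is what makes the final-stage coefficient robust to regularization bias in the two nuisance models.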
@@ -170,10 +170,11 @@ To install from source, see [For Developers](#for-developers) section below.
   # Confidence intervals via Bootstrap-of-Little-Bags for forests
   lb, ub = est.effect_interval(X_test, alpha=0.05)
   ```
+
 </details>
 
 <details>
-  <summary><a href="#references">Orthogonal Random Forests</a> (click to expand)</summary>
+  <summary>Orthogonal Random Forests (click to expand)</summary>
 
   ```Python
   from econml.ortho_forest import ContinuousTreatmentOrthoForest
@@ -195,7 +196,7 @@ To install from source, see [For Developers](#for-developers) section below.
 <details>
-  <summary><a href="#references">Meta-Learners</a> (click to expand)</summary>
+  <summary>Meta-Learners (click to expand)</summary>
 
   * XLearner
@@ -239,7 +240,7 @@ To install from source, see [For Developers](#for-developers) section below.
 </details>
 
 <details>
-  <summary><a href="#references">Doubly Robust Learners</a> (click to expand)
+  <summary>Doubly Robust Learners (click to expand)
 </summary>
 
   * Linear final stage
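For context on the Doubly Robust Learners entry above, a minimal AIPW sketch of the doubly robust idea for a binary treatment: the average of the score below is consistent if either the outcome model or the propensity model is well specified. Plain scikit-learn on simulated data; an illustration of the technique, not the EconML API:

```python
# Doubly robust (AIPW) sketch: combine an outcome model and a propensity
# model so that misspecifying one of the two still yields a consistent ATE.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 4000
X = rng.normal(size=(n, 3))
T = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))     # confounded treatment
Y = 1.5 * T + X[:, 0] + rng.normal(size=n)          # true ATE = 1.5

# Propensity model and outcome models for each treatment arm
e = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
e = np.clip(e, 0.05, 0.95)                          # guard against extreme weights
mu1 = GradientBoostingRegressor().fit(X[T == 1], Y[T == 1]).predict(X)
mu0 = GradientBoostingRegressor().fit(X[T == 0], Y[T == 0]).predict(X)

# AIPW score: outcome-model contrast plus inverse-propensity correction
psi = mu1 - mu0 + T * (Y - mu1) / e - (1 - T) * (Y - mu0) / (1 - e)
print(psi.mean())  # close to the true ATE of 1.5
```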
@@ -283,24 +284,7 @@ lb, ub = est.effect_interval(X_test, alpha=0.05)
 </details>
 
 <details>
-  <summary><a href="#references">Orthogonal Instrumental Variables</a> (click to expand)</summary>
-
-  * Double Machine Learning IV
-
-  ```Python
-  from econml.ortho_iv import DMLIV
-  from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
-  from sklearn.linear_model import LinearRegression
-
-  est = DMLIV(model_Y_X=GradientBoostingRegressor(),
-              model_T_X=GradientBoostingClassifier(),
-              model_T_XZ=GradientBoostingClassifier(),
-              model_final=LinearRegression(),
-              discrete_instrument=False,
-              discrete_treatment=True)
-  est.fit(Y, T, Z, X)
-  treatment_effects = est.effect(X_test)
-  ```
+  <summary>Orthogonal Instrumental Variables (click to expand)</summary>
 
   * Intent to Treat Doubly Robust Learner (discrete instrument, discrete treatment)
@@ -320,7 +304,7 @@ lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals
 </details>
 
 <details>
-  <summary><a href="#references">Deep Instrumental Variables</a> (click to expand)</summary>
+  <summary>Deep Instrumental Variables (click to expand)</summary>
 
   ```Python
   import keras
@@ -349,6 +333,8 @@ treatment_effects = est.effect(X_test)
   ```
 </details>
 
+See the <a href="#references">References</a> section for more details.
+
 ### Interpretability
 * Tree Interpreter of the CATE model
   ```Python
@@ -467,6 +453,10 @@ M. Oprescu, V. Syrgkanis and Z. S. Wu.
 **Orthogonal Random Forest for Causal Inference.**
 [*Proceedings of the 36th International Conference on Machine Learning (ICML)*](http://proceedings.mlr.press/v97/oprescu19a.html), 2019.
 
+S. Künzel, J. Sekhon, P. Bickel and B. Yu.
+**Metalearners for estimating heterogeneous treatment effects using machine learning.**
+[*Proceedings of the National Academy of Sciences, 116(10), 4156-4165*](https://www.pnas.org/content/116/10/4156), 2019.
+
 V. Chernozhukov, D. Nekipelov, V. Semenova, V. Syrgkanis.
 **Plug-in Regularized Estimation of High-Dimensional Parameters in Nonlinear Semiparametric Models.**
 [*arXiv preprint arXiv:1806.04823*](https://arxiv.org/abs/1806.04823), 2018.