objective
Apply several transformers in parallel and concatenate their outputs column-wise.
minimal code
from sklearn.pipeline import FeatureUnion, Pipeline
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.datasets import make_regression
X, y = make_regression(n_samples=50, n_features=1, noise=0.1, random_state=0)
fu = FeatureUnion([("poly", PolynomialFeatures(2)), ("id", StandardScaler(with_mean=False))])  # both branches see X; outputs are stacked side by side
pipe = Pipeline([("feat", fu), ("lr", LinearRegression())]).fit(X, y)
print(hasattr(pipe, "predict"))
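A sketch of what the union actually produces: the output width is the sum of each branch's outputs. On a single input feature, `PolynomialFeatures(2)` yields 3 columns (bias, x, x²) and `StandardScaler` yields 1, so 4 columns total.

```python
from sklearn.pipeline import FeatureUnion
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=50, n_features=1, noise=0.1, random_state=0)
fu = FeatureUnion([("poly", PolynomialFeatures(2)), ("id", StandardScaler(with_mean=False))])
# Width = 3 (poly: bias, x, x^2) + 1 (scaler) = 4 columns, rows unchanged.
Z = fu.fit_transform(X)
print(Z.shape)  # → (50, 4)
```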
usage
from sklearn.pipeline import make_union
print(hasattr(make_union(StandardScaler(with_mean=False), PolynomialFeatures()), "fit"))
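`make_union` is shorthand that derives the step names from the lowercased class names, whereas `FeatureUnion` takes explicit `(name, transformer)` pairs. A quick check of the generated names:

```python
from sklearn.pipeline import make_union
from sklearn.preprocessing import StandardScaler, PolynomialFeatures

# Step names are auto-generated from the class names (lowercased).
fu = make_union(StandardScaler(with_mean=False), PolynomialFeatures())
print([name for name, _ in fu.transformer_list])
```

Explicit names (via `FeatureUnion`) are still preferable when you need stable keys for `set_params` / grid search.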
useful variant(s)
from sklearn.preprocessing import FunctionTransformer
print(hasattr(FunctionTransformer(lambda X: X), "fit"))
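`FunctionTransformer` wraps any callable as a stateless transformer, which makes it handy as a union branch. A minimal sketch concatenating raw features with their `log1p` (the array `X` here is illustrative):

```python
import numpy as np
from sklearn.pipeline import FeatureUnion
from sklearn.preprocessing import FunctionTransformer

X = np.arange(6, dtype=float).reshape(3, 2)  # toy data for illustration
fu = FeatureUnion([
    ("raw", FunctionTransformer()),        # no func given -> identity
    ("log", FunctionTransformer(np.log1p)),
])
Z = fu.fit_transform(X)
print(Z.shape)  # 2 raw + 2 log1p = 4 columns
```

Note that a bare `FunctionTransformer()` is the identity, so there is no need for `lambda X: X` (lambdas also break pickling of the pipeline).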
notes
- Make sure all transformers output arrays with the same number of rows: FeatureUnion stacks their outputs column-wise (hstack).
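One compatibility detail worth checking: if any branch returns a sparse matrix, the union's result is sparse (which is why `with_mean=False` is used above, since centering would densify the data). A small sketch, with `scipy.sparse.csr_matrix` used as the wrapped callable for illustration:

```python
import numpy as np
import scipy.sparse as sp
from sklearn.pipeline import FeatureUnion
from sklearn.preprocessing import FunctionTransformer

X = np.arange(4, dtype=float).reshape(2, 2)  # toy data
fu = FeatureUnion([
    ("dense", FunctionTransformer()),            # returns a dense array
    ("sparse", FunctionTransformer(sp.csr_matrix)),  # returns a sparse matrix
])
# One sparse branch is enough to make the stacked result sparse.
Z = fu.fit_transform(X)
print(sp.issparse(Z))
```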