Well, I found the solution myself, but I'll leave the question posted in case somebody runs into the same problem.
The regularization strength Lambda was, by default, far too low to "kick out" any of my features (the default is Lambda = 1/n, with n being the training sample size). By setting it manually to a much higher value, I got the result I expected initially: the binary model's parameter vector Beta gets a lot more zeros as Lambda increases, which suggests those features are not preferred by the lasso regularization.
Now I can get an overview of all the binary learners to see whether some features are generally preferred or kicked out.
t = templateLinear('Regularization', 'lasso', 'Lambda', 0.1);
This Lambda works fine for me, but that does not mean it will work equally well on a different data set.
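For context, here is a minimal sketch of how such a template could be plugged into an ECOC model to compare the zeroed-out coefficients across the binary learners. The variables X (predictor matrix) and Y (class labels), as well as the Lambda value, are placeholders, not part of the original answer:

t = templateLinear('Regularization', 'lasso', 'Lambda', 0.1);
Mdl = fitcecoc(X, Y, 'Learners', t);

% Collect the coefficient vectors of all binary learners into one matrix:
% one row per feature, one column per binary learner.
Beta = cell2mat(cellfun(@(m) m.Beta, Mdl.BinaryLearners', ...
    'UniformOutput', false));

% A feature whose row is all zeros was dropped by every binary learner.
droppedFeatures = find(all(Beta == 0, 2));

Scanning the rows of Beta this way gives the overview mentioned above: features that are zero in most columns are the ones lasso consistently removes.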
Have a nice day! :)