A version of the Naive Bayes Classifier that incorporates differently distributed features is entirely possible to build, and there is nothing nonsensical about combining categorical and ordered (potentially non-Gaussian) features. Both suggestions in the article you linked to are feasible, and discretization is common, but neither is necessary: one can implement a variant of the Naive Bayes Classifier that supports categorical and ordered features simultaneously, without resorting to those tricks.
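To make that concrete, here is a minimal sketch of one way to do it (this is illustrative, not my packaged implementation linked at the end; the class name, the Laplace smoothing, and the use of a kernel density estimate for continuous features are just one reasonable set of choices):

```python
import numpy as np
from scipy.stats import gaussian_kde


class MixedNaiveBayes:
    """Naive Bayes over a mix of categorical and continuous features.

    Categorical likelihoods: Laplace-smoothed relative frequencies.
    Continuous likelihoods: kernel density estimates, so no Gaussian
    assumption is imposed on the continuous features.
    """

    def __init__(self, categorical_idx):
        self.categorical_idx = set(categorical_idx)

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=object), np.asarray(y)
        self.classes_ = np.unique(y)
        self.log_priors_ = {c: np.log(np.mean(y == c)) for c in self.classes_}
        # Value sets per categorical feature, used for Laplace smoothing.
        self.values_ = {j: np.unique(X[:, j]) for j in self.categorical_idx}
        self.feature_models_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            models = []
            for j in range(X.shape[1]):
                col = Xc[:, j]
                if j in self.categorical_idx:
                    vals, counts = np.unique(col, return_counts=True)
                    denom = len(col) + len(self.values_[j])
                    probs = {v: 1.0 / denom for v in self.values_[j]}
                    probs.update({v: (k + 1.0) / denom
                                  for v, k in zip(vals, counts)})
                    models.append(("cat", probs))
                else:
                    # Note: gaussian_kde needs at least two distinct points
                    # per class; a real implementation would handle that case.
                    models.append(("kde", gaussian_kde(col.astype(float))))
            self.feature_models_[c] = models
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=object)
        preds = []
        for x in X:
            def log_posterior(c):
                lp = self.log_priors_[c]
                for j, (kind, m) in enumerate(self.feature_models_[c]):
                    if kind == "cat":
                        lp += np.log(m.get(x[j], 1e-12))  # unseen-value guard
                    else:
                        lp += np.log(max(m(float(x[j]))[0], 1e-300))
                return lp
            preds.append(max(self.classes_, key=log_posterior))
        return np.array(preds)


# Toy usage: feature 0 is categorical, feature 1 is continuous.
X = [["red", 1.2], ["blue", 0.7], ["red", 1.5],
     ["blue", 0.4], ["red", 1.1], ["blue", 0.6]]
y = ["a", "b", "a", "b", "a", "b"]
clf = MixedNaiveBayes(categorical_idx=[0]).fit(X, y)
print(clf.predict([["red", 1.3]]))  # -> ['a']
```

Each class gets its own per-feature model, and prediction just sums log-likelihoods across features plus the log prior, exactly as in the standard all-categorical case.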
However, when dealing with continuous features, data scientists typically reach for other classifiers, such as support vector machines or logistic regression, rather than the Naive Bayes Classifier, which was originally designed to handle categorical features. That is likely why you've had trouble finding what you're looking for.
A while ago I found myself in the same boat, and I ended up making a project out of filling the online void of generalized Naive Bayes Classifier implementations. As part of a paper I wrote for that project, I compiled a mathematical justification for the simultaneous use of categorical, discrete, and arbitrarily distributed continuous features with the Naive Bayes Classifier. I've pasted the justification below.
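In outline (condensing things considerably): by Bayes' theorem and the naive conditional-independence assumption, the predicted class is

$$\hat{c} = \underset{c}{\operatorname{argmax}}\; P(C = c) \prod_{i=1}^{n} p(x_i \mid C = c),$$

where each factor $p(x_i \mid C = c)$ is a probability mass function when feature $i$ is categorical or discrete, and a probability density function when it is continuous. Mixing masses and densities in one product is legitimate because, for a continuous feature, $P\left(x_i - \tfrac{\varepsilon}{2} \le X_i \le x_i + \tfrac{\varepsilon}{2} \mid C = c\right) \approx \varepsilon\, f_i(x_i \mid C = c)$ for small $\varepsilon$, and the interval width $\varepsilon$ is the same for every class, so it cancels out of the $\operatorname{argmax}$. Each density $f_i$ can then be estimated however you like (e.g. by kernel density estimation), which is what frees you from the Gaussian assumption.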
In case you're looking for an implementation of such a model, my Python implementation of the Naive Bayes Classifier based on the above math is on GitHub here.