I feel myself wanting to agree with you, but then a little voice asks: when exactly were Evangelicals "moral"? And what does that word even mean?
I grew up in a clannish, authoritarian environment where I don't believe anyone was ever really interested in me or loved me, my own family included. The idea of "morality" was all that mattered.
The morality you seem to praise was used as a weapon against many people. Get a divorce, something the Bible itself permits, and you were out: defamed, gossiped about. Gay people, and anyone else who was sexually different, were warred against by their communities and their own families.
Freedom of speech was forbidden; talk of sex was practically illegal. Right-wing politics were all but required.
Thought experiment: if you had to choose between that and Trumpism, which would you pick?