This seems to be a popular label among the really serious Christians out there. Is anyone else struck by the irony of this phrase?
Religion neither is good nor feels good... But let's assume that feel-good religion did in fact exist and was indeed very wrong and dangerous. Does that mean we should be preaching and living feel-bad religion?