America in the 20th century suffered many violent incidents that can be traced back to Christianity. Some were organized, others less so, but all were the result of specifically violent or dangerous doctrines promoted in Christian churches. Often the hatred and violence center on particular issues, such as homosexuality or abortion.
Is Christianity only a religion of Peace and Love? I do not think anyone can honestly and objectively examine these situations in American society and answer "yes" to that question. Christianity can encourage Peace and Love - but it certainly need not, and it has quite often done just the opposite.
Although the people responsible for this violence might have found other ways to express their hatred without Christianity, it cannot be ignored that Christianity offers a convenient divine mandate for hatred and violent acts against a wide range of people in our diverse country. Christians will have to seriously rethink and revamp their faith if they are to join in building a prosperous, pluralistic America of the future.