Just a general question for anyone in any country with a government. Taking all the events in our history into consideration, along with what we know and are still learning in the present about our past, do you believe that the government always tells the truth? Why or why not?

My government is that of the USA, and my own answer is 'no'. They've grossly lied about major events in the past where lives were lost, so there's no reason to believe they're not still lying today, or that they won't continue to whenever it suits their agendas.

Add: Granted, there are some things that don't need to be known by the public, that may protect national interests, and that shouldn't be disclosed. Outside of those conditions, do you believe they always tell us the truth?