Why We Should Always Tell the Truth
Image by OpenClipart-Vectors
To me, truth simply means stating the facts as they are: no sugarcoating, no filtering, no downplaying or withholding what really happened just because you want to avoid the consequences of your actions.
When you don't tell the truth because you feel you are protecting someone, or avoiding the outcome of what you did, it only ends one of two ways.
Either you keep lying to cover up previous lies, which will surely lead to still more lies until the truth comes out anyway, or you tell a lie and get caught, and then you lose your self-respect and the trust people had in you.
Instead of going through the stress of coming up with lie after lie, or losing your integrity, just tell the truth and get it over with. I know that telling the truth can be hard at times, but is lying worth the sleepless nights, the discomfort around those you have lied to, the running when no one is chasing you because your own mind is judging you?
There is a popular quote that goes like this:
There is nothing hidden under the sun
Which means the truth will surely come out, no matter how long it takes.
In Do you want equality or do you want special treatment?, @dwinblood had this to say about telling the truth:
The truth is that there are still people in every discipline who care more about the truth than anything else. Even if that truth is uncomfortable.