Will you tell the truth even if it will hurt and contribute nothing positive for anyone involved? Or will you lie and keep everything the way it is? tl;dr Is the truth always the most important thing?
If the truth doesn't have any positive effect at all, then what use is it, except for you to be able to say that you told the truth?
Anyways, I don't know if it's the most important thing, but I believe it always has to be handled with at least some tact. Some people who tell harsh truths are really just assholes who want to offend and emotionally devastate other people, and if that's the intention, is telling the truth really commendable?

That said, I do believe that most truths, if told in the right way (I don't mean sugarcoated), can benefit people in one way or another, while other truths are better left expressed through actions.