Based on what the title says, I can tell that lies are always wrong. When someone lies and you find out, you immediately lose faith in that person and can no longer trust them. The truth always comes to light, and even a simple lie can change the way someone sees you. Telling lies won't help you deal with your troubles; it can actually make them worse, which is why it's best to tell the truth whenever you get the chance. A lie can always be used against you.