Does it bother you that life lies to you all the time without you knowing it?
Governments, magazines, photos, music, movies, family, friends, even yourself!
They may be small lies, like the editing in a photo to make a person look better, or
your parents telling small lies to make things not seem so horrible, or to make them look even better than they are. (Vegetables, trips, shots, etc.)
Then there are the big lies: the government withholding info to "protect us" and making us hate things that are good for us and love things that kill us, and the media saying we need to follow these trends, buy these things, and look this way to be truly "happy."
We are a race that treats every other species like a lower "pet" or "insect," and we
don't even know what else is out there (space), let alone down here (Earth: caves, mountains, oceans, forests, jungles, or even ourselves).
I'm just wondering if anyone else is done with the lies and wants a world that accepts us all for who we are? Also to help those who need help and hear what the world has to offer... without judging first.