I feel like I really want to say some deep and meaningful shit… but I don’t know if I really have the words (hmm yeah not a great claim from someone who occasionally identifies as a writer!).
But seriously – WTF is going on with the world these days?? We are meant to be moving into the future, but instead it feels like we're slowly slipping back into the dark ages – yes, America, I AM looking at you!
Who the hell has the right to tell ANYONE what they can do with their own bodies? How dare you claim to care about life when the only life you actually care about is before it’s even begun? How is the life of basically a few cells more important than the life of an actual grown woman, or, sadly on too many occasions, a young girl?
I know that your religion might say something is “wrong” – the book you like to read may suggest that being gay is somehow evil… Well, you know what? I’ve read a good few books that describe it far more honestly, as a real representation of what it actually is – an expression of love.
Yes, I know I’m jumping about between different topics but, let’s face it… denying human rights to one group of people is only ever the first step… who’s next?
And the thing I don’t understand, hand on heart really DO NOT understand… how does what someone else does impact your life AT ALL? If two men, or two women, are in love – what does that take away from your life? If someone chooses to be called they/them – how does that hurt you? If someone is so unhappy in their own body/gender that they need (need, not choose) to change it – how does that impact your day?
FFS people – just live your own life and let other people live theirs… how much happier would we all be??
So, sorry for the ramble to anyone who actually read this far…