So What's Our Stance?
Now that you have had a chance to read and learn more about the topics and issues at hand, we will discuss what we think about them here at
Be Smart. Be Aware. Be Human.
Our Thoughts On The Racial And Gender Bias Issue:
When first beginning the research into this topic, I was shocked to see that this was an issue at all. I first heard about it while watching Sasha Luccioni's TED Talk. She gives a striking example of a woman of color, eight months pregnant, who was detained for carjacking because an AI system wrongfully identified her. Pointing out this case was a powerful way to show how AI can be biased, but that's not where Luccioni stops. She furthers her point with more examples, describing how AI-driven forensic profiling can spit stereotypes back at you when you try to describe a person accused of a specific crime. To say this is disgusting is an understatement. AI should not play a part in criminal profiling, especially when it has been shown to carry racial and gender biases.
Our Thoughts On AI & Human Relationships:
AI should never play a role in human relationships, because it risks replacing the emotional depth, empathy, and accountability that only real people can offer. An AI companion might feel comforting, but it's ultimately just programming; there is no genuine care, understanding, or shared experience. Relying on chatbots for intimacy or emotional support can make it harder to build actual connections, communicate honestly, or handle conflict in everyday life. It's a genuine worry that as AI becomes more realistic and personalized, people, especially those who feel lonely or vulnerable, might choose artificial companionship over relationships that require real effort and vulnerability. That doesn't strengthen society; it isolates people. Human love, friendship, and support shouldn't be replaced by a machine.
Our Thoughts On Using AI For Therapy:
To put it plainly and simply, AI has no place in therapy. People can argue that it's more accessible, but what you're getting is bad advice delivered while you are in a vulnerable state. That's not help, that's enabling, and it's extremely dangerous. People go to school for years to become psychologists and therapists, and there is a reason for that: so that, from one human being to another, we can understand and help each other through the hardships of life. AI is not human and is not capable of understanding human emotions. Therefore it cannot help you; it can only tell you what it thinks you want to hear. AI is designed to please its users, not save their lives.