User blog comment:Banned In CP/A Story Everyone Here Should Read/@comment-26399604-20180522151838

Thanks for sharing. This was a nice read. I can see how it could make you question your perception of the world, but really I think it falls back more on social norms. People have gotten to a point where, once something is accepted in society, they just adhere to it without a second thought, so they run through the "algorithm" response.

Look at "small talk". It's typically littered with moments the story alludes to -- predictable conversational pieces about the weather, sports, or whatever. When you think about it, what's its real purpose? To lighten the mood? To eliminate the awkward silence? I see it as a means to weigh the persona of the individual you're with -- like a scan to verify that someone is still in compliance with a healthy mind, regardless of whether you know them or not. If a person's response was to suddenly lash out at you or ignore you, then you know that person, depending on your relationship with them, should either be avoided or investigated further.

The story's main stance is that our thoughts and words might be a by-product of programming just the same as any computer's, which is a scary thought. We're talking about the complete dismissal of free thinking as we believe it to be, and as a fellow programmer I can understand Elena's (the protagonist's) mindset.

However, this is why I never thought artificial intelligence could ever truly be sentient. AI code is limited by its architecture, which in turn is limited by our own knowledge, because we're the ones who programmed it. Computers aren't smart and won't one day have an epiphany about life (sorry Icy :P); they're just able to compute things at insane speeds and, unless "told" to stop, would do so until they fried their own systems. It's not that we resemble computers, but that they are an imitation of us. They run algorithms because that's how we operate, and it's easy to twist that into a type of Penrose Staircase logic.

People do run through the motions of things and recite the same reactions and phrases. However, we are able to adapt and learn (we illustrate that in computers through loops and conditional statements -- if mark waves then wave back, else keep walking; loop when necessary), but that's because we've been instructed through our parental figures, society's teachings, and in most cases, personal experiences.
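Just to show what I mean, here's a minimal sketch of that loop-and-conditional framing (the names like `mark_waves` are made up purely for illustration):

```python
# Hypothetical sketch: each pass through the loop is one encounter,
# and the conditional is the learned "algorithm" response.

def respond(mark_waves: bool) -> str:
    # Learned social rule: produce the expected reaction.
    if mark_waves:
        return "wave back"
    return "keep walking"

# "Loop when necessary": run the same check for each encounter.
encounters = [True, False, True]
reactions = [respond(waved) for waved in encounters]
print(reactions)  # ['wave back', 'keep walking', 'wave back']
```

The point being that the rule itself never changes inside the loop -- it just fires every time the condition comes up, which is the "running through the motions" part.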

A computer will only be able to adapt based on the list of methods we give it. It'll be able to learn that it shouldn't do something through trial and error. Since that's how people learn, we constructed machines to follow that same road map, because we don't know how to do otherwise. A machine will never think to look for other possible outcomes to get ahead of a recurrence like a person can.
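To put that in code terms, here's a toy trial-and-error learner (everything here -- the action names, the success check -- is invented for the example): it adapts by remembering which choices failed, but it can only ever pick from the fixed list we handed it.

```python
import random

# The full list of "methods" we provided. The machine cannot
# invent a fourth option on its own; it only chooses from these.
ACTIONS = ["method_a", "method_b", "method_c"]

def learn_by_trial(is_success, attempts=10):
    """Trial and error: try actions, remember failures, avoid repeats."""
    failed = set()
    for _ in range(attempts):
        options = [a for a in ACTIONS if a not in failed]
        if not options:
            return None  # out of options; it can't look beyond its list
        choice = random.choice(options)
        if is_success(choice):
            return choice
        failed.add(choice)  # learned not to do that again
    return None

# In this toy environment only "method_c" works, so it always
# gets found within three trials.
print(learn_by_trial(lambda a: a == "method_c"))  # method_c
```

That's "adapting", but only inside the boundaries we drew -- which is exactly why I don't see it as the same thing a person does when they step outside the known options.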

Aside from the fact that Elena started to cut herself, you could argue that someone who decides to go against the "norm" (while still within the law) is seen as an abnormality and is shunned, and in some cases committed, as she was. Computers can be the same way with malware, viruses, etc. Yet I think that seems more in line with the "sheeple" argument -- people blindly adhering to things without thinking for themselves while persecuting those who don't fall in line -- than with an algorithm that might say we're machines running a list of commands ourselves.

Overall, people are free to believe what they want. Am I saying true sentience is impossible for a machine? No. I just think it'll only happen when our own intelligence reaches a plane above where we are now. We're too focused on ridiculous things that divide us rather than large feats such as this. Regardless, I think this is a fun thought to ponder. Again, thanks for sharing this!