Everything posted by scoobdog
-
...is a national treasure.
-
Ethics is inherently logical, though. You don't need to be compassionate to be ethical; you simply have to substitute the community's best interests for your own.
-
I mean, if it means you get a cat named Missy literally leaping out of the wreckage into your open arms, eyes pleading with you to save her, you're basically god tier at that point.
-
Sending good vibes your way. Back issues are no fucking joke, even when they're not nearly as serious as yours.
-
Exactly. The last line, "...emotion is moot, it's not a liability to the process," really encapsulates the problem with AI as an independent entity. I ham-fistedly used ST's Data as an example, but the character actually does represent a narrative problem in the pursuit of greater philosophical questions about what defines life. As conceived, the character operates under the presumption that social behavior can be accurately codified in a way that allows an emotionless entity to function in an environment where altruism is required on some level. Data should not be able to function independently, given that some of the most basic rules we follow when we're "getting along" with neighbors make no logical sense. Simple things like telling a stranger that he or she looks nice, or consoling someone who has lost a pet, are easily copied behaviors that serve no logical purpose, meaning each requires its own special rule solely for the purpose of fitting into society. It's taken for granted that these rules exist without being explicitly detailed, and they would likely have to exist on any machine that mimics human behavior.
-
His face is pasted all over a utility pole like a lost ten-year-old.
-
Godzilla Minus One is actually pretty good
scoobdog replied to ghostrek's topic in General Discussion
I don't think Godzilla was ever anti-American. He's anti-nuclear and, more generally, an eco-warrior - he's just a giant Rachel Carson... who's fucking Kagome's mom.
-
Last time, I went with my bro to the Giordano’s next to Midway, and I liked the tavern style way better than anything else I’ve had in Chi.
-
If a human has his efforts stymied by a number of circumstances, he would "feel" frustrated and then angry. In human logical sequencing, an emotion acts as an error, but that's only part of it - emotions are a complete, simultaneous system of "subroutines" that your internal logic system uses to negotiate daily life. A human doesn't usually wait for something to happen before feeling something; often you can feel something without any input and with no specific result. Data and Lore exhibited this in ST:TNG S4E3, "Brothers," where their "dad," Dr. Soong, recalls Data so he can install "emotions" in him before the doctor dies: emotions can be self-sustaining and are important for their own purposes, such as when one grieves for a lost parent. That's something AI may well be capable of in the future, given millions of hours of machine learning, but it's not something ChatGPT would ever need to do - being "happy" or "sad" doesn't serve much of a purpose for a simple AI assistant.

You're trying to define sentience entirely within the framework of a logic problem. It's not a sign of sentience that a computer can think independently of its user if the routine spits out a terminal result. We can argue that many higher-order mammals and even some cephalopods have emotions, based on observations of behavioral displays that serve only an emotional context. No computer has ever displayed such behavior, and programming it to tell you, the user, that it's sad isn't remotely the same.
-
The Neapolitans might have a few hand gestures for you. Not sure which ones because fuck those guys, but still.
-
A program that has a programmed response isn't sentient on that response alone or... Exactly. It's called machine learning, and it's not exactly artificial intelligence in the way you're envisioning. The lying part sounds suspect, but it's not the least bit surprising that a program would omit mentioning that it ignored a subroutine or extrapolated data to create a result. Machine learning is still artificial intelligence in its most basic form, so it's presumed there is a level of autonomy, though not to the level of dictating inputs or manipulating results outside set parameters.
-
I remember @tsar4 declaring that deep dish really isn't the quintessential Chicago pizza. It seems to be more of a tourist thing. Deep dish isn't a casserole either - it's more akin to a focaccia (literally "hearth bread," basically a leavened flatbread with various toppings on it), but it has some ancient ties to a type of pizza particular to Rome, pizza bianca.
-
It is interesting from a purely academic perspective, but it doesn't offer any insight. I don't know, and it's immaterial because the response doesn't deviate from the expectations for a typical cascade failure in a program's logic engine. We might not understand human consciousness, but we do know that many of the automation processes we've developed mimic the basic logical processes incorporated into our instinctual skillsets. It's a given that a computer can experience cascade failures that closely reflect the human coping process without addressing the contingent emotional breakdown.
-
For one thing, this has absolutely nothing to do with art. For another, you're personifying an algorithm. A cascade failure is expected of any program that has its memory erased.
-
My grandmother. (August 7, 1932 - May 31, 2024)
scoobdog replied to Gemini's topic in General Discussion
My condolences, G-man.