
[DISCUSSION] In TNG's "The Most Toys" (s3e22), did Data lie about firing the disruptor?

Episode premise:

Kivas Fajo is determined to add the unique Data to his prized collection of one-of-a-kind artefacts and, staging Data’s apparent death, he imprisons him aboard his ship.

We know that Data is later logically coerced into lying in “Clues” to protect the crew, but this appears to be a decision all his own. Or did he in fact not fire the weapon at all?

ummthatguy OP

Reminded me of one of the best exchanges between Data and Worf from Gambit Part 2:

https://lemmy.world/pictrs/image/ffcce008-906f-4e17-9153-2927bcab2cbe.jpeg

Crackhappy

And Worf has no idea …

Itdidnttrickledown

Let’s face it, Data was gonna cap him. The bad dude had killed someone just before.

grue

Data: “Perhaps something occurred during transport, Commander”

Riker: “Like what?”

Data: “Like I tried to shoot the motherfucker but you beamed me away too quick.”

RizzRustbolt

O’Brien: “Bitches be crazy…”

samus12345

“Deactivating it” meant he stored it in another pattern buffer for future use. Could always add it to a troublesome person’s transporter beam…

CodexArcanum

The idea that Data completely lacks emotion always rang hollow to me. People don’t really understand what emotions are or what it means to feel them. I think we lie to ourselves quite a lot that our decisions are “purely rational,” even though everything in our environments influences those decisions.

Who hasn’t made a bad call because they were tired, hungry, overheated, angry, or otherwise affected by emotions? Is hunger an emotion? When Data decides that he will practice music today, is that part of an elaborate schedule he has planned years in advance, or is that what he “felt like” doing that day? Perhaps a long string of logic could explain why today is a good day to violin, but how is that different from the rationale I could put together for why I made a decision?

So what I’m saying is: I blame the writers! I think by season 3 they’d explored a little of the possibilities with Data about what humanity is and what it means to work with an android, but I don’t know that they ever really got a handle on what it would look like for a being of pure reason to emergently develop emotions.

Look at ChatGPT and how readily it convinces people that there is a thinking being in there. When an LLM says “I’m happy to see you today, what can I do for you?” do we take that as a canned response with no real feeling behind it, or do we assume that because it can say it is happy, it must be feeling happy?

Do we get much perspective on Data’s interiority? Perhaps he experiences a world of emotions we can’t even comprehend but has no understanding of how to express them? His artwork is called out at various times as soulless and copycat. But Data also has a cat, and a daughter, and many friends. He tells bad jokes. It seems like there is some kind of feeling going on in there, even if it comes out in his actions and not in his art.

pjwestin (edited)

This is actually why I kinda liked Pulaski. When she first meets him, she calls bullshit on the idea that he has no emotions. She mispronounces his name, and when Data corrects her, she immediately homes in on that: why should you have a preference if you don’t feel anything about it? But people keep telling her that he doesn’t have emotions, so she’s basically like, “Alright, fine, I’ll treat him like a calculator.” And honestly, why wouldn’t you? If he doesn’t have emotions, then why bother with pleasantries? It’s not like he’s going to get offended. She eventually does come around to him, but she does that by coming to the conclusion that he does care, given his actions during that episode with the children and the aging disease (I don’t remember the name and I’m too lazy to look it up).

It was always very clear to me that Data had emotions. How could he not? He has desires, wants, preferences…you can’t have those things without feeling something. It just seems like they’re very distant, numb feelings, rather than strong sensations. And it kinda makes sense to build him that way; Lore was created with much more advanced emotions, and he’s a little psychotic. It makes more sense to have his feelings be slightly out of reach and let them grow with his positronic brain, so he can learn to handle them over time.

I never liked the “emotion chip” solution to Data’s feelings. It seems like they never explored his emotional development because they didn’t want to make any status quo changes on a mostly episode-of-the-week show. Then they created an emotion MacGuffin they never intended to use and said, “fuck it, let’s use it for the movie!” But in the end, I believe we were always meant to think the same thing about Data as we were about Spock: “I know this guy says he doesn’t have emotions, but I think he’s full of shit.”

1stTime4MeInMCU

Why did Fajo believe Data couldn’t kill? We see Data blasting baddies all the time.

Cagi

Anyone who thinks he didn’t fire with the intention to kill Fajo needs to go back to English class and learn how to read basic literature. It’s like the end of the Sopranos. People’s wishes for happy endings and perfect Hollywood stories blind them to the work the writers put in to tell you (rather obviously; there isn’t much room for debate among people who know how to interpret stories) that yes, Data can kill an unarmed man in the right circumstances, or yes, Tony Soprano’s brains are splattered all over his family. It’s not a happier story, but it’s a better one with actual meaning and a more lasting impact.

ummthatguy OP

I believe that Data has it in him to make that decision, I’m mostly calling out the ambiguity of the scene as it played out. And yeah, Tony met a gruesome but earned end.

setsneedtofeed

“It’s like the end of the Sopranos.”

They ran out of film?

BradleyUffner

He didn’t lie; he didn’t answer the question.

Windex007

Riker didn’t actually ask a question, he just made a statement.

themeatbridge

Did he lie? Or did he give a vague statement that is necessarily true?

teft

Data is an Aes Sedai confirmed.

Draegur

WoT references! A rare treasure from a long-lost age (which may hopefully come again…)

teft

The only community I truly miss from Reddit is wetlander humor.

HuntressHimbo

There is a wetlander humor community, but it is very low activity.

teft

Sorry, I should have said an active wetlander humor community. I’m actually subscribed to that community.

HuntressHimbo

The question is which Ajah he would end up in. The Whites, Grays, and Browns would all want him, but he might be a Blue at heart.

Bishma

O’Brien deactivated the weapon, so something did happen during transport.

pizza_the_hutt

Contrary to popular belief, there is nothing in Data’s programming that would prevent him from doing “bad” things like lying or killing. He has free will just as much as any other Starfleet officer.

Data is much more human than you might guess at first. He is more akin to a human on the autism spectrum than a robot with hard-coded programming.

ummthatguy OP

Absolutely. Rewatching the series in full as an adult made it more apparent that Data was always closer to his goal than he could comprehend. He just had trouble adjusting to social “norms” more than others did.

CptEnder

Yup, exactly. He just lacked emotional subroutines (at first) and the hardware to process them. But he doesn’t need emotions to kill. He is in fact capable of using lethal force (First Contact); he just has an ethical subroutine that prevents killing (Descent, Parts I and II) unless it is in defense of others, himself, or the Federation, which would fall under his logical subroutines.

Similar in a way to Chief Engineer Hemmer, who will not use violence (Memento Mori) unless in an act of preserving life. The means to defend is part of the training of a Starfleet officer.

Bishma

We know he’s figured out bloodlust by Generations.

https://www.reactiongifs.com/r/dstfp.gif

_stranger_

Data’s the science officer. He could probably build a phaser from scrap blindfolded. Him saying maybe “something” happened during transport is clearly a deflection. I bet he thought about this moment when he discovered Lore and all Lore had done.
