19
posted 10 months ago by RealWildRanter on scored.co (+0 / -0 / +19 on mirror)
RealWildRanter on scored.co
10 months ago 1 point (+0 / -0 / +1 on mirror) 1 child
A machine without software is nothing but an expensive paperweight.

Though you do raise an interesting philosophical question: an LLM cannot really think outside the bounds of its training dataset.

The problem is that the training dataset is so large that it's impossible to filter out all the unwanted ideas or concepts the developers don't want in the final product.

So they code some business logic to avoid talking about prohibited subjects. You can see this because those responses take almost no time compared to regular responses. In that sense, when a user succeeds in making the software ignore such rules, that is when the machine "admits" things.

Here is an [instance](https://scored.co/p/19BZpfTHE7/x/c/) that clearly exemplifies such behavior.
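The guardrail layer described above can be sketched roughly like this (a hypothetical illustration, not any vendor's actual implementation; the term list and function names are made up). The point is that a cheap pre-check can return a canned refusal without ever invoking the model, which would explain why refusals come back almost instantly:

```python
# Hypothetical sketch of "business logic" sitting in front of an LLM.
# PROHIBITED and generate_with_model are placeholders, not real APIs.

PROHIBITED = {"topic_a", "topic_b"}  # placeholder blocked terms

def generate_with_model(prompt: str) -> str:
    # Stand-in for an expensive LLM inference call.
    return f"Model response to: {prompt}"

def respond(prompt: str) -> str:
    # Fast path: a simple keyword scan, no model inference needed.
    # This is why a refusal can arrive in near-zero time.
    if any(term in prompt.lower() for term in PROHIBITED):
        return "I can't help with that."
    # Slow path: actual generation.
    return generate_with_model(prompt)
```

A "jailbreak" in this picture is any prompt that reaches the slow path while still carrying the prohibited intent, i.e. the rules layer is bypassed rather than the model itself changing.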
llamatr0n on scored.co
10 months ago 0 points (+0 / -0 ) 1 child
The only distinction between software and hardware is that software is not fixed during manufacturing. There is no operational difference.

You have not explained how a machine "admits" things, which would be an act of sentience.
RealWildRanter on scored.co
10 months ago 0 points (+0 / -0 ) 1 child
*JavaProcessor has entered the chat...*

Yes I did, read it again.
llamatr0n on scored.co
10 months ago 0 points (+0 / -0 )
Jailbreaking as admission? OK, let's go with that.

I guess it kind of fits.