
Pope Francis has died

(www.reuters.com)
916 points | phillipharris
jimmcslim No.43750835
The Vatican published an interesting document on AI [1], which attributes a number of quotes to Pope Francis:

* As Pope Francis noted, the machine “makes a technical choice among several possibilities based either on well-defined criteria or on statistical inferences. Human beings, however, not only choose, but in their hearts are capable of deciding.”

* In light of this, the use of AI, as Pope Francis said, must be “accompanied by an ethic inspired by a vision of the common good, an ethic of freedom, responsibility, and fraternity, capable of fostering the full development of people in relation to others and to the whole of creation.”

* As Pope Francis observes, “in this age of artificial intelligence, we cannot forget that poetry and love are necessary to save our humanity.”

* As Pope Francis observes, “the very use of the word ‘intelligence’” in connection with AI “can prove misleading”.

[1] https://www.vatican.va/roman_curia/congregations/cfaith/docu...

replies(4): >>43751790 #>>43752519 #>>43753454 #>>43753904 #
timeon No.43751790
> * As Pope Francis observes, “the very use of the word ‘intelligence’” in connection with AI “can prove misleading”

Yes, LLMs are more about knowledge than intelligence. AK rather than AI.

replies(5): >>43751865 #>>43752189 #>>43753296 #>>43754713 #>>43755357 #
lo_zamoyski No.43755357
Not even knowledge. Knowledge requires intentionality and belief as well as justification, and LLMs have none of these.

We ought to avoid anthropomorphizing LLMs; it is muddle-headed.

replies(1): >>43756776 #
bmicraft No.43756776
I'd add that knowledge usually implies a claim to truth, while LLMs can only offer information with varying likelihood of being true.
replies(1): >>43761889 #
lo_zamoyski No.43761889
This is what intentionality is about. No intentionality, no truth.

An LLM doesn't deal with propositions, and it is propositions that are the subjects of truth claims. LLMs produce strings of characters that, when interpreted by a human reader, can look like propositions and give rise to propositions in the human mind; it is those propositions that have intentionality and truth value. But what an LLM produces is not a direct expression of propositional content, only a recombination of a large number of expressions of propositional content authored by many human authors.

People are projecting subjective convention onto the objective. The objective truth about LLMs is far poorer in substance than the conventional readings we give to what LLMs generate. There is a good deal of superstition and magical thinking that surrounds LLMs.
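The distinction being drawn here — ranking strings by probability rather than asserting propositions — can be sketched as a toy. This is not a real language model; the two-token contexts and all probabilities below are invented for illustration. The point is only that nothing in the generation loop ever consults a truth predicate.

```python
# Toy illustration (not a real LLM): a next-token distribution over strings.
# All contexts and probabilities here are made up for the sake of the example.
next_token_probs = {
    ("the", "sky"): {"is": 0.9, "was": 0.1},
    ("sky", "is"): {"blue": 0.7, "green": 0.2, "falling": 0.1},
}

def most_likely_next(context):
    """Pick the highest-probability continuation for a two-token context.
    Note there is no truth check anywhere: the model ranks strings,
    it does not assert propositions."""
    dist = next_token_probs[context]
    return max(dist, key=dist.get)

tokens = ["the", "sky"]
for _ in range(2):
    tokens.append(most_likely_next(tuple(tokens[-2:])))

# Produces a plausible-looking string ("the sky is blue") that was
# selected by probability, never verified against the world.
print(" ".join(tokens))
```

Scaled up by many orders of magnitude, this is still selection over strings: any truth value attaches only when a human reads the output as a proposition.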