Subject: Re: OT: GOOGL and AI
GOOGL shares rose 3.8% after the company announced that it would be adding chat to search. People are going to love chatting with computers.
I wrote a bit about that on the Google board at Shrewd'm:
ChatGPT will get heightened attention for quite a while, but after several years, when Google can recognise the input domain and provide similar answers, they'll remain on top *and* the value of search for the public (and thus Google's scope for advertising dollars) will have increased rather than decreased, as investors might be fearing today.
https://www.shrewdm.com/MB?pid...
I did not think they would add chat so quickly, but good on them. AI is attracting huge attention right now, so it makes sense to hurry this along.
I just hope that AI chat programs can filter out all of the nonsense and propaganda in the world.
It is a noble wish, but at least this current incarnation of AI chat might not be effective at avoiding propaganda. It repeats what is generally the consensus, which is already propagandised. Avoiding propaganda generally requires very narrow and focussed research, and an effort hurdle to be surpassed. On the other hand, I often regard immigrant taxi drivers as having a more realistic view of the world than what I read in the foreign relations section of The Australian or CNN, so in this sense AI software reporting news might actually be slightly more realistic than what is most commonly viewed in the press.
Regarding AI, what would you say is AI's most valuable ability? While things like chat, image recognition and autonomous driving are nice, I would venture that its most valuable ability is analyzing data. Humans can analyze data, too, but computers are much faster and less prone to bias. Thoughts?
The present AI software does not do any analysis, nor does it have any understanding whatsoever of what it is writing. It is purely syntactic. It is astronomically sophisticated as an engineering system at merging text together to produce an illusion of embodying meaning, but the meaning is only what we produce ourselves after reading the text.
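As a loose illustration only (the real systems are transformer networks trained on vastly more data, so take this as a toy sketch rather than a description of how they actually work), a tiny word-level Markov chain in Python shows how plausible-looking text can be produced from surface statistics alone, with no representation of meaning anywhere in the program:

    import random

    # A toy corpus; real systems ingest unimaginably more text.
    corpus = (
        "the value of the search for the public will have increased "
        "rather than decreased and google will remain on top and "
        "the public will love chatting with computers about the search"
    ).split()

    # Bigram table: for each word, the words observed to follow it.
    # Pure surface statistics; nothing here encodes what any word means.
    follows = {}
    for prev, nxt in zip(corpus, corpus[1:]):
        follows.setdefault(prev, []).append(nxt)

    def generate(start, length=12):
        # Emit text by repeatedly sampling a statistically plausible next word.
        word, out = start, [start]
        for _ in range(length - 1):
            candidates = follows.get(word)
            if not candidates:
                break
            word = random.choice(candidates)
            out.append(word)
        return " ".join(out)

    print(generate("the"))

Any meaning a reader finds in the output is supplied by the reader; the program only knows which tokens tend to follow which.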
Grammatical language (about the only thing unique to humans; all animals communicate, but without grammar) very broadly works like this:
1. Person A holds a vast and incredibly rich model of a thought, for which they will near-instantly thereafter produce a sentence. Noam Chomsky views this as represented more or less as a hierarchical tree structure, for which there is a lot of evidence, but most importantly this internal representation of the thought is extremely nuanced and filled with information. For example, if I want to go down the street to get more milk, I'll have a very rich understanding of what that means - where I'll walk, what I'll wear, which shop I'll use, how to select the milk, what I'll do with the milk when I return, what the purpose of it is, and far more.
2. During speech (or any other linear-time representation, such as writing or sign language; they all share the same neural networks), the brain extremely efficiently converts the above ludicrously rich representation into an extremely sparse version, which is the sentence that we write or speak. We just say: "I'll go down and get more milk".
3. Person B reads (or hears) the extremely crude representation, and then uses another cognitive system (the reverse of 1) to extremely efficiently re-form the same rich representation that person A originally had. We do it so quickly that it is obvious that we have dedicated cognitive systems for both of these conversion processes, 1 and 3. In summary, language is ridiculously sparse and crude, but we all interpret it so efficiently that we often forget (i.e., we are not aware) how lacking in information the sentences themselves are - the toy sketch below tries to make that concrete.
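Here is a crude sketch of that compression in Python. The field names are invented purely for illustration; no one is claiming the brain stores anything like a dictionary:

    # A crude stand-in for Person A's rich internal model of the errand (step 1).
    rich_thought = {
        "goal": "replenish the milk supply",
        "route": "down the street, past the park",
        "shop": "the corner grocer",
        "selection": "full cream, check the use-by date",
        "on_return": "put it in the fridge, make tea",
        # ...and far more detail than one sentence could ever carry.
    }

    # Step 2: the utterance discards nearly all of that detail.
    sentence = "I'll go down and get more milk"

    # Step 3 is not decoding, because the detail is simply not in the words.
    # Person B reconstructs a rich model from their own knowledge of streets,
    # shops and milk - which is why the sentence feels complete even though
    # it carries almost none of the information itself.

Nothing in the sentence could be "unzipped" back into the rich structure; the receiver rebuilds it from their own knowledge.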
The AI systems are basically dealing with step 2 only, and are not involved with the ludicrously richer semantic/syntactic conversion systems of 1 or 3. They don't need to be, as it is we ourselves who then use our conversion systems (particularly 3) to interpret what the AI system produces as having meaning, even though the AI system never dealt with 1 or 3 and has no understanding of anything it is producing.
- Manlobbi