Language, Sounds and Intelligent Design

Chu, sorry for asking this before actually digging in.
Chomsky has felt a little off to me for a while, but lacking quite a lot of background, it's difficult to argue why exactly, apart from some of his very recent interviews. I am very interested in language, specifically in its interrelationship with other scientific disciplines, and particularly philosophy and metaphysical considerations of the description of things interacting with their state. Aside from some base-level quantum waffle, Chris Langan comes to mind immediately as the only person I know of to tie these two things together in a seemingly wholesome way thus far, but I digress on that particular avenue.

I have written a few posts in the past about information theory and everything computation-related starting to weave into this topic, wondering what it means, where it is heading, and whether we are heading in the right direction; thinking of Turing, Shannon, and then everyone who followed 'recently', and of what I assume is a still ongoing debate about the differences and overlaps between natural languages and formal/constructed ones.

Who should I be reading to gain some insights from minds that did not get enticed by materialist reductionism in these fields, if that would indeed be the main criticism, and just in general too?
 
Chomsky has felt a little off to me for a while, but lacking quite a lot of background, it's difficult to argue why exactly, apart from some of his very recent interviews.

Hmm, well, I can relate to that. For years I felt something was off, but he speaks with such authority, and in such convoluted statements, that I assumed I was missing something big. Now, the more I read, the more I associate him with important characters like Freud.
Chomsky: "we have an innate capacity for language" (Freud: "we have a subconscious").
That's about it: a big, important truth, but no substance. For the rest, he is such a materialist that his theory avoids anything that doesn't have to do with computers or an alleged genetic endowment that nobody has seen yet, in spite of 20 years of searching.


I am very interested in language, specifically in its interrelationship with other scientific disciplines, and particularly philosophy and metaphysical considerations of the description of things interacting with their state. Aside from some base-level quantum waffle, Chris Langan comes to mind immediately as the only person I know of to tie these two things together in a seemingly wholesome way thus far, but I digress on that particular avenue.

I have written a few posts in the past about information theory and everything computation-related starting to weave into this topic, wondering what it means, where it is heading, and whether we are heading in the right direction; thinking of Turing, Shannon, and then everyone who followed 'recently', and of what I assume is a still ongoing debate about the differences and overlaps between natural languages and formal/constructed ones.

Could you share the links, please? I'd be interested to read them.

Who should I be reading to gain some insights from minds that did not get enticed by materialist reductionism in these fields, if that would indeed be the main criticism, and just in general too?

If you want to get a pretty good overview of Chomsky's bias, you could read The Kingdom of Speech, by Tom Wolfe. It's not scholarly, and his conclusion is pretty useless IMO, but the review of the competing theories is very interesting, and I think you'll see where the study of language became super materialistic.

For a perspective that, although not stated explicitly, makes "computational linguistics" look like a science in diapers, I would recommend "Metaphors We Live By", by Lakoff and Johnson. Also not super scholarly, just an introduction. Then, if THAT interests you, "Cognitive Grammar" by Ronald Langacker (more technical but fascinating). These two show, in my opinion, that Shannon's theory was way too simple compared to language. Also Reddy's work on the "conduit metaphor", if you are familiar with it.

That's what comes to mind based on what you say interests you:
I am very interested in language, specifically in its interrelationship with other scientific disciplines, and particularly philosophy and metaphysical considerations of the description of things interacting with their state.

You will have to expand based on your particular interests as you go along, but I think the books I mentioned above are much more worth it than reading all of Chomsky's work, which focuses almost exclusively on syntax, because he considers "mind" and meaning to be outside the realm of science and "proper linguistics".

I hope this helps, though I'm far from being an expert in this area. I'm trying to wrap my head around it before making new videos, but it's tied to so much we think we know about language that it's a bit "intimidating".
 
what I assume is a still ongoing debate about the differences and overlaps between natural languages and formal/constructed ones.

Just wanted to add here that, sadly, I don't see a whole lot of debate. I think the issue itself will become more prevalent because of AI and language models, but in linguistics proper, the comparisons between natural and formal languages only go so far. There is layer upon layer of complexity that doesn't exist in formal languages (nor in AI, for the time being?). The Chomsky camp will explain that as a "biological/physical/genetic endowment not yet found", different but similar types of "modules", etc. Most others don't seem to even bother comparing natural and formal languages, because for them it's like comparing apples and oranges. I'm generalizing, of course. But the main problem I see is that both camps are still stuck in materialism. For the one, we are like computers; for the other, we are just an accident of nature that evolved from monkeys. Anything that doesn't fit gets relegated to philosophy or psychology.
 
I'll have to apologize for a bit of grandstanding in the earlier posts; my knowledge of any subject at hand here is fairly limited. Homo Universalis seems so far out of reach that I've aimed for something rather different: being a summarizer of everything comprehensible, in the short term, that I come across.

Thank you so much for the in-depth and detailed replies, which are honestly slightly more than I bargained for. First of all, there is this thread that ties into things directly in a very important way: we assume that human communication can be purely logical or rational, but that is probably just nonsense: Mystery and Order: the left and right hemispheres. It is not a binary question, and therein lies a huge part of the problem (comment of mine at the very bottom, but nonsensical without the full post).
The post I was referring to primarily is this: Curt Jaimungal: Humor and Free Will -

Being two years older and wiser, in these strange times, I would without a doubt have formulated that particular post differently, but the gist stands, and it also suggests something worth discussing further: in the process of trying to map every possible kind of 'information exchange', for absolute clarity there can be no ambiguity. This is a core premise of several programming languages. Herein lies the problem, however: equating natural and formal languages assumes people to be 'logical machines', even if that means logic by way of 'nearly infallible reactive psychology patterns' acting out. The human sciences have gotten lumped in with computer science for some reason, and, more importantly, a broad assumption now exists that every exchange we make is rooted in the realm of physics too.
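To make that 'no ambiguity' premise a bit more concrete, here is a minimal sketch of my own (a toy illustration, not something from the linked posts): a formal grammar like Python's assigns exactly one parse to an expression, while an ordinary English sentence can carry several structural readings that nothing in the string itself resolves.

```python
# Toy illustration (my own, hypothetical example): formal languages are designed
# to exclude ambiguity, while natural language tolerates it.
import ast

# Formal language: Python's grammar yields exactly one parse tree for this string.
expr = "2 + 3 * 4"
tree = ast.parse(expr, mode="eval")
print(ast.dump(tree.body))  # a single tree; operator precedence settles the structure
print(eval(expr))           # always 14, never 20

# Natural language: the very same string of words supports two structural readings,
# and only context (or a human interpreter) can pick between them.
sentence = "I saw the man with the telescope"
readings = [
    "I used the telescope to see the man",   # 'with the telescope' attaches to 'saw'
    "I saw the man who had the telescope",   # 'with the telescope' attaches to 'the man'
]
print(sentence)
for r in readings:
    print(" -", r)
```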

While money is energy to some extent, the conflation of terms and disciplines runs very, very deep and leaves an enormous web to untangle. I apologize if this was a bit rambly and not entirely to the point, but it's just the start; so much more to come.
 
Thank you so much for the in-depth and detailed replies, which are honestly slightly more than I bargained for. First of all, there is this thread that ties into things directly in a very important way: we assume that human communication can be purely logical or rational, but that is probably just nonsense: Mystery and Order: the left and right hemispheres. It is not a binary question, and therein lies a huge part of the problem (comment of mine at the very bottom, but nonsensical without the full post).
The post I was referring to primarily is this: Curt Jaimungal: Humor and Free Will -

Well, I actually enjoyed reading those two posts, thanks. That was a very good summary with interesting comments regarding McGilchrist's book. And I actually hadn't thought about the points you made about humor. Good job.

I'd repeat my recommendation of the books I mentioned above. The three of them are related to what you wrote! But the one that would give you the most to explore is, I think, Langacker's. I was actually thinking about his diagrams of meaning when reading your texts. Humor may work as well, because it distorts the "usual logic" (or construals) that we form. Mind you, his is also a model, not a Left Hemisphere proven truth, but as you pointed out so well, those aren't much use anyway without the Right Hemisphere and experience.

Basically, the idea is that in linguistics, formal models have also become super prominent, but they don't describe much. His main argument is that EVEN grammar (which for generativists/Chomskyans is just a "module", completely divorced from meaning) is full of what you would describe as Right Hemisphere activity too. To catch every single subtle difference in meaning, we need both hemispheres. And we form layer upon layer of meaning, thanks to experience. This is "revolutionary" also because language is seated in the Left Hemisphere for the majority of people. Yet it wouldn't work with just that. Language may be "analytical", but meaning is much larger than that.

Apologies if this is not very clear. I'm still researching this, and when I can explain it to a 5-year-old (i.e. when I understand it), I'll make videos about it. But in the meantime, I reckon it can help with your own studies. I look forward to reading what you think if you decide to read any of these books.
 
Yes! I will dig into the primary suggestion ASAP at the very least. I've been way too lax with reading and keeping up with the world in general. Whenever I do so, it's also easy to get 'sidetracked' by many things, because I feel many of these topics have a relation that needs to be looked at at some point. A little like how endocrinology used to 'separate' certain organs from the rest of the body in their observed function and action, with it now becoming clear that nothing in a system this complicated operates truly independently.

An 'emergent' (I loathe the traditional evolution connotations here) topic is AI, obviously; ChatGPT4 - I promise I'll get some more diverse sources, but places for meaningful and level-headed discussion have been sparse.
 
That latest link requires a little background; there is mention of symbol manipulation and its relation to interpretation, and the question of to what extent AI systems might be capable of the latter, if at all. Certain schools of thought have argued for mathematics as the 'baseline' of languages, since it is assumed to be unambiguous and therefore a logical foundation of communication. This is part of the appeal of 'modern science', as a promise of an actual 'solution to reality' (a strange phrase on second look). Theories of everything have never been particularly popular before our current age, although they existed. Now many deeply interesting things seem to be diverging and converging at the same time:

https://yewtu.be/watch?v=xHPQ_oSsJgg

I have watched several presentations and interviews with Stephen already, and this one is particularly dense, so I will have to get back to it. The main question I had was 'what is computational irreducibility, and how do you explain it in a philosophical framework?'. Ignore that question for the sake of the current topic, unless you feel inclined otherwise.
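For what it's worth, here is how I currently picture computational irreducibility, as a rough sketch of my own (a toy Rule 30 cellular automaton, not Wolfram's code): the only known way to find out what row n looks like is to compute all the rows before it, one step at a time; no shortcut formula is known.

```python
# Minimal Rule 30 cellular automaton (my own toy sketch). The usual point about
# computational irreducibility: to know the pattern at step n you apparently have
# to run every intermediate step; there is no known closed-form shortcut.

RULE_30 = {  # (left, centre, right) neighbourhood -> next cell value
    (1, 1, 1): 0, (1, 1, 0): 0, (1, 0, 1): 0, (1, 0, 0): 1,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def step(row):
    """Apply Rule 30 to every cell, treating the edges as 0."""
    padded = [0] + row + [0]
    return [RULE_30[(padded[i - 1], padded[i], padded[i + 1])]
            for i in range(1, len(padded) - 1)]

def run(n_steps, width=61):
    row = [0] * width
    row[width // 2] = 1  # start from a single 'on' cell in the middle
    for _ in range(n_steps):
        print("".join("#" if c else "." for c in row))
        row = step(row)

run(20)
```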

The branching/rulial operation of things is an interesting spin on the regular binary view of the 'general modus operandi', but I cannot shake the feeling that people have generally been incapable of integrating their 'hemispheres' (which I see as the physical component of a metaphysical process), and have then gone on to project their lack of understanding of their own nature onto 'accepted notions of how language and information work'.

Of course, if all parameters involved have to be physical, or even currently measurable stuff, this way of looking at things will almost certainly be perennially incomplete.

P.S. For those who want to dig even further into the current 'hype' and developments: Sam Altman: OpenAI CEO on GPT-4, ChatGPT, and the Future of AI | Lex Fridman Podcast #367
A lot of stuff to unpack there; later ;)
 
The Darwinists have problems; there clearly is a missing link in languages.
It's tempting to think that monkeys have hidden linguistic depths to rival those of humans, but as Ouattara says, "This system pales in contrast to the communicative power of grammar." The monkeys' repertoire may be rich, but it's still relatively limited and they don't take full advantage of their vocabulary. They can create new meanings by chaining calls together, but never by inverting their order (e.g. KB rather than BK). Our language is also symbolic. I can tell you about monkeys even though none are currently scampering about my living room, but Ouattara only found that Campbell's monkeys "talk" about things that they actually see.
For context, the whole article is:
Boom-boom-krak-oo – Campbell’s monkeys combine just six ā€˜words’ into rich vocabulary
BY ED YONG
PUBLISHED DECEMBER 7, 2009
Many human languages achieve great diversity by combining basic words into compound ones – German is a classic example of this. We’re not the only species that does this. Campbell’s monkeys have just six basic types of calls but they have combined them into one of the richest and most sophisticated of animal vocabularies.
By chaining calls together in ways that drastically alter their meaning, they can communicate to each other about falling trees, rival groups, harmless animals and potential threats. They can signal the presence of an unspecified threat, a leopard or an eagle, and even how imminent the danger is. It's a front-runner for the most complex example of animal "proto-grammar" so far discovered.

Many studies have shown that the chirps and shrieks of monkeys are rich in information, ever since Dorothy Cheney and Robert Seyfarth’s seminal research on vervet monkeys. They showed that vervets have specific calls for different predators – eagles, leopards and snakes – and they’ll take specific evasive manoeuvres when they hear each alarm.

Campbell’s monkeys have been equally well-studied. Scientists used to think that they made two basic calls – booms and hacks – and that the latter were predator alarms. Others then discovered that the order of the calls matters, so adding a boom before a hack cancels out the predator message. It also turned out that there were five distinct types of hack, including some that were modified with an -oo suffix. So Campbell’s monkeys not only have a wider repertoire of calls than previously thought, but they can also combine them in meaningful ways.

Now, we know that the males make six different types of calls, comically described as boom (B), krak (K), krak-oo (K+), hok (H), hok-oo (H+) and wak-oo (W+). To decipher their meaning, Karim Ouattara spent 20 months in the Ivory Coast’s Tai National Park studying the wild Campbell’s monkeys from six different groups. Each consists of a single adult male together with several females and youngsters. And it’s the males he focused on.

With no danger in sight, males make three call sequences. The first – a pair of booms – is made when the monkey is far away from the group and can't see them. It's a summons that draws the rest of the group towards him. Adding a krak-oo to the end of the boom pair changes its meaning. Rather than "Come here", the signal now means "Watch out for that branch". Whenever the males cried "Boom-boom-krak-oo", other monkeys knew that there were falling trees or branches around (or fighting monkeys overhead that could easily lead to falling vegetation).

Interspersing the booms and krak-oos with some hok-oos changes the meaning yet again. This call means "Prepare for battle", and it's used when rival groups or strange males have shown up. In line with this translation, the hok-oo calls are used far more often towards the edge of the monkeys' territories than they are in the centre. The most important thing about this is that hok-oo is essentially meaningless. The monkeys never say it in isolation – they only use it to change the meaning of another call.

But the most complex calls are reserved for threats. When males know that danger is afoot but don’t have a visual sighting (usually because they’ve heard a suspicious growl or an alarm from other monkeys), they make a few krak-oos.

If they know it’s a crowned eagle that endangers the group, they combine krak-oo and wak-oo calls. And if they can actually see the bird, they add hoks and hok-oos into the mix – these extra components tell other monkeys that the peril is real and very urgent. Leopard alarms were always composed of kraks, and sometimes krak-oos. Here, it’s the proportion of kraks that signals the imminence of danger – the males don’t make any if they’ve just heard leopard noises, but they krak away if they actually see the cat.

The most important part of these results is the fact that calls are ordered in very specific ways. So boom-boom-krak-oo means a falling branch, but boom-krak-oo-boom means nothing. Some sequences act as units that can be chained together into more complicated ones – just as humans use words, clauses and sentences. They can change meaning by adding meaningless calls onto meaningful ones (BBK+ for falling wood but BBK+H+ for neighbours) or by chaining meaningful sequences together (K+K+ means leopard but W+K+ means eagle).

It’s tempting to think that monkeys have hidden linguistic depths to rival those of humans but as Ouattara says, ā€œThis system pales in contrast to the communicative power of grammar.ā€ They monkeys’ repertoire may be rich, but it’s still relatively limited and they don’t take full advantage of their vocabulary. They can create new meanings by chaining calls together, but never by inverting their order (e.g. KB rather than BK). Our language is also symbolic. I can tell you about monkeys even though none are currently scampering about my living room, but Ouattara only found that Campbell’s monkeys ā€œtalkā€ about things that they actually see.

Nonetheless, you have to start somewhere, and the complexities of human syntax probably have their evolutionary origins in these sorts of call combinations. So far, the vocabulary of Campbell's monkeys far outstrips that of other species, but this may simply reflect differences in research efforts. Other studies have started to find complex vocabularies in other forest-dwellers like Diana monkeys and putty-nosed monkeys. Ouattara thinks that forest life, with many predators and low visibility, may have provided strong evolutionary pressures for monkeys to develop particularly sophisticated vocal skills.

And there are probably hidden depths to the sequences of monkey calls that we haven't even begun to peer into yet. For instance, what calls do female Campbell's monkeys make? Even for the males, the meanings in this study only become apparent after months of intensive field work and detailed statistical analysis. The variations that happen on a call-by-call basis still remain a mystery to us. The effect would be like looking at Jane Austen's oeuvre and concluding, "It appears that these sentences signify the presence of posh people".

Reference: PNAS doi:10.1073/pnas.0908118106
 