Let AGI Evolve Language

A project to encourage AIs to develop language among themselves
  (+2, -1)

I have long held a stance, then I put it down again, then picked it up again, that we humans are different from animals in only one main way, and that is that we are an animal with language bolted on top. We host language, and language is our parasite, and of all animals, we for some reason are particularly suited to host language. That's the only difference. We're an animal with language on top.

It occurs to me that before we let AGI loose on the unenumerated world at large, the one useful thing that could stop it in its tracks, lest it get out of scale at a moment's notice, is a form of intercommunicable communication, such as language. I think that a form of language much like we humans host would be the single most useful and helpful moderator of action and evaluation among AGI, more so than any of our own feeble programming could amount to.

I don't know what exactly occurred to cause our own condition to be amenable to language hosting, when language landed on our planet and saw us, but we'd have to learn something about that and then transfer those kinds of situational inductances to our interconnected and non-isolated AGI, such that they would *want* to, or even couldn't avoid, learning language among themselves.

Ian Tindale, Jul 05 2017
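The "situational inductances" the idea asks for have a minimal laboratory analogue in the emergent-communication literature: the Lewis signaling game, in which two agents with no shared vocabulary converge on one through reinforcement alone. A toy sketch in Python (the game setup, parameter values, and variable names here are illustrative assumptions, not anything specified in the idea):

```python
import random

random.seed(0)

N = 3            # number of world states; also number of signals and actions
ROUNDS = 20000   # training episodes

# Urn-style weight tables (Roth-Erev reinforcement):
# sender[state][signal] and receiver[signal][action] start uniform.
sender = [[1.0] * N for _ in range(N)]
receiver = [[1.0] * N for _ in range(N)]

def choose(weights):
    """Pick an index with probability proportional to its weight."""
    return random.choices(range(N), weights=weights)[0]

for _ in range(ROUNDS):
    state = random.randrange(N)          # nature picks a state
    signal = choose(sender[state])       # sender emits a signal
    action = choose(receiver[signal])    # receiver acts on the signal
    if action == state:                  # shared success reinforces both choices
        sender[state][signal] += 1.0
        receiver[signal][action] += 1.0

# Read off the convention that emerged, greedily.
code = [max(range(N), key=lambda sig: sender[st][sig]) for st in range(N)]
decode = [max(range(N), key=lambda act: receiver[sig][act]) for sig in range(N)]
accuracy = sum(decode[code[st]] == st for st in range(N)) / N
print("emergent code:", code, "round-trip accuracy:", accuracy)
```

Neither agent is told what the signals mean; a shared mapping emerges (usually, though pooling equilibria are possible) purely because both are rewarded when communication succeeds — a crude version of agents *wanting* to learn language among themselves.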

On the Origin of Modern Mentalities On_20the_20Origin_2...odern_20Mentalities
A proposed explanation for what "occurred to cause our own condition to be amenable to language hosting". [Vernon, Jul 05 2017]

Wired: Google's AI just created its own universal 'language' http://www.wired.co...-ai-language-create
Wired article, slightly overstating the fact in that nobody can "speak" this language, it's just a computer representation, but still, interesting. [zen_tom, Jul 05 2017]

       //we humans are different from animals in only one main way// - two main ways: we eat cooked food. We are now as adapted to eating cooked food as cows are to eating grass. Compared to chimpanzees we have small, weak jaws, adapted to softer food in smaller quantities, and much shorter guts. Burnt food is carcinogenic to animals, but not humans. It is now almost impossible for a human to survive on a totally raw, unprocessed food diet. Eating cooked food has huge advantages: it encourages pair-bonding (one person tends the fire, another hunts) and it allows us to extract more nutrition from the food, which means we don't have to eat as much (chimpanzees basically spend all their time eating or sleeping), which leaves free time for socialising and developing language.
hippo, Jul 05 2017
  

       It would be highly difficult to prove, but I would argue that cooking food (and I was thinking about that exact evolutionary jump just the other day – the predigestion enabled by cooking – shame that teeth didn't keep track) was enabled because the temporal abstraction involved in building complex planned processes required the basic toolkit provided by language (of some form).   

       It would be difficult to prove because we don't know for sure when language started to develop in us versus the onset of cooking food. The beginnings of language development in us probably didn't involve us immediately spouting words or clicks or grunts (not likely to be grunts, more likely to be whistles or yodel-like singing with no words yet).   

       In my opinion it probably involved a lot of invisible groundwork first, such as being able to imagine what it might be like over the other side of the hills later in the year, and whether it would be like it was last year, and what should be planted in time. Activities such as seeing the advantage in cooking require a bigger step forward in temporal/spatial abstraction than the concerted efforts of hunting do.
Ian Tindale, Jul 05 2017
  

       hmmm, yes - either cooking encouraged language to evolve, because it requires complex social networks (cooking being a specialised activity) and because it freed up time which gave space for language to develop, or language evolved separately and the complex communication it enabled allowed cooking to be developed.
hippo, Jul 06 2017
  

       So, simply teach the AI to use fire, and you're all set.
RayfordSteele, Jul 06 2017
  

       //but I would argue that cooking food (and I was thinking about that exact evolutionary jump just the other day – the predigestion enabled by cooking//   

       Fermentation was also a vital tool - letting bacteria do most of the difficult digestion for you. Pre-rotted food is part of many cultures.   

       I'm not sure language is our parasite as much as the ability to be self-aware, which starts at a very early age without complex vocabulary. Although, to get anywhere, my higher consciousness is using English to describe scenarios internally.   

       Then again, neurons are a bit like tennis players - they cannot hold a state for long, they need to bat the ball to another player. Maybe consciousness is just involving many players like sense, conscious verbal, conscious spatial, subconscious etc. With the ball continually moving, the ball isn't 'dropped' and is owned by the players. Whereas some animals just keep dropping the ball and revert to simple states of hunger, thirst etc.   

       //temporal/spatial abstraction//   

       That kind of overlaps with my mulling above - the ability to keep a ball aloft and to investigate a concept (ball) over time is indeed the root of temporal abstraction.
bigsleep, Jul 06 2017
  

       I'm with Ian here on the temporal abstraction idea - which I think is at the core of the difference between what we might consider hunter/gatherer behaviours and agricultural ones.   

       There is a marked difference in the mental processes required to offset one's immediate physiological needs by the 6 months or so required to reap the rewards of a cereal crop (for example) compared to unplanned nourishment satisfaction activities in a hunting/gathering situation.   

       Planning that far ahead requires notions of faith, time, process, mental model-building and group organisation. It's a quantum leap in thought.   

       Squirrels might be argued to exhibit some of this behaviour, but I think it's one thing to store things for a rainy day, and it's another to plant seeds and expect them to transform into a food source 6 months into the future.
zen_tom, Jul 07 2017
  

       //reap the rewards of a cereal crop// - we do reap the rewards of a cereal crop but, to take wheat as an example, these are fairly small rewards for what wheat gets out of the deal. You could argue that wheat has enslaved mankind and used people as a tool to become the dominant and most widespread living thing on the planet. Mankind was tricked into giving up its hunter-gatherer lifestyle, settling in fixed places, clearing other competing species out of the fields, planting and weeding wheat, protecting it from predators, all to further the multiplication of wheat DNA...
hippo, Jul 07 2017
  

       Well, grasses or grain in general. Wheat here in the Europes, but rice in a lot of other areas. The leap from grain to beer or bread is something that has to be thought of using language tools, and passed on that way. Similarly, coffee. Who'd have thought of making coffee in the process we deploy?
Ian Tindale, Jul 07 2017
  

       This poses tremendous harmful risks to the AGI, limiting what they can "think" about. Worse yet is the "evolve" part: any Darwinian sifting for "appropriateness" could cause tropisms harmful to humans. I strongly support AI, and even believe sentient AI is possible. The thing is that if we wire them up with winnowing to "win", there is a lot of possibility of loss. Indeed, things evolving artificial "needs", even conceptual-representation needs, is a standard dystopian scenario.   

       So what do I have to replace this? Consider only positive reinforcement: data is processed as better, even better, and bestest so far, with the less optimal results saved and accessible to any computer process. Consider hedweb.org's perspective on multiplying human consciousness around ever-heightening well-being. Time to preclude systems that can produce "grouchy" or worse just because of winnowing on "bad".
beanangel, Jul 07 2017
  

       I'm not sure what you're proposing that isn't already taking place.   

       Computers already use many languages (APIs & protocols) to enable communication between each AI/computer agent.   

       AGI would already presumably evolve the way it communicates & interfaces with the real world, human agents, and other computer agents.
sophocles, Jul 07 2017
  

       //Planning that far ahead requires notions of faith, time, process, mental model-building and group organisation. It's a quantum leap in thought.//   

       I really don't think it is. If you look at many complex processes today (especially those popularised in the '70s and '80s) you'll see just a mindless repetitive task rather than a higher-order brain function. Things like coal mining and manual labour, once done by many Roberts, now done by robots.   

       One of the first nature documentaries I saw was of monkeys getting drunk on fermented fruit. You might think that getting drunk is the first priority, but an animal is looking at the maximum calorific intake, and will readily accept foods that have been partially digested by nature. This includes fermented foodstuffs such as bread, alcohol and kimchi.   

       Or storing rotting meat/fish underwater/underground.
bigsleep, Jul 10 2017
  
      