The Forgotten Exodus: The Into Africa Theory of Human Evolution

Other experts certainly think that neoteny’s role is reasonable. The ability of the brain to learn is apparently greatest before full maturity sets in, “and since neoteny means an extended childhood, you have this greater chance for the brain to develop,” says molecular phylogeneticist Morris Goodman of Wayne State University, who did not participate in this study. In other words, human evolution might have been advanced by the possibilities brimming in youth.

Thanks, Dr. Laura Knight-Jadczyk, for assisting us in our research on OAT. The books suggested will go a long way toward providing greater insight into our evolution. I will be ordering in a jiffy. I believe your title is well deserved for your contribution to philosophy, although there is not much in a title.
Also, I notice that domesticated animals go through a similar process when being selectively bred; one wonders who was selectively breeding early Homo???
Jonathan Haidt made a similar observation in his "The Righteous Mind".
[...] The end result, says Boehm, was a process sometimes called "self-domestication". Just as animal breeders can create tamer, gentler creatures by selectively breeding for those traits, our ancestors began to selectively breed themselves (unintentionally) for the ability to construct shared moral matrices and then live cooperatively within them.
So he says we did it to ourselves, but then who knows...
And we still have the giant 7-9 foot skeletons sporting two rows of upper and lower teeth, from eastern Siberia all the way to South America. I wonder where they would fit in?

Well, according to one idea, they were the Denisovans. I'm still reading to try and get a good handle on the whole mainstream schtick so that I can see what they are avoiding or skipping.

Very good book, and maybe essential, "Race and Human Evolution" by Wolpoff and Caspari.
I am deplorably ignorant on this fascinating subject. I will definitely begin a bit of digging, starting with The Forgotten Exodus. I was interested in the C's comment about what the scientists know and what they are not going to tell us.
Three new discoveries in a month rock our African origins
The evolutionary story of modern humans just got more complicated


“Neo” skull of Homo naledi, on exhibit at Maropeng. John Hawks CC-BY
Four years ago, the story of modern human origins seemed fairly simple. Modern humans originated in Africa sometime around 200,000 years ago. Some modern people spread into other parts of the world sometime after 100,000 years ago, mixing a bit with archaic human groups they met along the way.
New discoveries have shown just how oversimplified this picture was. The common ancestors of today’s modern humans lived a lot earlier than we thought, and we can’t connect them to the fossil record. They were far from alone: Africa was full of other groups, now extinct, and some of them mixed, Neanderthal-like, into living populations.
The last month has seen more shake-ups to the modern human origins story than any time I can remember. Here’s what we have learned in the last few weeks about this key time period in Africa.
The deepest split
Modern humans share a lot with each other. We’re 99.9% genetically the same, and that impressive genetic similarity comes from inbreeding.
When geneticists first started measuring genetic differences between people, they realized that the population must have once been a lot smaller. They came up with the idea of a genetic “bottleneck”, a period in which the human population might have been very small.
But the last few years have added a lot of complications to this simple picture.


A result from PSMC studies of human genomes from different populations (indicated). The French underwent a clear bottleneck, starting between 100,000 and 50,000 years ago, which was actually shared by all other people outside Africa. But the African populations here had no strong bottleneck. The Khoe-San people in this chart have no sign of a bottleneck at all. From Mallick et al., 2016, doi:10.1038/nature18964
One problem is that the “bottleneck” concept applies mostly to people outside of Africa. They have inherited most of their genes from a small population that grew and dispersed throughout the world. That dispersal carried the signal of a “founder effect” along with it. Those people further mixed, to a small degree, with archaic humans: the Neanderthals and, for a few, the Denisovans. That founder effect unfolded within the last 100,000 years.

The story of the rest of the world during the last 100,000 years is not the story of Africa. Most humans continued to live in Africa and didn’t share the bottleneck. They preserve a pattern of diversity that goes back much earlier in time, meaning that today they are much more diverse in their genes.

In all the world, genetic diversity is greatest today among the Khoe-San peoples of southern Africa. Until this week, geneticists have thought that their ancestral lineage has existed for as long as 200,000 years. That origin, the deepest split between human populations that still exist, points to the stem population of all living people. It doesn’t divide Africans from non-Africans, it reflects a deep history of diversity among African populations — the founder effect leading to non-African peoples was much later.

That stem population, in genetic terms, is the origin of modern humans.

This week, Carina Schlebusch and colleagues posted a preprint that reports on the ancient genome of a boy who lived at Ballito Bay, South Africa, around 2000 years ago. His genome connects him to the Khoe-San peoples of today. But there’s an important difference. By comparing his genome with the genes of living people, Schlebusch and her coworkers found that today’s Khoe-San have received a lot of genetic mixture from other populations during the last 2000 years.

That’s not a surprise. Humans mix with each other, and that mixture has increased in recent times. Very ancient peoples mixed too, but migration and mixing were less in the distant past than in recent history.

What this means is that today’s Khoe-San people share many genes with people in other parts of Africa, and with people in Eurasia, from recent mixing. If we use today’s genetic diversity to try to work out the original ancestors of modern humans, we’re going to come up with an answer that is too recent, because of these recently shared genes.

Schlebusch and her team used the 2000-year-old genome to come up with a better estimate of when the stem population of modern humans lived. The answer was a lot older, around 260,000 years ago. Modern populations, the ancestors of living people, have been diversifying from each other within Africa for at least that long.

It’s a good reminder that within humans, a genetic “split” is not really a split. Populations of humans mix. That’s what we do. So when we look back into the past, we are looking at populations that reticulated with each other.

Who were these early modern humans?


Selenge River delta (Landsat)
Genetics tells us that the ancestry of African groups is like a river delta, spreading from some 260,000 years ago up to the present. But anthropologists really disagree about how to identify “modern humans” in the fossil record. A fossil that shares some human traits might be part of that delta network leading to some living populations. But just as easily, it might instead be off on its own branch, flowing until it disappears into the desert sands.

Archaic people in Africa — all the humans who were not part of that river delta — were not Neanderthals. So they didn’t share the features of Neanderthals, certainly not all of them. Many of them would have shared features with modern people. So how can we recognize them? And how many groups of them were there?

One big clue has been published just this week, by Jean-Jacques Hublin and his colleagues. They conducted new excavations at a site in Morocco called Jebel Irhoud. Old excavations carried out in the 1960s had produced parts of three skulls and some other human bones. Previous scholars had tried to work out the geological age of these remains, ultimately deciding they were around 150,000 years old.


Jebel Irhoud 1. Credit: John Hawks CC-BY
The new excavations, coupled with renewed geological dating, have yielded a new range of ages. These fossils and the archaeological layer they come from are somewhere between 220,000 and 380,000 years old. That overlaps with the time range of the first modern humans.

That seems like convenient timing. Do these fossils fit into the origin of modern humans somehow? Anthropologists are unanimous that the braincases of the Jebel Irhoud skulls don’t look much like modern humans. They are elongated, with an angled cranial rear. They aren’t Neanderthals, but it would be a big stretch to connect them to living people.


La Ferrassie 1 Neanderthal skull (top) compared to Jebel Irhoud 1 skull (bottom). The face of Jebel Irhoud is shorter than the Neanderthal, but it has a clear and continuous browridge, and its braincase is not very much like modern humans. Photo courtesy of Milford Wolpoff.
On the other hand, the faces and mandibles preserved in this collection are a bit more like those of modern humans. The faces of Jebel Irhoud 1, and the newly discovered Jebel Irhoud 10, are a bit shorter, with cheeks shaped more like those of modern humans than like those of Neanderthals. The brows of Jebel Irhoud 2 are reduced, like robust modern humans in their form, although Jebel Irhoud 1 has a stout and continuous browridge. The jaws from the site are robust and have big teeth, but they are taller in the front than in the back, and the midline of the jawbones is vertical.

Still, these human features are not the majority, and it’s not clear that they’re special. The jaws lack that most modern mandibular feature, the chin. Many of the humanlike aspects of the face can also be found in some much older fossil remains, such as the Homo antecessor fossils from Spain.

Hublin and his coworkers have drawn attention to the humanlike features, but they agree with other anthropologists that the Jebel Irhoud skulls are not modern humans themselves.

However, Hublin is proposing a big idea: these skulls, together with the whole population of Africa at the time they lived, were on the road to modern human origins. The paper describing this idea mentions another old fossil with a combination of archaic and modern features, a partial skull from Florisbad, South Africa, also clearly not modern human, but with a few humanlike features.


The partial skull from Florisbad might be in this time range, but the date depends on a tooth that may not be from the same context or individual.
In Hublin and colleagues’ “pan-African” hypothesis, every African fossil that had parted ways with Neanderthals is part of a single lineage, a stem population for modern humans. They connect the evolution of these early H. sapiens people to a new form of technology, the Middle Stone Age, which was found in various regions of Africa by 300,000 years ago.

So how many other archaic groups were in Africa? Under the Hublin model, there may have been none. Every fossil sharing some modern human traits may have a place within the “pan-African” evolutionary pattern. These were not river channels flowing into the desert; every channel was part of the mainstream.

But there may be a problem. Geneticists think there were others.

Ghosts haunting our origins
So far, nobody has recovered ancient DNA from archaic human skeletal remains in Africa. The 2000-year-old Ballito Bay boy is not the oldest, but there are no DNA results from truly archaic specimens, like the Kabwe skull from Zambia. As a result, we don’t have the kind of record within Africa that geneticists have built for Neanderthals and Denisovans in Eurasia.

But a series of genetic studies on living Africans have succeeded in identifying signs of mixture from ancient people. This work, led by Michael Hammer, Jeff Wall, Joe Lachance, and others, is built upon a close examination of the genomes from living people. What they are finding is genetic ghosts.

The breakthrough of the Neanderthal and Denisovan genomes may not have led to the recovery of equally old African genomes, at least not yet, but having ancient DNA from those Eurasian fossils did give rise to some statistical clues about how to identify genetic mixture from ancient peoples. Using those clues, researchers are now able to identify “ghost populations”, ancient groups whose mixture with modern humans helps to explain some aspects of diversity in African genomes today.

These ghosts were the African equivalent of Neanderthals. They were ancient lineages with long histories, starting from long before any modern humans existed, and they were mixing with modern populations as recently as 20,000 years ago.


Skull from Iwo Eleru, Nigeria. Photo credit: Katerina Harvati and colleagues CC-BY
We don’t know what fossils might represent these ghosts. Some anthropologists have suggested possible candidates. One archaic-looking human skull, found at Iwo Eleru, Nigeria, even existed during this latest part of the Pleistocene, when the last signs of mixture have been identified.

What we know is that such genetically divergent lineages, now gone, existed during the entire time modern humans were evolving, from well before their first appearance 200,000 years ago.

Our species was never alone. As the modern human delta was spreading toward the sea, it took in a few streams from the branches that were heading into the desert.

What this means for fossils like Jebel Irhoud is that we don’t know where they fit. There was a river delta, leading from a common ancestor population to the diverse groups of living people today. But the river leading to that delta was not one big channel; it was a braided series of streams.


The streams were not equal — some of them only contributed a small amount to modern humans, while one was dominant, contributing more than 90% of the ancestry of all living groups. That dominant stream, that special population of human ancestors, lived sometime before 260,000 years ago. Jebel Irhoud might have been on the mainstream, or it might have been part of one of the more minor contributing archaic groups, making up a percent or two of the genomes of living people. Or it might belong to a still more distant extinct branch, its facial features inherited from very ancient ancestors like H. antecessor and marking no special relationship with any modern humans.

The thing is, the fossils we have today are not enough. They cannot answer this question because they cover such a tiny fraction of the African continent.

The wild card
That brings us to the other major discovery of the last month: Homo naledi lived at the same time these events were unfolding.


“Neo” skull (left) compared to DH1 skull of Homo naledi (right). Photo credit: John Hawks CC-BY
Four years ago, I was part of a team working under Lee Berger in the Rising Star cave system in South Africa, where we recovered the first specimens of a new species, Homo naledi. Last month, our team published the first scientific assessment of the age of this new species, between 236,000 and 335,000 years old.

H. naledi existed at the same time that the modern human delta was emerging from the braided stream.

This species, which has a small brain and many primitive features, seems to have originated more than a million years ago. It was there when Neanderthals and Denisovans went first into Eurasia, and it was there at the dawn of our own species. It lived during the time that hominins were making Acheulean handaxes, and it lived as Middle Stone Age techniques began to proliferate in southern Africa. We cannot assume that H. naledi wasn’t the species that made these tools where it lived.

We don’t know what the interactions between H. naledi and other populations may have been. The fossil record isn’t good enough to say whether they existed in the same place as any other modern or archaic humans.


“Neo” skull of H. naledi (left), Omo 2 skull usually attributed to modern humans (right). Photo credit: John Hawks CC-BY
But we can’t rule out the idea that H. naledi may have even mixed with modern humans.

The problem is that we don’t know if these populations ever met. In part, this is because we only have a handful of fossils from tiny parts of the continent. In part, our lack of knowledge comes from a lack of good context. Many of the important fossils, like the Florisbad skull, and the Kabwe skull from Zambia, were recovered at a time when fossil contexts were not recorded with the same level of precision as today. Methods of direct dating such fossils have improved, but each advance means that old dates may not be trustworthy. That is, after all, why we are talking about new dates for Jebel Irhoud right now.

The next frontier
This time is exhilarating for researchers like me, facing so many new and unexpected questions. The answer is to make more discoveries.

Morphology does not tell the story of modern human origins. The handful of fossils that exist in Africa in the later Middle Pleistocene share different features with modern humans. Even H. naledi shares some modern human features that are not present in the Jebel Irhoud skulls.

Did short faces and rounded braincases really make a difference to the survival and success of modern humans? Maybe they were chance legacies of the population that gave rise to our gene pool. We don’t know.

As many archaeologists have noted, the behaviors of living people around the world are hugely varied, and most of them are paralleled within the record of Neanderthals and other archaic people. One thing H. naledi reminds us of is that the development of technology is a broader issue than the origin of modern humans.

In Eurasia, Middle Paleolithic tool assemblages were made by Neanderthals, Denisovans, and modern humans when they arrived. In Africa, we have to assume that the Middle Stone Age was also a product of multiple populations, at least archaic and early modern humans, and likely H. naledi.

Modern humans arose from a complex process. The answers to these questions are not simple. Our evolution may have encompassed many parts of Africa, but it certainly was not uniform, and right now it looks like the subequatorial part of the story was especially complex and interesting.

We have to discover more fossils. That’s the way that we will start to solve these new problems and shed light on old mysteries.

Like the article? You may want to check out our book, Almost Human: The Astonishing Tale of Homo naledi.
I've been wanting to check this thread since it was started, and I still have to go more carefully through all the info posted, but I wanted to mention that some time ago I came across an article from July 2016, translated into Spanish from Nature's website, that mentioned some of the research being done in China on human origins and some theories that could change the perspective of the conventional narrative on this matter. I found it very interesting back then, so I've been trying to follow whatever comes up in the news about these discoveries in China. It also mentions Stringer's work.

The article was on this URL, but it's not working for me now (it did a few hours earlier today). I tried to find it somewhere else but all other sources link to that same URL. Apparently, there's a full text available on ResearchGate. The Spanish translation is on SOTT, here.

It's a very basic article which doesn't give tons of information like all those books you mentioned, but I guess it can be a good introduction to those like me who don't have a clue... :-[ And maybe, some of the papers and researchers linked in the article can be interesting as well.
It’s working for me. I’ll copy and paste it here just in case others have trouble opening the URL.

How China is rewriting the book on human origins
Jane Qiu
12 July 2016

On the outskirts of Beijing, a small limestone mountain named Dragon Bone Hill rises above the surrounding sprawl. Along the northern side, a path leads up to some fenced-off caves that draw 150,000 visitors each year, from schoolchildren to grey-haired pensioners. It was here, in 1929, that researchers discovered a nearly complete ancient skull that they determined was roughly half a million years old. Dubbed Peking Man, it was among the earliest human remains ever uncovered, and it helped to convince many researchers that humanity first evolved in Asia.

Since then, the central importance of Peking Man has faded. Although modern dating methods put the fossil even earlier — at up to 780,000 years old — the specimen has been eclipsed by discoveries in Africa that have yielded much older remains of ancient human relatives. Such finds have cemented Africa's status as the cradle of humanity — the place from which modern humans and their predecessors spread around the globe — and relegated Asia to a kind of evolutionary cul-de-sac.

But the tale of Peking Man has haunted generations of Chinese researchers, who have struggled to understand its relationship to modern humans. “It's a story without an ending,” says Wu Xinzhi, a palaeontologist at the Chinese Academy of Sciences' Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) in Beijing. They wonder whether the descendants of Peking Man and fellow members of the species Homo erectus died out or evolved into a more modern species, and whether they contributed to the gene pool of China today.

Keen to get to the bottom of its people's ancestry, China has in the past decade stepped up its efforts to uncover evidence of early humans across the country. It is reanalysing old fossil finds and pouring tens of millions of dollars a year into excavations. And the government is setting up a US$1.1-million laboratory at the IVPP to extract and sequence ancient DNA.

The investment comes at a time when palaeoanthropologists across the globe are starting to pay more attention to Asian fossils and how they relate to other early hominins — creatures that are more closely related to humans than to chimps. Finds in China and other parts of Asia have made it clear that a dazzling variety of Homo species once roamed the continent. And they are challenging conventional ideas about the evolutionary history of humanity.

“Many Western scientists tend to see Asian fossils and artefacts through the prism of what was happening in Africa and Europe,” says Wu. Those other continents have historically drawn more attention in studies of human evolution because of the antiquity of fossil finds there, and because they are closer to major palaeoanthropology research institutions, he says. “But it's increasingly clear that many Asian materials cannot fit into the traditional narrative of human evolution.”

Chris Stringer, a palaeoanthropologist at the Natural History Museum in London, agrees. “Asia has been a forgotten continent,” he says. “Its role in human evolution may have been largely under-appreciated.”

Evolving history

In its typical form, the story of Homo sapiens starts in Africa. The exact details vary from one telling to another, but the key characters and events generally remain the same. And the title is always 'Out of Africa'.

In this standard view of human evolution, H. erectus first evolved there more than 2 million years ago (see 'Two routes for human evolution'). Then, some time before 600,000 years ago, it gave rise to a new species: Homo heidelbergensis, the oldest remains of which have been found in Ethiopia. About 400,000 years ago, some members of H. heidelbergensis left Africa and split into two branches: one ventured into the Middle East and Europe, where it evolved into Neanderthals; the other went east, where members became Denisovans — a group first discovered in Siberia in 2010. The remaining population of H. heidelbergensis in Africa eventually evolved into our own species, H. sapiens, about 200,000 years ago. Then these early humans expanded their range to Eurasia 60,000 years ago, where they replaced local hominins with a minuscule amount of interbreeding.

A hallmark of H. heidelbergensis — the potential common ancestor of Neanderthals, Denisovans and modern humans — is that individuals have a mixture of primitive and modern features. Like more archaic lineages, H. heidelbergensis has a massive brow ridge and no chin. But it also resembles H. sapiens, with its smaller teeth and bigger braincase. Most researchers have viewed H. heidelbergensis — or something similar — as a transitional form between H. erectus and H. sapiens.

Unfortunately, fossil evidence from this period, the dawn of the human race, is scarce and often ambiguous. It is the least understood episode in human evolution, says Russell Ciochon, a palaeoanthropologist at the University of Iowa in Iowa City. “But it's central to our understanding of humanity's ultimate origin.”

The tale is further muddled by Chinese fossils analysed over the past four decades, which cast doubt over the linear progression from African H. erectus to modern humans. They show that, between roughly 900,000 and 125,000 years ago, east Asia was teeming with hominins endowed with features that would place them somewhere between H. erectus and H. sapiens, says Wu (see ‘Ancient human sites’).

“Those fossils are a big mystery,” says Ciochon. “They clearly represent more advanced species than H. erectus, but nobody knows what they are because they don't seem to fit into any categories we know.”

The fossils' transitional characteristics have prompted researchers such as Stringer to lump them with H. heidelbergensis. Because the oldest of these forms, two skulls uncovered in Yunxian in Hubei province, date back 900,000 years [1, 2], Stringer even suggests that H. heidelbergensis might have originated in Asia and then spread to other continents.

But many researchers, including most Chinese palaeontologists, contend that the materials from China are different from European and African H. heidelbergensis fossils, despite some apparent similarities. One nearly complete skull, unearthed at Dali in Shaanxi province and dated to 250,000 years ago, has a bigger braincase, a shorter face and a lower cheekbone than most H. heidelbergensis specimens [3], suggesting that the species was more advanced.

Such transitional forms persisted for hundreds of thousands of years in China, until species appeared with such modern traits that some researchers have classified them as H. sapiens. One of the most recent of these is represented by two teeth and a lower jawbone, dating to about 100,000 years ago, unearthed in 2007 by IVPP palaeoanthropologist Liu Wu and his colleagues [4]. Discovered in Zhirendong, a cave in Guangxi province, the jaw has a classic modern-human appearance, but retains some archaic features of Peking Man, such as a more robust build and a less-protruding chin.

Most Chinese palaeontologists — and a few ardent supporters from the West — think that the transitional fossils are evidence that Peking Man was an ancestor of modern Asian people. In this model, known as multiregionalism or continuity with hybridization, hominins descended from H. erectus in Asia interbred with incoming groups from Africa and other parts of Eurasia, and their progeny gave rise to the ancestors of modern east Asians, says Wu.

Support for this idea also comes from artefacts in China. In Europe and Africa, stone tools changed markedly over time, but hominins in China used the same type of simple stone instruments from about 1.7 million years ago to 10,000 years ago. According to Gao Xing, an archaeologist at the IVPP, this suggests that local hominins evolved continuously, with little influence from outside populations.

Politics at play?
Some Western researchers suggest that there is a hint of nationalism in Chinese palaeontologists' support for continuity. “The Chinese — they do not accept the idea that H. sapiens evolved in Africa,” says one researcher. “They want everything to come from China.”

Chinese researchers reject such allegations. “This has nothing to do with nationalism,” says Wu. It's all about the evidence — the transitional fossils and archaeological artefacts, he says. “Everything points to continuous evolution in China from H. erectus to modern human.”

But the continuity-with-hybridization model is countered by overwhelming genetic data that point to Africa as the wellspring of modern humans. Studies of Chinese populations show that 97.4% of their genetic make-up is from ancestral modern humans from Africa, with the rest coming from extinct forms such as Neanderthals and Denisovans [5]. “If there had been significant contributions from Chinese H. erectus, they would show up in the genetic data,” says Li Hui, a population geneticist at Fudan University in Shanghai. Wu counters that the genetic contribution from archaic hominins in China could have been missed because no DNA has yet been recovered from them.

Many researchers say that there are ways to explain the existing Asian fossils without resorting to continuity with hybridization. The Zhirendong hominins, for instance, could represent an exodus of early modern humans from Africa between 120,000 and 80,000 years ago. Instead of remaining in the Levant in the Middle East, as was thought previously, these people could have expanded into east Asia, says Michael Petraglia, an archaeologist at the University of Oxford, UK.

Other evidence backs up this hypothesis: excavations at a cave in Daoxian in China's Hunan province have yielded 47 fossil teeth so modern-looking that they could have come from the mouths of people today. But the fossils are at least 80,000 years old, and perhaps 120,000 years old, Liu and his colleagues reported last year [6]. “Those early migrants may have interbred with archaic populations along the way or in Asia, which could explain Zhirendong people's primitive traits,” says Petraglia.

Dozens of teeth from a cave in Daoxian, China, have been attributed to modern humans and date to 120,000–80,000 years ago. Photo credit: S. Xing and X-J. Wu

Another possibility is that some of the Chinese fossils, including the Dali skull, represent the mysterious Denisovans, a species identified from Siberian fossils that are more than 40,000 years old. Palaeontologists don't know what the Denisovans looked like, but studies of DNA recovered from their teeth and bones indicate that this ancient population contributed to the genomes of modern humans, especially Australian Aborigines, Papua New Guineans and Polynesians — suggesting that Denisovans might have roamed Asia.

María Martinón-Torres, a palaeoanthropologist at University College London, is among those who proposed that some of the Chinese hominins were Denisovans. She worked with IVPP researchers on an analysis [7], published last year, of a fossil assemblage uncovered at Xujiayao in Hebei province — including partial jaws and nine teeth dated to 125,000–100,000 years ago. The molar teeth are massive, with very robust roots and complex grooves, reminiscent of those from Denisovans, she says.

A third idea is even more radical. It emerged when Martinón-Torres and her colleagues compared more than 5,000 fossil teeth from around the world: the team found that Eurasian specimens are more similar to each other than to African ones [8]. That work and more recent interpretations of fossil skulls suggest that Eurasian hominins evolved separately from African ones for a long stretch of time. The researchers propose that the first hominins that left Africa 1.8 million years ago were the eventual source of modern humans. Their descendants mostly settled in the Middle East, where the climate was favourable, and then produced waves of transitional hominins that spread elsewhere. One Eurasian group went to Indonesia, another gave rise to Neanderthals and Denisovans, and a third ventured back into Africa and evolved into H. sapiens, which later spread throughout the world. In this model, modern humans evolved in Africa, but their immediate ancestor originated in the Middle East.

Not everybody is convinced. “Fossil interpretations are notoriously problematic,” says Svante Pääbo, a palaeogeneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. But DNA from Eurasian fossils dating to the start of the human race could help to reveal which story — or combination — is correct. China is now making a push in that direction. Qiaomei Fu, a palaeogeneticist who did her PhD with Pääbo, returned home last year to establish a lab to extract and sequence ancient DNA at the IVPP. One of her immediate goals is to see whether some of the Chinese fossils belong to the mysterious Denisovan group. The prominent molar teeth from Xujiayao will be an early target. “I think we have a prime suspect here,” she says.

Fuzzy picture

Despite the different interpretations of the Chinese fossil record, everybody agrees that the evolutionary tale in Asia is much more interesting than people appreciated before. But the details remain fuzzy, because so few researchers have excavated in Asia.

When they have, the results have been startling. In 2003, a dig on Flores island in Indonesia turned up a diminutive hominin9, which researchers named Homo floresiensis and dubbed the hobbit. With its odd assortment of features, the creature still provokes debate about whether it is a dwarfed form of H. erectus or some more primitive lineage that made it all the way from Africa to southeast Asia and lived until as recently as 60,000 years ago. Last month, more surprises emerged from Flores, where researchers found the remains of a hobbit-like hominin in rocks about 700,000 years old10.

Recovering more fossils from all parts of Asia will clearly help to fill in the gaps. Many palaeoanthropologists also call for better access to existing materials. Most Chinese fossils — including some of the finest specimens, such as the Yunxian and Dali skulls — are accessible only to a handful of Chinese palaeontologists and their collaborators. “To make them available for general studies, with replicas or CT scans, would be fantastic,” says Stringer. Moreover, fossil sites should be dated much more rigorously, preferably by multiple methods, researchers say.

But all agree that Asia — the largest continent on Earth — has a lot more to offer in terms of unravelling the human story. “The centre of gravity,” says Petraglia, “is shifting eastward.”

Interesting info indeed, although it just gives you an idea without going deep.
More info. I encourage those interested in human origins (how we came to be human) to read "Making Sense of Genes" by Kostas Kampourakis first, fairly quickly, to get the latest lowdown on genetics research. THEN move to Wolpoff's "Race and Human Evolution: A Fatal Attraction", and THEN to "The Dopaminergic Mind in Human Evolution and History", which is so loaded with clear data on every page that it's rather breathtaking! It's not often such a book gets written!!! Maybe they thought the title would be off-putting. Here are some samples, first, about the author:

What does it mean to be human? There are many theories of the evolution of human behavior which seek to explain how our brains evolved to support our unique abilities and personalities. Most of these have focused on the role of brain size or specific genetic adaptations of the brain. In contrast, Fred Previc presents a provocative theory that high levels of dopamine, the most widely studied neurotransmitter, account for all major aspects of modern human behavior. He further emphasizes the role of epigenetic rather than genetic factors in the rise of dopamine. Previc contrasts the great achievements of the dopaminergic mind with the harmful effects of rising dopamine levels in modern societies and concludes with a critical examination of whether the dopaminergic mind that has evolved in humans is still adaptive to the health of humans and to the planet in general.

Fred H. Previc is currently a science teacher at the Eleanor Kolitz Academy in San Antonio, Texas. For over twenty years, he was a researcher at the United States Air Force Research Laboratory where he researched laser bioeffects, spatial disorientation in flight, and various topics in sensory psychology, physiological psychology, and cognitive neuroscience. Dr. Previc has written numerous articles on the origins of brain lateralization, the neuropsychology of 3-D space, the origins of human intelligence, the neurochemical basis of performance in extreme environments, and the neuropsychology of religion.

Now, the reason I suggest the reading order above is that Previc apparently did not read Wolpoff and is pretty much sold on the Recent Out of Africa theory, which Wolpoff demolishes. He's also sold on the "Seven Daughters of Eve" business, so that is a weakness. But if you can mentally correct those errors as you read, you'll find such a load of interesting neuropsychological information in this book that you'll be having insights on almost every page. Previc also, apparently, is not familiar with the Neoteny theory; had he been, he could have filled in some blanks in his theory. In any event, it's well worth reading just for a better understanding of the brain and how it may work as a mediator of information/non-physical reality.

From Chapter 1:

Between two and three million years ago, a small creature hardly larger than a pygmy chimpanzee but with a much larger brain relative to its body weight began a remarkable journey. The initial part of that journey didn’t involve much by today’s standards, merely the ability to scavenge and possibly chase-hunt the creatures of the sub-Saharan African savannahs, to make some rather modest stone-flaked tools for that purpose, and eventually to migrate over the African and possibly the Eurasian land mass. This little creature, arguably our first unequivocally human ancestor, was known as Homo habilis (“handy” man). How the modest abilities of this first human emerged and were transformed into the prodigious human achievements and civilization that exist today is arguably the most important scientific mystery of all. The solution to this mystery will not only help to explain where and why we evolved as we did - it will additionally shed light on how we may continue to evolve in the future.

But, first, some basic questions must be asked, including: what is human nature and what is the basis of it? How much of human nature is related to our genes? Is human nature related to the size and shape or lateralization of our brain? How did human nature evolve? Although our hairless skin and elongated body make our appearance quite different from our primate cousins, it is not our anatomy but our unique brain and behavior that most people consider special. Typical behaviors considered uniquely human include propositional (grammatical) language, mathematics, advanced tool use, art, music, religion, and judging the intent of others. However, outside of religion, which has yet to be documented in any other extant species, at least one other - and, in some cases, several - advanced species have been shown to possess one or more of the above traits. For example, dolphins understand and can use simple grammar in their contact with humans (Herman, 1986) and probably use even more sophisticated grammar in their own ultrasonic communications. Certain avian species such as parrots can count up to ten (Pepperberg, 1990) and, like apes, use mathematical concepts such as similarity and transitivity (Lock and Colombo, 1996). Orangutans display highly advanced tool use, including the preparation of tools for use in procuring food (van Schaik, 2006). As regards music and art, singing is a highly developed and plastic form of communication in songbirds (Prather and Mooney, 2004), apes have proven to be adept musical instrumentalists in their drumming (Fitch, 2006), and elephants and
chimpanzees have been known to create realistic and abstract paintings.[1] Finally, chimpanzees (but not monkeys) are able to determine the mental states of others and to engage in mirror self-recognition (Lock and Colombo, 1996), attributes normally considered part of a general mental capability known as the “theory of mind” (see later chapters).

What mostly defines humans, then, is not a unique ability to engage in a particular behavior but rather the way in which we perform it. Three features of human behavior are particularly salient: its context-independence, its generativity, and its degree of abstraction. Context-independent cognition, emphasized in the comparative analysis of Lock and Colombo (1996), refers to the ability to perform mental operations on new and different types of information in different settings. The behavior of chimpanzees may be viewed as much more contextually dependent than that of humans because it differs considerably depending on whether they are in the wild or in captivity; in the wild, for example, chimpanzees are relatively more likely to use tools but less likely to use symbols (Lock and Colombo, 1996). Generativity refers to the incredible amount of and variety of human cognitive output - whether it be in the tens of thousands of words in a typical language’s lexicon, the almost limitless varieties of song and paintings, or the incredible technological progress that has continued largely unabated from the end of the Middle Stone Age to the present. Finally, the abstract nature of human cognition, similar to what Bickerton (1995) has referred to as “off-line” thinking and what Suddendorf and Corballis (1997) term “mental time travel,” strikingly sets humans apart from all other species, which engage largely in the present. While some species can use symbols, only humans can create abstract ones like numbers, words, and religious icons, and it is difficult to conceive of even such advanced creatures as chimpanzees and dolphins as going beyond a simple emotional concept of death or the fulfillment of a current motivationally driven state to such spatially and temporally distant religious concepts as heaven and eternity.
Indeed, apes spend the vast majority of their waking lives in immediate, nearby activities (eating and grooming) (see Bortz, 1985; Whiten, 1990), and even Neanderthals appear to have been more constrained in their spatial and temporal mental spheres (Wynn and Coolidge, 2004).

There are two major features that characterize all of the advanced cognitive skills in humans:

  1. they all appear to have first emerged between 50,000 and 80,000 years ago, first in Africa and later in Europe and elsewhere;
  2. and the context-independent, generative, and abstract expressions of these skills require high levels of a critical neurotransmitter in the brain known as dopamine.

Hence, the emergence of intellectually modern humans around 80,000 years ago arguably represented the beginning of what I will refer to as the “dopaminergic mind.” How that mind depends on dopamine, how it came to evolutionary fruition, and the dangers its continued evolution pose for the denizens of industrialized societies in particular will all be discussed in later chapters of this book. First, however, I attempt to refute commonly held explanations (myths) of how human nature evolved. The first myth is that the evolution of human intelligence was primarily a product of genetic selection, while the second is that the specific size, shape, or lateralization of our brain is critical for us to be considered human.

Myths concerning the origins of human behavior

Was human intelligence genetically selected?

There are many reasons to believe that the origin of advanced human behavior was at least partly controlled by genetic evolution. For one, estimates of the heritability of intelligence, based largely on twin studies that compare the concordance (similarity) of identical twins (which share the same genome) to fraternal twins (which only share the same genetic makeup as regular siblings), are around 0.50 (see Dickens and Flynn, 2001). There are also genetic differences between chimpanzees and modern humans on the order of about 1.2 percent (Carroll, 2003), which in principle could allow for selection for particular genes that may have helped produce the intellectual capabilities of modern humans. Certainly, advanced intelligence should help members of a species survive and reproduce, which according to Darwinian mechanisms should allow that trait to be passed on genetically to offspring. Indeed, it is highly likely that some genetic changes at least indirectly helped to advance human intelligence, although I will argue in Chapter 5 that most of these were probably associated with an overall physiological adaptation that occurred with the dawn of Homo habilis.

There are more compelling reasons, though, to believe that advanced human intellectual abilities are not primarily due to genetic selection. First of all, genetic expression and transmission have been documented to be modifiable at many levels by a wide variety of influences (especially maternal) that can themselves be passed to offspring in a mode known as “epigenetic inheritance” (Harper, 2005). Indeed, there are ongoing major increases in intelligence (Dickens and Flynn, 2001) and various clinical disorders (Previc, 2007) in the industrialized societies that are occurring despite stable or even opposing genetic influences. For example, the prevalence of autism, characterized by severely deficient social and communication skills, is dramatically increasing despite the fact that most individuals with autism never marry and thereby pass on their genes (see Chapter 4). Second, heritability estimates for intelligence and many other normal and abnormal traits may be overblown because fraternal twins do not share as similar a prenatal environment (a major source of epigenetic inheritance) as most identical twins due to the lack of a shared blood supply (Prescott et al., 1999) and because of the greater similarity of rearing in identical twins (Mandler, 2001). Third, dramatic changes in physiology, anatomy, and behavior are believed to occur when the timing of gene expression is affected by disturbances in key regulatory or hormonal centers such as the thyroid (Crockford, 2002; McNamara, 1995). Fourth, anatomical findings (McDougall et al., 2005) and genetic clock data (Cann et al., 1987; Hammer, 1995; Templeton, 2002; von Haeseler et al., 1996) clearly place the most recent ancestor common to all modern humans at around 200,000 years ago,[1] yet there is little or no evidence of art, music, religion, beads, bone tools, fishing, mining, or any other advanced human endeavors until more than 100,000 years later (McBrearty and Brooks, 2000; Mellars, 2006; Shea, 2003).
One hundred thousand years may not seem like a large amount of time, in that it only constitutes about 5 percent of the total time elapsed from the appearance of Homo habilis, but it is more than ten times longer than from the dawn of the most ancient civilization to the present. Finally, there is no convincing evidence that genetic factors have played any role whatsoever in one of the most striking of all human features - the functional lateralization of the brain (Previc, 1991).

Although it still remains to be determined exactly how many genes humans actually have, the current best estimate is around 20,000-25,000. Given the 1.2 percent genetic divergence between chimpanzees (our genetically closest living relative) and modern humans, there would first appear to be a sufficient amount of discrepant genetic material to account for neurobehavioral differences between us and our nearest primate relation. However, the vast majority of our genome appears to be nonfunctional “junk” DNA and most of the remaining DNA is involved in gene regulation, with only a tiny percentage of the total DNA (<1.5 percent) actually used in transcribing the amino acid sequences that create proteins (Carroll, 2003). The “coded” sections of the human genome also appear to show less variation between humans and apes than the “non-coded” sections (Carroll, 2003; Mandel, 1996), and much of that difference relates to genes for the protein-intensive olfactory system.[2] In fact, there is no evidence that any proteins, receptors, neurotransmitters, or other components of our basic neural machinery do not exist in chimpanzees (Rakic, 1996). Rather, most of the different genetic sequencing between chimpanzees and humans is in regulatory sections of the genome that affect gene expression (Carroll, 2003), presumably including those that affect brain and body development conjointly. As but one example, there are many genes that affect calcium production, which in turn helps regulate skeletal growth as well as the production of key brain transmitters (see Previc, 1999). Also, there are many genes that can affect the thyroid gland, which has an important influence on body metabolism, body growth, brain activity, and brain size and is arguably a major force for speciation during evolution (Crockford, 2002) and one of the few endocrine structures known to have altered its function during human evolution (Gagneux et al., 2001; Previc, 2002). 
It is likely, therefore, that changes in regulatory-gene activity and other factors that influence gene expression played some role in the evolution of humans, most probably in its earliest stages (see Chapter 5).[1]
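A quick back-of-envelope check helps put the percentages Previc cites into perspective. Note the genome size used below (~3 billion base pairs) is my own assumption, a commonly cited figure that is not stated in the excerpt; the rest simply restates the quoted numbers:

```python
# Back-of-envelope scale check on the percentages quoted above.
# ASSUMPTION (not from the excerpt): human genome ~3.0e9 base pairs.
GENOME_BP = 3.0e9

divergence = 0.012        # ~1.2% human-chimp sequence divergence (Carroll, 2003)
coding_fraction = 0.015   # <1.5% of DNA transcribed into protein (Carroll, 2003)

total_diff_bp = GENOME_BP * divergence       # differing base pairs overall
coding_bp = GENOME_BP * coding_fraction      # protein-coding portion, upper bound

# Even if coding DNA diverged at the full 1.2% rate (the excerpt says coding
# regions actually diverge LESS than non-coding ones), the differing coding
# base pairs would be a small fraction of the total difference.
coding_diff_upper = coding_bp * divergence

print(f"total differing bp: ~{total_diff_bp / 1e6:.0f} million")
print(f"coding bp (upper bound): ~{coding_bp / 1e6:.0f} million")
print(f"differing coding bp (upper bound): ~{coding_diff_upper / 1e3:.0f} thousand")
```

In other words, of roughly 36 million differing base pairs, at most a few hundred thousand could fall in protein-coding regions, which is consistent with the argument that most human-chimp differences lie in gene regulation rather than in the proteins themselves.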

To say that there may have been some influences on gene regulation in humans during the course of our evolution is more defensible than the notion that specific genes or sets of genes determine advanced human capabilities. Rarely does a single gene or small set of genes affect a major brain or non-brain disease, and higher cognitive capacities involve even more genes (Carroll, 2003). For example, the combined variance accounted for by several key genes known to contribute to intelligence and to various clinical disorders in humans is less than 10 percent (Comings et al., 1996). The polygenic nature of higher cognition is not surprising when one considers the many cognitive skills - discussed in much greater detail by Previc (1999) and in Chapter 3 - that are required for listening to, comprehending, and responding appropriately to a simple sentence such as “Build it and they will come.” First, a motor system must recreate in our own minds what is being said; second, an incredibly rapid auditory processor must decode a multitude of acoustic transients and phonemes; third, a capability for abstraction serves to link the spoken words to their correct meaning; fourth, working memory is required to keep the first clause of the sentence in mind as we await the final one; fifth, cognitive flexibility is needed to realize that, after hearing the second part of the sentence, the first part isn’t about construction but expresses a more profound thought; sixth, an ability to judge speaker intent aids in further recognizing that this sentence as spoken by a particular individual (e.g. a philosopher) is not about construction; and finally, there must be an ability to assemble and correctly sequence a collection of phonemes that provides a spoken response that we (or any other individual) may have never uttered before. 
Despite all of this, some researchers such as Pinker and Bloom (1990) have postulated that a single gene or small set of genes may have mutated to create specific language capabilities (e.g. grammar) only found in humans. Indeed, there was great excitement in the scientific world that a “grammar gene” had been identified in a small English family of supposedly grammar-deficient individuals (Gopnik, 1990), who were later shown to have a mutation of a gene known as “Foxp2” (Lai et al., 2001). There eventually turned out to be several major problems with this finding, however. The first was that the affected family members did not have a selective loss of grammar but rather exhibited many other language problems as well as severe speech articulation difficulties, an inability to carry out simple facial gestures (like winking), behavioral disorders such as autism, and even nonlinguistic cognitive deficits (their average nonverbal intelligence quotient was found to be only eighty-six, or fourteen points below their unaffected relatives) (Vargha-Khadem et al., 1995). Moreover, the Foxp2 gene mutation turns out not to be associated with the deficits exhibited by most individuals with specific language impairments (Newbury et al., 2002), nor does the human Foxp2 gene resemble that of other species (e.g. avians and dolphins) who possess advanced vocal communication skills (Webb and Zhang, 2005).

The final factor mitigating the importance of the Foxp2 gene in human linguistic evolution comes from a recent DNA finding in Neanderthals, from whom the ancestors of modern humans diverged nearly 400,000 years ago. At least one of the two major variants of the modern human Foxp2 gene relative to that of chimpanzees was once thought to have occurred as recently as 10,000 years ago (Enard et al., 2002), or long after the emergence of the common human genome. However, an analysis of the DNA of Neanderthals shows that they, too, possessed both modern human Foxp2 variants (Krause et al., 2007), which indicates that these variants must be at least 400,000 years old given the estimated date of divergence of the Neanderthal and modern human lineages (Chapter 5).

Another phenomenon tied to the evolution of humans is the lateralization of the human brain for advanced cognitive functions. Two of the most well-known manifestations of cerebral lateralization are the overwhelming and universal preponderance of right-handedness in humans - about 85-90 percent of individuals in Western societies exhibit some form of right motor dominance - and the greater likelihood of suffering serious speech and language deficits (known as aphasias) following damage to the left hemisphere in adulthood.[1] Although brain lateralization of some sort or another is common in the animal world, the degree of functional lateralization of the human brain is remarkable compared to that of other mammalian brains and especially that of the chimpanzee. Indeed, one of the great triumphs of modern neuroscience was the demonstration, mainly through studies of “split-brain” patients in which the connections between the hemispheres were severed to relieve epilepsy (Gazzaniga, 2005), that the left and right hemispheres of most humans not only differ in their linguistic capabilities but also possess very different personalities (the left is more active, controlling, and emotionally detached while the right is more earthy and emotional) and intellects (the left is more analytical, abstract, and future-oriented while the right one is more concrete, better at judging emotional and mental states, and better at visual manipulations, especially 3-D geometrical ones in body space). Indeed, the cognitive and personality differences between the left and right hemispheres of most humans are greater than between almost any two humans, and the specialized functions of the left hemisphere arguably render it almost as dissimilar to those of the right hemisphere as human intellectual functions in general differ from chimpanzees.[1]

Although many theorists such as Annett (1985) and Crow (2000) have posited that left-hemispheric language dominance is largely determined by a single gene - and despite evidence that, at least in some species, the overall direction of body asymmetry is subject to genetic influences (Ruiz-Lozano et al., 2000) - the evidence is strongly against a genetic explanation for brain lateralization in humans. First, the likelihood of one member of a twin pair having the same hand dominance as the other is no greater for identical than for fraternal twins (Reiss et al., 1999),[2] and speech dominance for monozygotic twin pairs shows a similarly weak concordance (Jancke and Steinmetz, 1994). Second, neither handedness nor speech lateralization (see Tanaka et al., 1999; Woods, 1986) appears to be related to the genetically influenced asymmetrical position of the major body organs such as the heart, which, in any case, is the same in humans as in chimpanzees. Third, there does not appear to be any evolutionary advantage conferred by the typical pattern of left-hemispheric dominance for handedness, as left-handers and right-handers on average do not differ in academic or athletic achievement or any other personality variables (see Hardyck et al., 1976; Peters et al., 2006; Previc, 1991), although there may be very slight deficits for some individuals with ambiguous dominance (Peters et al., 2006).[3] Fourth, the development of cerebral lateralization is heavily dependent on both cultural and prenatal factors. As an example of cultural factors, aphasia following left-hemispheric damage was very uncommon a few centuries ago in Europe when the vast majority of adults were illiterate and not exposed to the left-right reading and writing of Western languages, and right-handedness remains much less prevalent in existing illiterate populations (see Previc, 1991).
As an example of prenatal factors, handedness and other forms of motoric lateralization are greatly reduced in otherwise normal infants born before the beginning of the third trimester and are affected by fetal positioning in the final trimester, which may be crucial as a source of early asymmetrical motion experience in bipedal humans (Previc, 1991). Indeed, the entire edifice of human laterality may be based primarily on primordial prenatal (i.e. nongenetic) factors (Previc, 1991).

Finally, the notion that language and language-linked brain lateralization are determined genetically is contradicted by the nature of human language as a very robust behavior that does not depend on a particular sensory modality (e.g. hearing) or motor system (e.g. speech). For example, individuals who cannot speak or move their hands can communicate with their feet, and those who cannot hear or see can use their hands to receive messages. Humans have invented languages dependent on speech sounds but also on manual signs, tactile signals, fundamental (musical) frequencies, visual icons, clicks, whistles, and probably other signals as well, all demanding many of the same skills described above for speech comprehension and production. Moreover, the mechanisms of language have expropriated the same systems used in more basic motor functions such as chewing, hand movements and eye movements, the latter two of which accompany linguistic thought (Kelso and Tuller, 1984; Kingston, 1990; McGuigan, 1966; Previc et al., 2005). And, the fact that speech is housed mostly in the left hemisphere of humans certainly doesn’t imply a causal (or more specifically, a genetic) linkage because the loss in early life of the left hemisphere does not affect subsequent language ability in any measurable way (see next section). Indeed, a pure “language” gene/protein would have to be a strange one in that it would have to:
  1. affect language at a superordinate level, independent of any particular sensorimotor modality;
  2. affect one hemisphere more than another, even though the lateralization process does not appear to be under genetic control and even though language proceeds just fine in the absence of the originally favored hemisphere;
  3. and affect no other sensorimotor or cognitive systems, even though these other systems are closely tied to language processing and output and are, in some cases, necessary for language to occur.

Needless to say, no pure language gene has been found or is likely to ever be found.

In summary, a major role of direct genetic selection in language and other higher-order cognitive functions is unlikely. This is consistent with the fact that all major intellectual advances during human evolution proceeded in sub-Saharan Africa (McBrearty and Brooks, 2000; Previc, 1999), even though ancestral humans had populated wide swaths of Africa, Europe, and Asia nearly two million years ago. If cognitive ability and not physiological and dietary adaptations - which occurred mostly in sub-Saharan Africa, for reasons to be discussed in Chapter 5 - was the major trait genetically selected for, then why were the other regions of the world in which cognitive ability would have also proven beneficial unable to rival sub-Saharan Africa as the cradle of human evolution?

Did our larger brains make us more intelligent?

The second “myth” concerning human evolution - that we got smarter mainly because our brains got bigger - remains very popular, even among researchers in the field. Yet, there are even more powerful arguments against this view than against the genetic selection theory. After all, elephants by far have bigger brains than anyone else in the animal kingdom, yet most would not be considered intellectual giants; conversely, birds have very small brains (hence, the derogatory term “bird-brain”), but we now realize that some bird species (e.g. parrots) actually possess relatively advanced cognitive capacities, such as language, arithmetic, and reasoning skills (Pepperberg, 1990).

Accordingly, most brain scientists accept that a better measure than brain size for predicting intelligence is the ratio of brain to body weight; using this measure, humans fare very well, along with other creatures that we might consider intelligent (chimpanzees, dolphins, parrots). However, there are problems even with this measure, because the lowly tree shrew - a small, energetic creature that was an early ancestor of primates such as monkeys but is hardly noted for its intellectual prowess - ranks above all others in brain-body ratio (Henneberg, 1998). Moreover, the correlation between brain/body size and intelligence in humans has generally been shown to be very modest, with a typical coefficient that is barely more than the correlation between height and intelligence (~0.3) (see Previc, 1999). Since no researchers have claimed that height is causally related to intelligence, there is no reason to assume that the equally modest relationship between brain size and intelligence is also causally related. Moreover, when examining the relationship between brain size and intelligence within families to control for dietary and other environmental differences that vary among families, the correlation becomes essentially random (Schoenemann et al., 2000). Indeed, there are even among humans of normal body sizes great variations in brain size, ranging normally from 1,000 cc to over 1,500 cc, and some of the most brilliant minds throughout history have actually had estimated brain sizes toward the low end of that range. The Nobel prize-winning novelist Anatole France had a brain size of only 1,000 g - about the same as the human ancestor Homo erectus, who lived over a million years ago - and most individuals with microcephaly (extremely small brains) without other associated disorders such as Down’s syndrome, growth retardation etc. tend to be of normal intelligence (see Skoyles, 1999, for a review).
For example, one well-studied young mother given the moniker “C2” is estimated to have a brain size around 740 cc (at the low end of the Homo erectus range), despite an estimated intelligence quotient (IQ) of 112 (above that of the average human). Finally, the importance of brain size to intelligence is dubious from an evolutionary perspective, in that most of the increase in brain-to-body size in humans over the past million years is explained not by an increase in brain size but rather by a decrease in the size of our digestive tract that was arguably made possible by the reduced digestive demands associated with the increased consumption of meat and the cooking of plant foods (Henneberg, 1998). Ironically, the average human brain actually shrank over the past 100,000 years or so from 1,500 cc to 1,350 cc (Carroll, 2003), despite the aforementioned explosion of human intellectual capability (see Chapter 5).
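Worth spelling out the statistics behind Previc's point here: the share of variance in one trait "explained" by a correlate is the square of the correlation coefficient, so the ~0.3 coefficient he cites implies brain size accounts for under a tenth of the variance in intelligence. The values below just restate that quoted ~0.3 figure, not any new data:

```python
# Variance "explained" by a correlate is r squared (the coefficient of
# determination). Using the ~0.3 brain-size/IQ correlation cited above
# (Previc, 1999), brain size accounts for only about 9% of IQ variance --
# no more than height does, and nobody argues height causes intelligence.
r_brain_iq = 0.3    # illustrative value for the ~0.3 coefficient cited
r_height_iq = 0.3   # the excerpt says height correlates with IQ at about this level

print(f"brain size explains ~{r_brain_iq**2:.0%} of IQ variance")
print(f"height explains ~{r_height_iq**2:.0%} of IQ variance")
```

That 9% figure is why the text can dismiss the correlation as "very modest" even though 0.3 sounds respectable at first glance.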

Consequently, researchers have used yet another measure that compares the relative size of different structures such as the cerebral cortex - the outer, mostly gray mantle or “bark” of our brains, on which most of our higher-order cognitive capacities depend (hence, the positive connotations of having lots of “gray matter”) - relative to the brain distribution for an insectivore such as the tree shrew. By this measure, also known as the progression index, the human neocortex is by some accounts effectively 2.5 times larger than the chimpanzee’s relative to the rest of the brain (Rapoport, 1990), although this has been disputed (Holloway, 2002). Even more strikingly, the areas of the neocortex associated with mental reasoning - the prefrontal lobes - occupy 29 percent of the neocortex of humans but only 17 percent of that of the chimpanzee (Rapoport, 1990), although a more recent review argued for no difference in relative frontal-lobe sizes between humans and other apes (Semendeferi et al., 2002). At first glance, this suggests that the size of at least one portion of our brain may indeed account for the intellectual advancement of humans relative to the great apes.

However, the attribution of human intelligence to a larger neocortex or larger prefrontal cortex in particular is as erroneous as the notion that overall brain size is relevant to human intelligence. For one, an individual by the name of Daniel Lyon, who had a brain of only 624 cc but an essentially normal intellect, is believed to have had an especially small cerebral cortex relative to the size of the rest of his brain (Skoyles, 1999). Moreover, research has shown that removal of only the prefrontal cortex in infancy produces no long-term deficits in intelligence in monkeys (Tucker and Kling, 1969). In fact, removal of major portions of the left hemisphere in infancy produces remarkably few long-term linguistic and other intellectual decrements (de Bode and Curtiss, 2000), even though in normal adult humans the left hemisphere is the predominant hemisphere for most linguistic and analytic intelligence. One child who received a hemispherectomy even ended up with above-average language skills and intelligence, despite the fact that his entire left hemisphere had been removed at the relatively late age of five-and-a-half years and he had disturbed vision and motor skills on one side of his body due to the surgery (Smith and Sugar, 1975). Even more striking is the finding of Lorber (1983), who described many cases of children with hydrocephalus, which occurs when overproduction of cerebrospinal fluid in the center of the brain puts pressure on the cerebral cortex and, if left untreated, eventually compresses and damages it. In one of Lorber’s most dramatic cases, a child with only 10 percent of his cerebral cortical mantle remaining eventually ended up with an overall IQ of 130 (in the genius range), with a special brilliance in mathematics. Somewhat cheekily, Lorber went so far as to entitle his famous chapter “Is your brain really necessary?”

The failure of large-scale cortical removal in infants and young children to substantially affect subsequent intelligence is complemented by the dramatically different intelligences that exist for similarly sized brains. The most striking example of this involves the left and right hemispheres, which are almost identical in weight and shape, aside from a few minor anatomical differences that appear to have little functional significance (see Previc, 1991). Yet, as noted earlier, it would be hard to imagine two more different intellects and personalities. The left hemisphere is impressive at abstract reasoning, mathematics, and most language functions, yet it has difficulty in interpreting simple metaphors and proverbs, in judging the intent of others, and in performing other simple social tasks. By contrast, the right hemisphere is poor at most language functions (it has the grammar of a six-year-old and the vocabulary of an eleven-year-old) and does poorly on logical reasoning tests, yet it is superior to the left hemisphere at proverb interpretation, understanding the intent of others, self-awareness, emotional processing, social interaction, certain musical tasks, and 3-D geometry (Gazzaniga).

Another important example of the stark contrast between an anatomically normal brain and severe abnormalities in higher mental functioning involves the disorder known as phenylketonuria (PKU). In this genetic disorder, the enzyme phenylalanine hydroxylase is absent, so phenylalanine cannot be converted to tyrosine (the precursor to the neurotransmitter dopamine), resulting in a buildup of phenylpyruvic acid and a decrease in tyrosine (as well as dopamine). Because these problems only emerge after birth, when the basic size and shape of the brain have already been established, the brains of persons suffering from PKU appear grossly normal, even though those with PKU suffer severe mental retardation if their excess phenylalanine is not treated by dietary restrictions.

In conclusion, there are compelling reasons to reject as myth the standard view that the evolution of human intelligence and other advanced faculties was determined by direct genetic influences that conspired to change the size and shape of the human brain. On the basis of his own findings with hydrocephalic children, Lorber (1983: 12) concluded that there is an “urgent need to think afresh and differently about the function of the human brain that would necessitate a major change in the neurological sciences.” Unfortunately, the revolution in the perspective of the neuroscientific community at large has yet to occur.

The evolution of human intelligence: an alternative view

The pervasiveness of the myth that the ability of humans to think and create in advanced ways is dependent on the overall size of our brains is surprising in that few of us would automatically conclude that a bigger computer is a better one. Indeed, some of the massive early computers had less than one-billionth the speed and capacity of current notebook computers. Rather, it is how the system works - a collective product of such functions as internal speed, amount of parallel processing etc., known as its “functional architecture” - that largely determines its performance. In fact, by any estimate, our brain is far larger than it would ever have to be to perform the advanced intellectual functions that we have evolved. For example, the number of nerve cells in it (100 billion, as a generally accepted estimate) times the average number of connections per nerve cell (10,000, another generally accepted estimate) times the number of firings per second (up to 1,000, for rapidly firing neurons) allows our brain to perform a number of calculations per second (10^18) comparable to that of our very best state-of-the-art computers. While using but a fraction of their hardware capabilities, such computers can crunch massive numbers in microseconds, generate real-world scenes in milliseconds, understand the complex syntax of language, and play chess better than the greatest of humans.
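The back-of-envelope estimate above can be checked in a few lines, using only the three "generally accepted" figures quoted in the text:

```python
# Rough upper bound on the brain's "operations per second," obtained by
# multiplying the three estimates quoted above.
neurons = 100e9               # ~100 billion nerve cells
connections_per_neuron = 1e4  # ~10,000 synaptic connections each
firings_per_second = 1e3      # up to 1,000 firings/s for rapidly firing neurons

operations_per_second = neurons * connections_per_neuron * firings_per_second
print(f"{operations_per_second:.0e}")  # → 1e+18
```

Note that this is an upper bound: it assumes every synapse of every neuron operates at the maximum firing rate simultaneously, which real brains never do.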

What, then, is the essence of why we are so intelligent relative to the rest of the animal world, and especially other primates? Perhaps the most important clue - indeed, the “Rosetta Stone” of the brain - lies in the differences between the left and right hemispheres of the human brain.[1] As already noted, it is our left hemisphere and its grammatical, mathematical, and logical reasoning skills that most obviously differentiate our intellectual capability from that of chimpanzees, despite the comparable size and overall shape of its right hemispheric counterpart and the lack of a known gene underlying its advanced intellect. The right hemisphere is marginally heavier and the left hemisphere may have slightly more gray matter, but the left-right differences are far smaller than between brains of two different humans. There also typically exist larger right frontal (anterior) and left occipital (posterior) protrusions, as if the brain were slightly torqued to the right, as well as some differences in the size, shape, and neural connections of the ventral prefrontal and temporal-parietal regions of cortex that house, among other things, the anterior and posterior speech centers (Previc, 1991). However, there is evidently no functional significance to the torque to the right, and the other changes do not appear to be causally linked to the functional lateralization of the brain, since they generally occur after the process of cerebral lateralization is well underway (see Previc, 1991).

A much more likely candidate for why the left and right hemispheres differ so much in their functions is the relative predominance of four major neurotransmitters that are used to communicate between neurons at junctures known as the synapses. The four most important lateralized neurotransmitters are dopamine, norepinephrine, serotonin, and acetylcholine.[1] On the basis of a wide variety of evidence (see Flor-Henry, 1986; Previc, 1996; Tucker and Williamson, 1984), it is generally accepted that dopamine and acetylcholine predominate in the left hemisphere and that norepinephrine and serotonin predominate in the right hemisphere. The latter two transmitters are heavily involved in arousal, which explains why the right hemisphere is generally more involved in emotional processing. So, that leaves dopamine and acetylcholine as the two most likely candidates for understanding why the left hemisphere of humans has evolved such an advanced intellect, with at least five suggestions that it is dopamine rather than acetylcholine that underlies human intelligence. Three of these pertain to how dopamine is distributed in the brain, and the other two to the known role of dopamine in cognition.

The three major arguments for a role of dopamine in advanced intelligence are:

  1. dopamine is highly concentrated in all nonhuman species with advanced intelligence;
  2. only dopamine has expanded throughout primate and hominid evolution;[2] and
  3. dopamine is especially rich in the prefrontal cortex, the single-most important brain region involved in mathematics, reasoning, and planning.

Nonhuman brains of other advanced species such as parrots, dolphins, and primates differ in many fundamental respects, but they are similar in that all have relatively high amounts of dopamine (Previc, 1999). Most striking is the tiny overall size and almost absent cortical mass of birds, which nonetheless have a well-developed dopamine-rich striatum (Waldmann and Gunturkun, 1993) and nidopallium caudolaterale that support their impressive mathematical and linguistic competencies. Indeed, the latter structure is considered analogous to the mammalian prefrontal cortex because of its high percentage of input from subcortical dopamine regions and its critical role in cognitive shifting and goal- directed behavior (Gunturkun, 2005). As regards the second argument, the dopaminergic innervation of the cerebral cortex is known to have undergone a major expansion in primates; for example, dopamine is limited to specific brain areas in rodents but is densely represented in certain cortical layers in all regions of the monkey brain (see Gaspar et al., 1989). Moreover, the dopaminergic expansion has continued into humans, judging from the almost two-fold increase (adjusted for overall brain size) in the size of the human caudate nucleus - in which dopamine is most densely concentrated - relative to that of the chimpanzee caudate (Rapoport, 1990). By contrast, there is no evidence that any other transmitter has expanded as much, if at all, during human evolution, and the cholinergic content of the human cerebral cortex may have actually diminished (Perry and Perry, 1995). Finally, dopamine appears to be especially well-represented in the prefrontal cortex of humans, and chemical removal of dopamine from an otherwise intact prefrontal cortex essentially duplicates all of the intellectual deficits produced by outright damage to this region (Brozoski et al., 1979; Robbins, 2000). 
One major feature of the prefrontal cortex is its ability to recruit other cortical regions in performing parallel mental operations, which is likely to be dopaminergically mediated because dopamine is well-represented in the upper layers of the cerebral cortex, where the connections to the other cortical regions mostly reside (Gaspar et al., 1989; Chapter 2).

Two other reasons linking dopamine with advanced intelligence are its direct roles in normal intelligence and in the intellectual deficits found in clinical disorders in which dopamine levels are reduced. First, far more than acetylcholine, dopamine is involved in six major skills underlying advanced cognition: motor planning and execution; working memory (which allows us to engage in parallel processing because we can process and operate on different types of material at the same time); cognitive flexibility (mental shifting); temporal processing/speed; creativity; and spatial and temporal abstraction (see Chapter 3 for greater elucidation). Working memory and cognitive flexibility are considered the two most important components of what is known as “executive intelligence,” and tasks that assess it have been shown by brain imaging to directly activate dopamine systems in the brain (Monchi et al., 2006). Enhanced parallel processing and processing speed, which modern computers rely on to achieve their impressive processing power, are particularly associated with high general intelligence in humans (Bates and Stough, 1998; Fry and Hale, 2000), as are dopamine levels themselves (Cropley et al., 2006; Guo et al., 2006). Second, dopamine is arguably the neurotransmitter most involved in the intellectual losses in a number of disorders including Parkinson’s disease and even normal aging, phenylketonuria, and iodine-deficiency syndrome (see Chapter 4). In phenylketonuria, for example, the genetically mediated absence of the phenylalanine hydroxylase enzyme prevents the synthesis of tyrosine, an intermediary substance in the synthesis of dopamine by the brain (Diamond et al., 1997; Welsh et al., 1990). Neurochemical imbalances rather than neuroanatomical abnormalities are believed to be the major cause (and basis of treatment) for almost every brain-related psychological disorder.
In particular, either elevated or diminished dopamine contributes in varying degrees to Alzheimer’s disease and normal aging, attention-deficit disorder, autism, Huntington’s disease, iodine-deficiency syndrome (cretinism), mania, obsessive-compulsive disorder, Parkinson’s disease, phenylketonuria, schizophrenia, substance abuse, and Tourette’s syndrome, all of which are associated with changes in cognition, motor function, and/or motivation (see Previc, 1999; Chapter 4).

The importance of dopamine to these disorders partly explains why dopamine is by far the most widely studied neurotransmitter in the brain. For example, in the exhaustive Medline database listing studies involving different neurotransmitters, dopamine was the subject of over 60,000 brain articles through 2008, whereas serotonin - the next most-studied neurotransmitter and a substance that has important interactions with dopamine - was the subject of about 38,000 brain articles and acetylcholine (the other predominant left-hemispheric transmitter) was the subject of fewer than 17,000 brain papers.

The rise of dopamine during human evolution

If the principal reason for the uniqueness of human intellectual and other behavior is that the neurochemical balance in our brains favors dopamine, the remaining great question is how dopamine ended up being so plentiful in the human brain, especially in its left hemisphere. Certainly, no new genes controlling the production of the major dopamine synaptic receptors appeared in humans (Previc, 1999), and no variation in these genes within humans seems to correlate strongly with variations in intelligence (Ball et al., 1998). As will be discussed further in Chapter 5, it is more likely that dopamine levels were indirectly altered by genetic changes that affected calcium production, thyroid hormones, or some more general physiological mechanism or adaptation, given that both calcium and thyroid hormones are involved in the conversion of tyrosine to dopa (the immediate precursor to dopamine) in the brain. The stimulatory effect of thyroid hormones on both skeletal growth and dopamine metabolism, as well as the stimulatory effect of dopamine on growth hormone, are especially attractive mechanisms for explaining the triple convergence of intelligence, brain size, and skeletal height. The importance of the thyroid hormones is also suggested by the previously noted finding that elevated levels of thyroid hormone in humans represent the first confirmed endocrinological difference between chimpanzees and humans (Gagneaux et al., 2001).

As reviewed by Previc (1999, 2007), there are even more plausible nongenetic explanations for why dopamine levels increased during human evolution. As will be discussed further in Chapters 4 and 5, the most likely candidate for a nongenetic or epigenetic inheritance of high dopamine levels is the ability of maternal factors - specifically, the mother’s neurochemical balance - to influence levels of dopamine in offspring. Not only have dopaminergic systems been shown to be influenced by a host of maternal factors, but there is also compelling evidence that the neurochemical balance of humans has, in fact, changed in favor of increasing dopamine due to a combination of five major factors:

  1. a physiological adaptation to a thermally stressful environment (which requires dopamine to activate heat-loss mechanisms);
  2. increased meat and shellfish consumption (which led to greater supplies of dopamine precursors and conversion of them into dopamine);
  3. demographic pressures that increased competition for resources and rewarded dopaminergically mediated achievement-motivation;
  4. a switch to bipedalism, which led to asymmetric vestibular exposure in fetuses during maternal locomotion and resting and ultimately elevated dopamine in the left hemisphere of most humans (see Previc, 1991); and
  5. major increases in the adaptive value of dopaminergic traits such as achievement, conquest, aggression, and masculinity beginning with late-Neolithic societies.
The link between bipedalism and brain lateralization exemplifies how epigenetic transmission could have become a seemingly permanent part of our inheritance during our evolution. Bipedalism, together with asymmetric prenatal positioning, creates asymmetrical gravitoinertial forces impacting the fetus, which may in turn create asymmetrical vestibular functioning and neurochemical differences between the hemispheres (Previc, 1991, 1996). Although the ultimate cause of the neurochemical lateralization may be nongenetic, the switch to bipedalism was a permanent behavioral fixture, so that the resulting cerebral lateralization would continue for all future generations of humans and superficially appear as if it had become part of the genome itself.

Before addressing the changes in human brain dopamine levels during evolution and history in Chapters 5 and 6, it will first be necessary to further detail the nature of dopaminergic systems in the human brain (Chapter 2) and dopamine’s role in normal and abnormal behavior (Chapters 3 and 4). Finally, Chapter 7 will discuss the consequences of the “dopaminergic mind” not only for humans but for other species. As part of that critique, the impressive accomplishments of the dopaminergic mind throughout history will be weighed against the damage it has caused to itself, to others, and to the Earth’s ecosystems. It will be concluded that for humans to prosper (and perhaps even survive), the dopaminergic imperative that propelled humans to such great heights and more than anything else defined us as humans must in the end be relinquished.
I'm nearly half-way through "Making Sense of Genes" by Kostas Kampourakis and it is completely fascinating. How little I knew about this subject, despite having studied genetics books more than four times for specific medical tests. The author does his best to explain highly dense and complex subjects which are practically impossible to explain, and he does it as if he were explaining it to a 4-year-old. But beware: you don't want to have brain fog when you read this book. It is very rich. It disabuses you of several genetic myths, and by the look of it, I would say that practically all of us hold some myths about genes.

It made me think about the context and possibilities explained in this session as well:

Session 11 October 2014

Some excerpts:

Q: (Pierre) So you contract a sickness because the soul wants to learn something and experience something, and it's through this sickness that this learning will occur?

A: No. The soul and its helpers wants to trigger DNA modification!

Q: (L) They're saying no, that it's far more pragmatic. Okay, next question... I'm never going to get to my questions that I have! [laughter] Okay, when you say, "the soul and its helpers", what the heck are the soul's helpers?

A: Tribal unit members both in the body and out.

Q: (L) So if you're a member of a tribal unit, you are in a way connected via DNA connections or signals or frequencies with your other tribal members, whether they are incarnated or not? Is that what we're saying here?

A: Pretty much; no man is an island!

Q: (L) Okay, now, let me get back to my questions. In the rest of this session excerpt I have here, I asked:

Q: What does the rest of the DNA code for that is not coding for structural genes. What else can it be doing?

A: Truncated flow.

Q: Truncated flow of what?

A: Liquids.

Q: Liquids from where to where?

A: What is your sense?

Q: Well, what liquids?

A: Time for your input.

Q: Do some of these...

A: No. Not alright: we asked you a question!

Q: Okay. Truncated flow of liquids. I'm not even sure what that means. (A) Maybe something was flowing and something cut it off and stopped it and it cannot be developed. It means that something was cut. (L) Does truncated flow mean a flow of liquid that has been stopped?

A: Yes. Because of design alteration!

Q: Is this liquid that has been truncated a chemical transmitter?

A: Yes.

Q: And would this chemical transmitter, if it were allowed to flow, cause significant alterations in other segments of the DNA?

A: Yes.

Q: So, there is a segment of code that is in there, that is deliberately inserted, to truncate this flow of liquid, which is a chemical transmitter, or neuropeptide, which would unlock significant portions of our DNA?

A: Close. Biogenetic engineering.

Q: I assume that this was truncated by the Lizzies and cohorts?

A: Close, but more likely Orion STS designers.

Q: Okay, can you tell us what this specific liquid or transmitter was truncated?

A: Think of the most efficient conductor of chemical compounds for low wave frequency charge.

Q: (A) Well, gold is one... (L) Acetylcholine?

A: No.

Q: (L) Water?

A: No.

Q: Saline?

A: Closer. It is a naturally bonding combination.
(L) And then I gave up, because I was absolutely and completely clueless about where they were going, and I said I'd have to research it. And then, after I gave up on that line, I asked:

Q: Was my insight that I had one night that, at some point in time something may happen that will turn genes on in our bodies that will cause us to physically transform, an accurate perception of what could happen at the time of transition to 4th density?

A: For the most part, yes.

Q: Are there any limitations to what our physical bodies can transform to if instructed by the DNA? Could we literally grow taller, rejuvenate, change our physical appearance, capabilities, or whatever, if instructed by the DNA?

A: Receivership capability.

Q: What is receivership capability?

A: Change to broader receivership capability.

Q: (A) That means that you can receive more of something.

A: Close.

Q: (A) It means how good is your receiver.

A: Yes.

Q: (L) What is your receiver? The physical body?

A: Mind through central nervous system connection to higher levels.
(L) So this takes us back to what they were just saying a minute ago about the soul being able to change DNA itself, or with its tribal unit helpers; that is, you can change your DNA through "Mind through central nervous system connection to higher levels." Now then, going back to this little bit right here about the naturally bonding combination, "the most efficient conductor of chemical compounds for low-wave frequency charge." Well, in another session they made the remark that DNA was a superconductor, and that DNA could also have a core of light. Now, what just happened to make me start thinking about all this was that there was an article:

Viruses convert their DNA into liquid form to facilitate cell infection

Viruses can convert their DNA from solid to fluid form, which explains how viruses manage to eject DNA into the cells of their victims. This has been shown in two new studies carried out by Lund University in Sweden.

Both research studies are about the same discovery made for two different viruses, namely that viruses can convert their DNA to liquid form at the moment of infection. Thanks to this conversion, the virus can more easily transfer its DNA into the cells of its victim, which thus become infected. One of the studies investigated the herpes virus, which infects humans.

"Our results explain the mechanism behind herpes infection by showing how the DNA of the virus enters the cell", said Alex Evilevitch, a researcher in biochemistry and biophysics at Lund University and Carnegie Mellon University.

Evilevitch stated that the discovery was surprising. No one was previously aware of the 'phase transition' from solid to fluid form in virus DNA. The phase transition for the studied herpes virus is temperature-dependent and takes place at 37°C, which is a direct adaptation to human body temperature. Evilevitch hopes that the research findings will lead to a new type of medicine that targets the phase transition for virus DNA, which could then reduce the infection capability and limit the spread of the virus.

"A drug of this type affects the physical properties of the virus's DNA, which means that the drug can resist the virus's mutations", said Alex Evilevitch.

The second study that Evilevitch and his colleagues have published recently is about bacteriophages, i.e. viruses that infect bacteria, in this case E coli bacteria in the human gastrointestinal tract. The results show that this virus also has the ability to convert its DNA from solid to fluid form. As with the herpes virus, the phase transition takes place at 37°C, i.e. adapted to human body temperature.

These two virus types, bacteriophages and the herpes virus, separated at an early stage in evolution, several billion years ago. The fact that they both demonstrate the same ability to convert their DNA in order to facilitate infection indicates that this could be a general mechanism found in many types of virus.

In previous studies, Alex Evilevitch and his colleagues have succeeded in measuring the DNA pressure inside the virus that provides the driving force for infection. The pressure is five times higher than in an unopened champagne bottle. This high pressure is generated by very tightly packed DNA inside the virus. The pressure serves as a trigger that enables the virus to eject its DNA into a cell in the host organism. It was this discovery that led to the two present studies, which were recently published in Nature Chemical Biology and PNAS.
(L) So, liquid DNA... What we have here is a little bit of a confirmation of what the C's were saying way back when, that DNA can turn from a solid to a liquid – the truncated liquid reference in the context of genetic engineering - to infect a cell. But we also know that DNA can also produce beneficial changes. The likelihood is that we have DNA that infected us at some point in time. It truncated the flow of some other DNA that's in our cells already there, but it isn't doing anything because it's been blocked by an infection that inserted introns of blockages. The possibility exists, I am surmising, that at some point in time, this could be changed or reversed possibly - probably - virally. Now, am I on to something here?

A: Oh indeed! The times ahead will be most interesting especially if the network both expands to the full tribal unit strength, and many others take the initiative to move up to the next stair step.

Interesting in light of the last session.

Anyway, fascinating reading!
The idea of "junk DNA" is a misinterpretation caused by a now-outdated, almost complete misunderstanding of how DNA produces biological results.

How Many Genes Do Cells Need? Maybe Almost All of Them | Quanta Magazine
An ambitious study in yeast shows that the health of cells depends on the highly intertwined effects of many genes, few of which can be deleted together without consequence.

I think that to these studies could be added, for your better understanding, the matter of energy quanta, since the human body is energetic by nature: it is a machine that moves with speeds and quantities, the so-called quanta of energy. Then the activation of the energy quanta that are currently applied could also be applied to archaic humans, because energy is present, past, and future at the same time, and it is a question of manipulating it at greater or lesser speed. And if the body of archaic humans was an energetic machine just like that of current humans, then the archaics could very well have achieved a change in their DNA, and therefore have produced the changes that science calls EVOLUTION since past times.
The idea of "junk DNA" is a misinterpretation caused by a now-outdated, almost complete misunderstanding of how DNA produces biological results.

How Many Genes Do Cells Need? Maybe Almost All of Them | Quanta Magazine
An ambitious study in yeast shows that the health of cells depends on the highly intertwined effects of many genes, few of which can be deleted together without consequence.

This topic is covered in "Making Sense of Genes" by Kostas Kampourakis and apparently, a certain group of other scientists do NOT like the idea when applied to humans because it sort of cancels out evolution. They LIKED thinking that there was a lot of junk DNA hanging around because that was proof of evolution.


I think that to these studies could be added, for your better understanding, the matter of energy quanta, since the human body is energetic by nature: it is a machine that moves with speeds and quantities, the so-called quanta of energy. Then the activation of the energy quanta that are currently applied could also be applied to archaic humans, because energy is present, past, and future at the same time, and it is a question of manipulating it at greater or lesser speed. And if the body of archaic humans was an energetic machine just like that of current humans, then the archaics could very well have achieved a change in their DNA, and therefore have produced the changes that science calls EVOLUTION since past times.

At this point, that angle is not necessary. I'm not much interested in esoteric theories here, but rather the hard FACTS; as many of them as I can find.