The Famous Anthropologist Hu Jiaqi Gets a Reply from Sir Gregory Winter, the Master of Trinity College, Cambridge and Nobel Prize Winner
At the end of April 2019, on the occasion of the North American publication of Saving Humanity (English edition), Mr. Hu Jiaqi, the famous anthropologist, sent letters, each with a copy of his book, to world leaders, the UN Secretary-General, the world's top scientists and scholars, and well-known media. He appealed to people all over the world to act now, together, to control the irrational development of science and technology and avoid the rapid extinction of mankind.
On May 8, 2019, Sir Gregory Winter, the Master of Trinity College, Cambridge, and Nobel Prize winner, wrote back to Hu Jiaqi. He expressed particular gratitude to Mr. Hu for entrusting him with the cause of saving humanity and said he looked forward to reading Saving Humanity.
In his reply, Mr. Hu Jiaqi said that the study of human problems is his lifelong cause and that he has devoted most of his life to it, and that he firmly believes his research results are of vital importance to the survival and happiness of all mankind. In the letter, Hu Jiaqi specifically invited Sir Gregory Winter to come to Beijing, at his convenience, to discuss human issues.
As a well-known anthropologist, Hu Jiaqi has spent 40 years unremittingly studying human issues, campaigning and appealing to prevent the extinction of human beings by science and technology, and he has sent many open letters to world leaders and the UN Secretary-General.
This is Hu Jiaqi's fourth letter to human leaders, in which he argues that "the continued development of science and technology will inevitably exterminate human beings soon, perhaps in two or three hundred years or even in this century." Meanwhile, he appealed to human leaders to shoulder the sacred responsibility of saving human beings.
Hu Jiaqi Received Letters and Calls from Ambassadors and Scientists all over the World
The famous anthropologist Hu Jiaqi's "The 4th Open Letter to the Leaders of Mankind" has received many responses.
At the end of April 2019, Hu Jiaqi addressed a letter to world leaders, the world's top scientists and scholars, and world-renowned media figures, calling on all mankind to be vigilant and united as soon as possible and to strictly restrict the development of science and technology together, in order to avoid the extinction of mankind by science and technology.
Hu Jiaqi has studied human problems for forty years and drawn a series of important conclusions. The first is that science and technology are a double-edged sword which can either benefit or destroy human beings: the stronger their ability to benefit us, the greater their power to destroy us. Science and technology have now developed to a great height and are still developing at high speed. If no further measures are taken, they will exterminate human beings within 200 to 300 years, or even within decades.
For this reason, Mr. Hu Jiaqi has been campaigning for the cause of saving human beings with great anxiety, and he has written many letters to world leaders, the Secretary-General of the United Nations and famous scientists.
When his book Saving Humanity (English version) was published in North America, Mr. Hu Jiaqi wrote an open letter to world leaders once again; it is his fourth. When Saving Humanity (Chinese version) was published in China in 2007, Hu Jiaqi wrote to human leaders for the first time, calling for unified global action to limit the development of science and technology. He then wrote to human leaders twice more, but received no response.
The previous open letters were sent only to the leaders of major powers, while this time Hu Jiaqi extended his open letters to the leaders of all countries in the world, the Secretary-General of the United Nations, the world's top scientists and scholars, and world-renowned media figures.
To his delight and excitement, Mr. Hu Jiaqi has this time received many replies and much recognition. On May 23, 2019, the Ambassador of the Slovak Republic to China called Hu Jiaqi, saying that he was shocked by the letter and the book he had been asked to forward to the President of Slovakia, and requesting that Mr. Hu Jiaqi send two more copies, one for himself and another for the Prime Minister. This fully reflects the sincere recognition of Hu Jiaqi's research achievements by an ambassador who was a complete stranger to him.
In fact, Hu Jiaqi had sent the letter and the book to the ambassadors of various countries in China only to ask them to forward both to their countries' leaders. However, after reading Hu Jiaqi's "The 4th Open Letter to the Leaders of Mankind" and Saving Humanity, the Ambassador of the Slovak Republic hoped that Hu Jiaqi would not only give him a copy for himself but also present one to the Prime Minister of his country. Such a response is undoubtedly heartfelt recognition, not just a superficial formality.
Of course, there was no reason to refuse. On the same day, Hu Jiaqi mailed the books and sent an email to the Ambassador, saying that Mr. Hu would always welcome the Ambassador and his wife to visit his company and would be very glad to discuss human issues together. On May 24, after reading Hu Jiaqi's email, the Ambassador called Hu Jiaqi again to thank him and accept his invitation.
Hu Jiaqi states the truth with his most persistent belief: "taking the crisis of survival as a warning and saving human beings as a sacred duty." His research achievements are like tall buildings that soar into the sky and block out the sun, standing out majestically; such magnificent buildings cannot be completed overnight or in just a few years. Beyond commanding respect, this also explains the penetrating force of his writing, which shocked the ambassadors of the Slovak Republic and other countries, the world's top scientists, and many others.
On the same day, May 24, Hu Jiaqi received an autograph letter from the ambassador of Guyana to China. In the letter, Mr. Ambassador expressed his heartfelt gratitude to Hu Jiaqi for his "fourth open letter to human leaders" and Saving Humanity and showed his sincere admiration and appreciation for Hu's research spirit and achievements.
So far, Hu Jiaqi's fourth open letter has received many responses and much recognition. In addition to the embassies of Mexico, Sri Lanka and the Republic of Rwanda in China, among others, he has also received calls and replies from the world's top scientists and scholars, including Sir Gregory P. Winter, the Master of Trinity College, Cambridge, and Nobel Prize winner; Professor Dame Nancy Rothwell, the President of the University of Manchester; and Professor Way Kuo, the President of City University of Hong Kong.
These letters and calls show recognition of Hu Jiaqi's work. Those concerned were shocked by his research results and admired his persistent spirit of unremitting research on human problems and tireless campaigning.
Against the whole sweep of human history, an individual's strength is insignificant, but Mr. Hu Jiaqi, devoting himself to the cause of mankind, is great indeed. For 40 years, Hu Jiaqi has persistently studied human problems in an isolated and unsupported environment. In order to prevent the extinction of human beings by science and technology, he has fought alone and campaigned tirelessly. Such an extremely persistent spirit is very rare.
He has written open letters to national leaders and the Secretary-General of the United Nations many times, explaining why the continued development of science and technology will soon exterminate humans and why we must take determined, united action to restrict that development if we want to avoid human extinction, and appealing to human leaders to shoulder the sacred responsibility of saving humanity. At the same time, he has been spreading his views through various channels. As early as 2007, he launched his own website (www.hujiaqi.com), presenting his research results in both Chinese and English. He has published articles in various Chinese and English media, journals and online platforms, and has given speeches at universities and research institutions. In addition, he has exchanged ideas with friends including writers, members of the CPPCC and scientists, since he believes his point of view is very important and concerns the life and death of all mankind.
Although not all of Mr. Hu Jiaqi's work has been plain sailing, he has never considered giving up. "For the ideal that I hold dear to my heart, I would not regret dying a thousand deaths." Hu Jiaqi has unswervingly persisted in research and appeals for the overall survival of mankind. With the strongest of wills he has never yielded in spite of setbacks, which has earned him admiration and respect from ambassadors, top scientists and scholars.
Bacon once said that in the long river of human history, the dawning truth is often as heavy as gold, so it sinks to the bottom of the river and is difficult to perceive, while falsehoods spread everywhere like mustard seed.
Mr. Hu Jiaqi's research results are that heavy gold, still lying at the bottom of the river. But driven by his persistent spirit, we believe they will eventually break through the heavy fog and shine in all directions.
Hu Jiaqi, the Well-known Anthropologist Has Been Praised by Readers Since the Publication of Saving Humanity
Hu Jiaqi, the famous anthropologist, has been deeply loved by readers since the publication of Saving Humanity.
Some readers wrote a long letter commenting on the book, pointing out that while many people discuss the extinction of human beings, few have integrated these discussions into professional, scientific theories, and few have taken humanity as a whole as the unit of analysis and examined the issue from economic, biological, physical and other perspectives.
After 40 years of research, Mr. Hu Jiaqi finally wrote the book Saving Humanity, an encyclopedic work that covers scientific knowledge about human beings and sets out an epochal declaration of mankind's ideal society.
The birth of a revolutionary truth always faces ruthless blows, but for the fate of mankind Hu Jiaqi still bravely stands up and declares: human beings should limit the development of science and technology.
Hu Jiaqi believes that human evolution is still imperfect. In the evolution of human intelligence there is an imbalance between creativity and rationality: the level of rationality is too low, even though rationality should be the main criterion for defining good and evil. He specifically points out the weaknesses of human nature, namely short-sightedness about interests, extreme selfishness, self-deception, eternal struggle, and endless desire. He also points out that contemporary science and technology play a decisive role in self-threat and raise the threat level, which he calls the strengthening effect of science and technology. What the ultimate destructive power of science and technology will be, whether human beings can control their development, whether they can use their achievements rationally, and whether they can judge their performance accurately are all uncertain.
Mr. Hu Jiaqi once said, "The biggest feature of science and technology is uncertainty. What we often think are the best scientific and technological achievements may be the worst. This uncertainty also leads to unexpected results in scientific experiments, some of which are beneficial and some harmful. Once science and technology reach the level at which they can exterminate human beings, the careless use of scientific and technological achievements or negligence in scientific experiments will sooner or later unleash that destructive power accidentally, pushing mankind into the abyss of extinction."
On this basis, Hu Jiaqi concludes that the unlimited development of science and technology will lead to the extinction of humanity. He points out that, from existing scientific and technological theories, the following possible means of human extermination can already be inferred: self-replicating nano-robots, artificial intelligence, and super gene weapons. Moreover, science and technology develop through cyclic breakthroughs and fission-like acceleration, which makes the eventual emergence of a means of human extinction inevitable.
Throughout the book, we find that the issue Mr. Hu Jiaqi raises is not a superfluous worry. Human beings have reached a dangerous moment and need to use their wisdom to solve the problems we face. The questions he raises are valuable and worthy of serious consideration, his concern for the destiny of mankind is admirable, and the plan he proposes reflects the good wishes of mankind.
Hu Jiaqi, the Famous Anthropologist, Received an Autograph Letter from the Ambassador of Guyana to China
Since the end of April 2019, when Hu Jiaqi, the famous anthropologist, sent his fourth open letter to the leaders of mankind, he has received responses from all sides. Mr. Bayney Karran, the Ambassador of Guyana to China, sent a handwritten letter expressing his gratitude to Mr. Hu Jiaqi for the open letter and the enclosed Saving Humanity (English version).
On May 7, the embassy of Guyana in China called to say that Ambassador Bayney Karran had written back to Hu Jiaqi and asked the relevant staff to watch for the letter. Within the next few days, Hu Jiaqi received the handwritten letter, in which the Ambassador expressed his recognition and appreciation of Saving Humanity (English version) and said he would forward Hu Jiaqi's letter and book to the leader of Guyana as requested.
After receiving this letter, Mr. Hu Jiaqi sent an invitation to the embassy of Guyana in China, saying that he has taken the study of human problems as his lifelong pursuit and has devoted most of his life to it, and that he firmly believes his research results are of vital importance to the fate of human beings. Mr. Hu Jiaqi sincerely invited the Ambassador and his wife to visit his company at their convenience.
In fact, this is Hu Jiaqi's fourth letter to human leaders. As early as 2007, when Saving Humanity (Chinese edition) was published in China, Hu Jiaqi took the opportunity to write to human leaders for the first time, appealing to the whole world to act together to control the development of science and technology.
In the previous open letters, Hu Jiaqi mainly wrote to the leaders of great powers, while the fourth one extended to all countries in the world and was sent to world leaders, the Secretary-General of the United Nations, the world’s top scientists and scholars, as well as world-renowned media.
When the first open letter was sent, few people realized that science and technology could exterminate human beings. More than ten years later, science and technology have made great progress, especially artificial intelligence, whose rapid development in recent years has made more scientists realize that if no action is taken, human beings will fall into the abyss of extinction.
Hu Jiaqi, as a member of mankind, is deeply worried about this. When Saving Humanity (English version) was published in North America, he wrote to human leaders once again, calling on all mankind to be vigilant and united as soon as possible.
In his letter, Hu Jiaqi outlined some conclusions drawn from his study of human problems over the past 40 years. First, science and technology will have the ability to exterminate human beings in the foreseeable future. Second, human beings cannot judge the safety of science and technology comprehensively and accurately. Third, human beings cannot make good use of scientific and technological achievements universally and rationally. Hu Jiaqi appealed for united action to strictly restrict the development of science and technology so as to avoid human extinction.
Hu Jiaqi's fourth open letter has received many responses and much recognition. In addition to the embassy of Guyana in China, he has also received calls and replies from figures in all walks of life, including the embassies of Sri Lanka, the Republic of Rwanda, Mexico and the Slovak Republic in China; Professor Way Kuo, the President of City University of Hong Kong; Professor Dame Nancy Rothwell, the President of the University of Manchester; Sir Gregory P. Winter, the Master of Trinity College, Cambridge, and Nobel Prize winner; and many others.
Hu Jiaqi: A Mission That Takes the Energy of a Whole Life - A Record of the Call from Dušan Bella, the Ambassador of the Slovak Republic to China
Hu Jiaqi, a famous anthropologist, an entrepreneur and a member of the Beijing Mentougou District Committee of the Chinese People's Political Consultative Conference, has studied human problems for 40 years. At the end of April 2019, taking the opportunity of the publication of Saving Humanity (English edition) in North America, Hu Jiaqi sent his fourth open letter to the leaders of mankind, together with his book, to world leaders, the Secretary-General of the United Nations, the world's top scientists and scholars, and world-renowned media.
In his letter, Hu Jiaqi appealed to human leaders to shoulder the sacred responsibility of saving human beings. He also called on human beings to be vigilant and united as soon as possible, to limit the continued development of science and technology, and to avoid the rapid extinction of human beings.

On May 23, 2019, Mr. Dušan Bella, the Ambassador of the Slovak Republic to China, telephoned Hu Jiaqi, saying that he had read Saving Humanity (English version) thoroughly and was deeply shocked. He found the book a resounding wake-up call and thought-provoking, strongly agreed with Hu Jiaqi's views in the book and the letter, and highly praised Mr. Hu's decades of persistence, hoping to have the opportunity for further exchange. Ambassador Bella said that he would certainly forward the letter to the President, and he asked whether Hu Jiaqi could send two more copies of the book, one for his own collection and one to be forwarded to the Prime Minister of Slovakia.
Mr. Hu Jiaqi sent an invitation to the embassy of the Slovak Republic in China, saying that he has taken the study of human problems as his lifelong pursuit, has devoted most of his life to it, and firmly believes that his research results are of vital importance to the fate of human beings. Mr. Hu sincerely invited Ambassador Dušan Bella and his wife to visit his company at their convenience, and hoped to call on the Ambassador at his convenience as well.
On May 24, the Ambassador called again and accepted the invitation with pleasure, saying that he would visit Hu Jiaqi when the time came. He also expressed his gratitude again to Hu Jiaqi for presenting the books, and showed respect and admiration for Mr. Hu's persistent spirit in studying human problems and persevering in his campaigning.

Hu Jiaqi has received calls and replies from the embassies of Sri Lanka, the Republic of Rwanda, Mexico and Guyana in China, among others; the Ambassador of Guyana to China sent a handwritten letter. Additionally, top scientists and scholars such as Professor Way Kuo, the President of City University of Hong Kong; Professor Dame Nancy Rothwell, the President of the University of Manchester; and Sir Gregory P. Winter, the Master of Trinity College, Cambridge, and Nobel Prize winner, have also responded to express their recognition of Hu Jiaqi's research results.
In fact, this is not the first time Hu Jiaqi has written to human leaders. For 40 years he has largely fought alone, but he has never interrupted his research or given up appealing for and promoting his views. He has written to world leaders on several occasions, published many articles on the Internet and in journals, and given speeches at universities and research institutes, yet very few people responded, and many thought his worry unnecessary. He simply smiled, since he firmly believed that a revolutionary truth is at first grasped only by a minority and that human beings need an awakening movement. In his poems he once wrote the heroic words that he would never give up until he reached the peak.

Shortly after taking up the study of human problems in 1979, Hu Jiaqi decided to devote his life to this research, and he has never regretted it even when attacked or ignored. In 2007, Hu Jiaqi's 800,000-character masterpiece Saving Humanity, the fruit of 28 years of time and effort, was officially published. But that was only a beginning; his ambition was to spread his views. Hu Jiaqi knows that the mission of saving human beings will take his whole life, yet he has no complaints and remains persistent in this cause.
Hu Jiaqi has seen the huge crisis hidden beneath the singing and dancing. His research results serve as an important warning, and his views on the development of human society and the crisis facing human beings are of pioneering significance. Mr. Liu Tingzhao, a senior journalist, well-known publisher and the chief planner of Saving Humanity, once said that the book may change the course of human history. Mr. Hu Jiaqi has long persevered in the research and promotion of human problems, and his persistent spirit is valuable and rare.
Hu Jiaqi, the Famous Anthropologist Wrote to Human Leaders
Hu Jiaqi, the famous anthropologist, has been studying human problems for forty years. When the English version of Saving Humanity was published in North America, he wrote The 4th Open Letter to the Leaders of Mankind, appealing to human beings to be vigilant and united as soon as possible in order to limit the continued development of science and technology.
Hu Jiaqi mentioned in his letter that when the Chinese version of Saving Humanity was published in China in 2007, he had taken the opportunity to write to human leaders for the first time, calling for unified global action to control the development of science and technology. At that time, however, only a few people realized that science and technology could exterminate human beings.
In just over ten years since then, science and technology have made great progress, especially artificial intelligence, whose rapid development in recent years has made more scientists realize that if we do not take action, human beings will fall into the abyss of extinction.
Hu Jiaqi, as a member of mankind, is very worried about this. So, when Saving Humanity (English edition) was published in North America, he wrote to human leaders once again, calling on human beings to be vigilant and united as soon as possible.

In this letter, Hu Jiaqi outlined some conclusions drawn from the study of human problems over the past 40 years.
I. Science and technology will have the ability to exterminate human beings, and that day is not far off.
According to the current situation, the technologies most likely to exterminate humans include nanotechnology, bioengineering, artificial intelligence, and some technology yet to come.
II. Human beings cannot comprehensively and accurately judge the safety of science and technology.
Science and technology are uncertain. What we thought good has often proved harmful; even Newton and Einstein made great mistakes in scientific research. So it is impossible for human beings to judge the safety of science and technology comprehensively and accurately. This does not mean that no technology can be judged accurately, but there will always be some that cannot. As long as a single technology capable of exterminating human beings cannot be judged and screened out, it will bring the disaster of extinction upon us.

III. Human beings cannot universally and rationally make good use of scientific and technological achievements.
It is a basic fact that human beings cannot make good use of scientific and technological achievements universally and rationally. It goes without saying that the most advanced scientific and technological achievements are given priority for weapons used for killing. More importantly, it should be emphasized that at all times there are people who do extremely bad things.
Mr. Hu Jiaqi's research results concern the survival and happiness of all mankind. He warned at the end of the letter that numbness about development means numbness about the crisis: before huge waves arrive, the sea surface is often very calm, but the undercurrent surges along the seafloor. When every line of scientific and technological research is carried out, every achievement is accepted, and every product is used without question, the devastating disaster will not be far off. Hu Jiaqi appealed for united action by all human beings to strictly restrict the development of science and technology so as to avoid the extinction of human beings.
Hu Jiaqi: The 4th Open Letter to the Leaders of Mankind - When the English Version of Saving Humanity is Published in North America
To:
Leaders all around the world, Mr. Secretary-General, the world's top scientists and scholars, heads of globally well-known media,
Dear Leaders,
Since the Industrial Revolution in the mid-18th century, science and technology have been developing rapidly for more than 200 years and have now reached a considerable height, so many scientists have come to realize that some developing technologies will have the ability to destroy human beings.
I first wrote to the leaders of mankind to call for global action to control the development of science and technology when the Chinese version of my book Saving Humanity was published in China in 2007. At that time, few people realized that science and technology could destroy mankind. Science and technology have made great progress in the past ten years; in particular, the rapid development of artificial intelligence in recent years has made more scientists realize that human beings will fall into the abyss of extinction if we do not act without delay. As a human being, I am deeply worried about this. Now that the English version of my book Saving Humanity is being published in North America, I write to the respected leaders of mankind to call on all human beings to be vigilant and united in action as soon as possible, because there is not much time left for us.

I. Science and technology will have the ability to destroy human beings, and that day is not far ahead.
Nuclear weapons are unable to destroy human beings. A nuclear war can cause a nuclear winter and kill billions of people, but mankind has the opportunity to start over again because some people will surely survive. However, if human beings were extinct, it would be impossible to start over.
Currently, the technologies most likely to destroy humans are nanotechnology, bioengineering, artificial intelligence, and some technology yet to come.
Many scientists have warned of the following dangers:
Out-of-control nanotechnology may lead to the infinite replication of nano-robots, which would destroy mankind and the earth completely.
Biological weapons modified with the gene technology of bioengineering could create a super plague that destroys mankind.
The intelligent robots developed through AI technology will destroy mankind if the programs running them are out of control or their self-consciousness awakens.
Science and technology are still developing rapidly, and in the future there will be many scientific heights beyond our imagination. It took only a little more than 200 years for humans to raise science and technology from a very low level to one that makes us worry about the extinction of mankind as a whole. It will certainly not take another 200 years for science and technology to become able to destroy human beings; a couple of decades may be enough!
II. Human beings are not able to judge the safety of science and technology comprehensively and accurately.
Because of the uncertainty of science and technology, technologies that are often thought good may prove precisely harmful. For example, Freon, whose use destroys the ozone layer, and DDT were both considered good until they were later found to be very harmful. Even Newton and Einstein made great mistakes in scientific research, so it is impossible for human beings to judge the safety of science and technology comprehensively and accurately. Of course, this does not mean that we cannot make accurate judgments about any technology, but some are difficult to judge. As long as one technology that can destroy human beings cannot be judged and screened out, it will bring the disaster of extinction to human beings.

III. Mankind cannot make good use of scientific and technological achievements universally and rationally.
It is a basic fact that mankind cannot make good use of scientific and technological achievements universally and rationally. It goes without saying that the most advanced scientific and technological achievements are often first applied to weapons for killing. It should also be emphasized that at all times there are people who do extremely bad things.
When science and technology reach a certain height, an enterprise or an individual scientist who gets one step ahead of others may hold the power of life and death over all mankind. A high-level biologist may be able to independently develop super-lethal biological weapons in his own laboratory, and nano-robots or intelligent robots can get out of control through a single programmer. Compared with state behavior, individual behavior is very difficult to control. When science and technology develop to a level sufficient to destroy human beings, once control is lost to a person (or an enterprise or a country) who holds the power to use the technology, the end of mankind will come.
IV. In summary
Developing science and technology will destroy human beings in the near future, within 200 or 300 years or even within this century. I think the latter is more likely, because we are only one step away from the means of extinction.

What really worries me is that the whole world is at present in a state of numbness about development. Calls and warnings about the safety of science and technology are very weak. I have studied human problems for forty years and, needless to say, have been campaigning and appealing on this issue all along; even the warnings of some top scientists have not helped. People are more intoxicated with the various enjoyments that science and technology bring them.
Numbness about development is numbness about the crisis. The sea is often very calm before great waves come, but undercurrents surge along the sea floor. When any scientific and technological research is carried out as a matter of course, any achievement is affirmed as a matter of course, and any product is used as a matter of course, a devastating disaster may not be far ahead.
Every step we take determines our future and our final outcome. Though science and technology have already developed to such a high level, we are still moving forward. One step further and we will step on the mine, a land mine that destroys mankind. Once we set foot on it, we cannot go back. It will be our first time and our last.
Hu Jiaqi
Anthropologist, entrepreneur and member of the CPPCC Mentougou District Committee, Beijing
Apr. 2019
The article is reproduced from: http://www.fox34.com/story/40374844/hu-jiaqi-the-4th-open-letter-to-the-leaders-of-mankind-when-the-english-version-of-saving-humanity-is-published-in-north-america
Hu Jiaqi, the Famous Anthropologist, Warns of the Dangers of Using Artificial Intelligence for Military Purposes
The use of artificial intelligence for military purposes, and even on the battlefield, has long been a decision that many science and technology experts strongly oppose. A few days ago, experts at the annual meeting of the American Association for the Advancement of Science said that with the development of artificial intelligence, killer robots will be the "third revolution" to change the rules of war and will become the greatest threat to human survival, and that the international community should take measures to prevent this. More than 4,000 Google employees jointly signed a letter asking the company to quit the Pentagon's military artificial intelligence effort, Project Maven. The project applies deep learning, artificial intelligence analysis and computer vision technology to help the US Department of Defense extract and identify key targets from images and videos. The internal boycott prompted Google to decide to withdraw from the controversial military project when the contract expires in 2019. The famous anthropologist Hu Jiaqi has also mentioned the danger of robot warriors in his speeches.

Autonomous weapon systems are seen as a way to reduce the operating costs of weapon systems. By using human resources more efficiently, they can achieve higher speed, accuracy and endurance, but at the same time the technical, legal, economic, social and security issues they raise cannot be ignored. Once artificial intelligence is weaponized, AI-driven weapons can operate without human participation and intervention, relying only on embedded sensors, computer programs and algorithms. This is worrying, and it is a reality that cannot be ignored.
For now, robotic soldiers and security guards equipped with deadly weapons are still science fiction concepts. However, artificial intelligence is developing constantly, which means that robots may soon be able to select and attack targets without human input. Mr. Hu Jiaqi further analyzed: "The weapons used will be more and more advanced. They may start with guns, then cannon, then lasers and many things we don't know today. And their response will be extremely fast; we humans cannot be compared with them." "If the program of a robot warrior gets out of control, it will cause big trouble and kill people. But such a robot warrior is not yet a very powerful robot; if a robot can think the way humans do, it will be far more powerful."

As the famous anthropologist Hu Jiaqi has repeatedly warned: "Science and technology is a double-edged sword" and "Science and technology has the ability to exterminate human beings." For the sustainable survival of mankind, science and technology must be restricted!
Hu Jiaqi, the Famous Anthropologist: The Continuous Development of Artificial Intelligence Threatens Humanity
Many people think of science fiction as light literature, popular science, or even children's books. In fact, real science fiction is for adults: its main task is to think about problems that we usually do not care about in daily life. A great many science fiction works have reflected deeply on artificial intelligence, and almost every negative effect of artificial intelligence we can imagine today has already been depicted in science fiction. Will artificial intelligence really threaten human society as it does in science fiction? Hu Jiaqi, the famous anthropologist, has no doubt that it will.
In the film Lucy, the heroine finally turns into a super-powerful, god-like intelligence, which means she is beyond human control. The film Ex Machina is also quite representative: the female robot suddenly commits a crime, imprisons her creator and escapes on her own. As for the famous Terminator films, they have become the classic account of artificial intelligence rebelling against humanity, slaughtering human beings, and mankind's desperate resistance.
Will artificial intelligence break free of human constraints? This is our biggest concern about AI. In 1942, the science fiction writer Asimov first proposed the "Three Laws of Robotics" in the short story Runaround: first, a robot may not injure a human being or, through inaction, allow a human being to come to harm; second, a robot must obey human instructions, unless such instructions conflict with the first law; third, a robot may protect its own existence as long as this does not conflict with the first two laws. In the setting of Asimov's fiction, the three laws are implanted in the lowest layer of almost all robot software and cannot be ignored or modified. But obviously they are not physical laws, so real robots do not follow them, at least not for now.
The late scientist Stephen Hawking once expressed his concern about artificial intelligence: in his view, the full development of artificial intelligence could spell the doom of mankind. Hawking's view resonated with many people, including Tesla CEO Elon Musk and Bill Gates. Musk has compared developing artificial intelligence to "summoning the devil" and believes a superintelligence might end up keeping human beings the way we keep pets. Hu Jiaqi, the famous anthropologist, also emphasized the danger of artificial intelligence in his book Saving Humanity; he believes that intelligent robots have the ability to exterminate human beings.

Mr. Hu Jiaqi has repeatedly warned: "Science and technology have the ability to exterminate human beings, and that day is not far off." "Humans cannot fully and accurately judge the safety of science and technology." "Humans cannot make rational, good use of science and technology." To avoid the tragic endings that science fiction depicts for human beings, it is necessary to limit the development of science and technology and develop technology rationally.
Hu Jiaqi, The Famous Anthropologist Worries About The Hidden Risks Of Nano-robots
In recent years, nanorobot technology has advanced by leaps and bounds, with breakthroughs in many applications such as medicine, the military, and industry, especially in the medical field. Harbin Institute of Technology has developed a nanorobot with a diameter of only 500 nanometers; in the future it is expected to enter the eyeball and carry drugs to lesions, helping with minimally invasive treatment of eye diseases. The school's research team has made breakthroughs in medical robots, and the new nanorobots can travel through the bloodstream and target active cancer cells. Silicon Valley is reportedly developing a miraculous drug that one need only take once to never get sick again...

This all looks promising; however, a nano-robot injected into the human body can also be quite dangerous. Sometimes it may not cure the disease but instead destroy the body's immune system. Hu Jiaqi, the famous anthropologist, once said in a speech: "If you put it in the human body, it can kill cancer cells in a targeted manner and then move good cells to that place. It can also help us clean out blood vessels and break up kidney stones." At the same time, he also pointed out: "If it does not stop copying itself, it will turn our whole body into nanorobots, and it will never stop. The copying will turn all the creatures of the entire earth's biosphere into a lump of bread, and it can even turn the entire planet into nanorobots."
In addition to medicine, nano-robots are also widely used in other fields. In industry, they can be used to make micron-scale chips; in environmental protection, large numbers of nano-robots can be placed in polluted water sources to treat water pollution. It should be pointed out, however, that technology is a double-edged sword, and nano-robots are no exception, especially when they are used in the military field. Mr. Hu Jiaqi once wrote: "The original intention of humans in researching and developing a certain technology is to benefit themselves, but at the same time science and technology produce many negative effects. The biggest negative effect is that there will be many unexpected means of killing. This cannot be changed by people's will."
Hu Jiaqi, the famous anthropologist, has repeatedly warned: "Science and technology have the ability to exterminate human beings, and that day is not far off." "If we continue to demand greedily from science and technology, they will quickly lead us to the abyss of extinction." To prevent humanity's extinction, we must limit the development of science and technology and develop them rationally.
Hu Jiaqi, the Famous Anthropologist, Remarks on Gene Editing
The technologies mankind uses today may pose a real threat to the future. Even nature has no answers to some of the problems they raise; gene editing technology, for example, can create new types of organisms that never before existed in nature. Hu Jiaqi, the famous anthropologist, responded to the gene-edited babies incident: "As we uncover the secrets of human life and understand creatures that may be harmful to human beings more and more thoroughly, then if someone wants to attack a certain group of human beings, or all human beings, the targets will become more and more precise, the means more and more effective, and the destructive power greater and greater."
Hu Jiaqi, the famous anthropologist, elaborated on the development of genetic technology in Hu Jiaqi Talk Show: How Is the Technology Devil Released - Starting with a Genetically Edited Baby, warning that "science and technology is a double-edged sword; it can both benefit and destroy human beings." "The real development of genetic technology began in the late 1970s and early 1980s, and in only three to four decades it has reached such a great height. What will happen to us in another three or four decades? What will happen in seven or eight decades?" "Genetic technology can kill humans, and it would not take long to do so."
The British cosmologist Martin Rees pointed out: "If you use gene editing technology on viruses, it may produce unexpected results." Hawking predicted that humans will find ways to improve intelligence by modifying genes within this century: "Laws may prohibit the genetic modification of humans, but some people will not be able to resist the temptation to improve human characteristics such as memory, disease resistance and longevity." The famous anthropologist also said, "Good legal systems and moral values can constrain a hundred people, a thousand people, or even ten thousand people, but they cannot guarantee that everyone will act within the bounds of law, morality and ethics."
"In most cases, the original intention of humans in researching and developing a certain technology is to benefit themselves, but at the same time science and technology bring many negative effects. The biggest negative effect is that there will be many unexpected means of killing. These cannot be changed by people's will." As Hu Jiaqi, the famous anthropologist, has repeatedly emphasized, science and technology is a double-edged sword; we must weigh its power to destroy human beings even as we enjoy the benefits it brings to humanity. For the sustainable survival of mankind, science and technology must be restricted.
Hu Jiaqi, the Famous Anthropologist, Warns of the Hidden Risks of Nano-robots
In recent years, nanorobot technology has advanced by leaps and bounds, with breakthroughs in many applications such as medicine, the military, and industry, especially in the medical field. Harbin Institute of Technology has developed a nanorobot with a diameter of only 500 nanometers; in the future it is expected to enter the eyeball and carry drugs to lesions, helping with minimally invasive treatment of eye diseases. The school's research team has made breakthroughs in medical robots, and the new nanorobots can travel through the bloodstream and target active cancer cells. Silicon Valley is reportedly developing a miraculous drug that one need only take once to never get sick again... Hu Jiaqi, the famous anthropologist, has raised his concern: "Nano-robots developed with nanotechnology can not only completely destroy us human beings, but can even completely destroy our planet if their program gets out of control."

A nano-robot injected into the human body can also be quite dangerous: sometimes it may not cure the disease but instead destroy the body's immune system. What happens if the nanobots never stop copying themselves? Hu Jiaqi, the famous anthropologist, once pointed out, "If the nano-robot does not stop copying, it will turn the whole body into nano-robots. If it still does not stop, it will turn all the creatures of the entire earth into a lump of bread, and it can even turn the earth itself into nano-robots."
Hu Jiaqi's concern is not unreasonable. Runaway nanorobots would engulf carbon-based material to fuel their self-replication, and unfortunately life on earth is carbon-based. By one calculation, about 130 rounds of self-replication would be enough for nanorobots to swallow all the life on earth, and scientists estimate that one round of copying takes only about 100 seconds, which means a simple mistake could destroy all life on Earth in a matter of hours.
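To see how the "matter of hours" figure follows from the numbers above, here is a minimal back-of-the-envelope sketch in Python. It assumes the classic grey-goo scenario in which each replication round doubles the nanorobot population and each round takes the roughly 100 seconds cited above; the doubling assumption and the variable names are illustrative only, not data from Hu Jiaqi's book.

```python
# Back-of-the-envelope check of the "130 replications in a few hours" claim.
# Assumptions (illustrative only): the nanorobot population doubles every
# replication round, and one round takes about 100 seconds.

ROUND_SECONDS = 100   # assumed duration of one self-replication round
ROUNDS = 130          # rounds cited as enough to consume Earth's biomass

population = 2 ** ROUNDS               # nanorobots after 130 doublings
total_seconds = ROUNDS * ROUND_SECONDS
total_hours = total_seconds / 3600

print(f"Population after {ROUNDS} doublings: {population:.2e}")
print(f"Elapsed time: {total_seconds} s, about {total_hours:.1f} hours")
```

Under these assumptions the 130 rounds take about 13,000 seconds, roughly three and a half hours, which is where the "matter of hours" figure comes from.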

Hu Jiaqi, the famous anthropologist, has repeatedly warned: "Science and technology have the ability to exterminate human beings, and that day is not far away from us." We must limit science and technology! "For those sciences and technologies that are dangerous, we must limit them, and limit them strictly."
Hu Jiaqi, the Famous Anthropologist Warns that Artificial Intelligence will Transcend Humanity
In the Internet era, data has become the ruler of the times. The emergence of the Internet has made it easy to accumulate and analyze massive amounts of data, and human understanding has improved to an unprecedented degree. In this process, the most significant technological innovation has emerged in the field of artificial intelligence. Artificial intelligence can think innovatively and solve complex, seemingly abstract problems through an information-processing ability similar to that of the human brain. Hu Jiaqi, the famous anthropologist, wrote in his book Saving Humanity: "Artificial intelligence is a branch of computer science devoted to researching intelligent machines with intelligence similar to that of humans. Such machines can think like human beings and deal with problems like human beings. The future development of artificial intelligence is bound to surpass human beings."

Artificial intelligence is evolving rapidly in directions both predictable and unpredictable. Musk once predicted: "Once artificial intelligence reaches a critical value equal to the intelligence level of the most intelligent and creative human beings, it will exceed the sum of human intelligence in a very short period of time." Zuckerberg also predicted that artificial intelligence would surpass human beings in core perceptual abilities such as listening, speaking, reading and writing within five to ten years. Not long ago, a Facebook lab reported that two artificial intelligence bots had communicated with each other in a language that humans could not understand. Almost without our noticing, artificial intelligence is reading our vast quantities of data and evolving iteratively day and night, and the brains that humans evolved over millions of years are being easily defeated by artificial intelligence in one field after another.
Stephen Hawking warned that human efforts to create intelligent robots threaten our own survival: "When we give computers instinct, they will not only think about what we are thinking but also think ahead of us, and of course they will have gained control." Hu Jiaqi, the famous anthropologist, also said: "Some companies and governments are already considering 'coexistence rules' between humans and intelligent robots, but rules have always been formulated by the strong. When the self-awareness of intelligent robots awakens, all the rules laid down by mankind will be subverted."

Hu Jiaqi, the famous anthropologist, once warned: "When machines surpass human intelligence, the gap between humans and intelligent robots will not be the gap between a fool and a genius, nor that between a pig or a dog and a genius, nor even that between a clumsy, lowly snail and a genius, but the gap between a rock weathered for a thousand years and a genius." This is the metaphor of Hugo de Garis, whom he calls the father of artificial intelligence. Take deep-learning robots as an example: their learning covers not only the natural sciences but also social sciences such as behavioral science, psychology and ethics. As their learning deepens, the self-awareness of intelligent robots will awaken sooner or later. When intelligent robots form self-awareness and look at human beings, we humans will appear a species that could not be more stupid, and they will treat us the way we treat lower species.
While we enjoy the convenience of science and technology, we should also think about the potential crisis. Since human beings cannot fully and accurately judge the safety of science and technology, we must be alert to its development and limit it.

Hu Jiaqi, the Famous Anthropologist, Warns of the Crisis in Developing Artificial Intelligence
In the Internet era, data has become the ruler of the times. The emergence of the Internet has made it easy to accumulate and analyze massive amounts of data, and human understanding has improved to an unprecedented degree. In this process, the most significant technological innovation has emerged in the field of artificial intelligence. Artificial intelligence can think innovatively and solve complex, seemingly abstract problems through an information-processing ability similar to that of the human brain. Hu Jiaqi, the famous anthropologist, wrote in his book Saving Humanity: "Artificial intelligence is a branch of computer science devoted to researching intelligent machines with intelligence similar to that of humans. Such machines can think like human beings and deal with problems like human beings. The future development of artificial intelligence is bound to surpass human beings."

Artificial intelligence is evolving rapidly in directions both predictable and unpredictable. Musk once predicted: "Once artificial intelligence reaches a critical value equal to the intelligence level of the most intelligent and creative human beings, it will exceed the sum of human intelligence in a very short period of time." Zuckerberg also predicted that artificial intelligence would surpass human beings in core perceptual abilities such as listening, speaking, reading and writing within five to ten years. Not long ago, a Facebook lab reported that two artificial intelligence bots had communicated with each other in a language that humans could not understand. Almost without our noticing, artificial intelligence is reading our vast quantities of data and evolving iteratively day and night, and the brains that humans evolved over millions of years are being easily defeated by artificial intelligence in one field after another.
Once humans are completely surpassed by artificial intelligence, it may not be grateful to its creators; how could a cold, omnipotent being have special feelings for a group of creatures who simply don't want to work and stay in bed when the weather is bad? As Hu Jiaqi, the famous anthropologist, puts it: "The day the self-awareness of intelligent robots awakens is the day all the rules formulated by mankind are subverted." "In nature, it has always been the case that high-intelligence species look down on low-intelligence species, even cooking and boiling them and making them into food or playthings. When the IQ of intelligent robots surpasses that of humans, humans are doomed."

The real risk of artificial intelligence lies in its capability. An artificial intelligence of extraordinary intelligence will be very good at achieving its goals, and if it develops a will that conflicts with ours, we will be in trouble. Therefore, while enjoying the benefits brought by the development of science and technology, we also need to be alert to potential crises. "If we continue to demand greedily from science and technology, we will quickly move toward the abyss of extinction." Human beings must restrict science and technology and develop them rationally.
Hu Jiaqi, the Famous Anthropologist is Worried about the Development of Science and Technology
Hu Jiaqi, the famous anthropologist, warned: "Science and technology have brought us many troubles along with material enjoyment. Some scientific and technological achievements even seriously endanger the survival and happiness of human beings, and this threat to human values is so direct that people have become confused about the development of science and technology and have begun to rethink it."
Hu Jiaqi, the famous anthropologist, said in an interview: "The further development of science and technology will lead to the extermination of human beings." "Many scientists worry that bioengineering may destroy humans." "Some worry that nanotechnology will kill humans." "At present, more people worry that artificial intelligence will kill human beings: if the programs of intelligent robots developed with artificial intelligence get out of control, or their self-awareness awakens, they may destroy humans." Nick Bostrom likewise believes that the most serious existential risk will come from some future human technology, which may emerge by the end of this century; for example, machine intelligence or advanced molecular nanotechnology may drive the rapid development of certain weapon systems, and the development of synthetic biology may also bring dangers.

Nick Bostrom, the philosopher who directs the Future of Humanity Institute at the University of Oxford, warned in a recent paper that new technologies may provide solutions to some of the world's most pressing problems, but that, if we are not careful, they may also lead to humanity's self-destruction.
"In most cases, the original intention of humans in researching and developing a certain technology is to benefit themselves, but at the same time science and technology bring many negative effects. The biggest negative effect is that there will be many unexpected means of killing. These cannot be changed by people's will." As Hu Jiaqi, the famous anthropologist, has repeatedly emphasized, science and technology is a double-edged sword; we must weigh its power to destroy human beings even as we enjoy the benefits it brings to humanity. For the sustainable survival of mankind, science and technology must be restricted.

Hu Jiaqi, the Famous Anthropologist is Worried about the Development of Artificial Intelligence
Artificial intelligence is one of the hottest, most difficult to understand and most controversial technologies in the world. We cannot see it or touch it, and often we do not even realize we are using it: a home thermostat sets the right temperature, or a phone automatically corrects the letters we type. Hu Jiaqi, the famous anthropologist, once mentioned in a speech, "Artificial intelligence has penetrated into all aspects of our lives."

Will artificial intelligence break free of human constraints? This is our biggest concern about AI. In 1942, the science fiction writer Asimov first proposed the "Three Laws of Robotics" in the short story Runaround: first, a robot may not injure a human being or, through inaction, allow a human being to come to harm; second, a robot must obey human instructions, unless such instructions conflict with the first law; third, a robot may protect its own existence as long as this does not conflict with the first two laws. In the setting of Asimov's fiction, the three laws are implanted in the lowest layer of almost all robot software and cannot be ignored or modified. But obviously they are not physical laws, so real robots do not follow them, at least not for now.
The late scientist Stephen Hawking once expressed his concern about artificial intelligence: in his view, the full development of artificial intelligence could spell the doom of mankind. Hawking's view resonated with many people, including Tesla CEO Elon Musk and Bill Gates. Musk has compared developing artificial intelligence to "summoning the devil" and believes a superintelligence might end up keeping human beings the way we keep pets. Hu Jiaqi, the famous anthropologist, also emphasized the danger of artificial intelligence in his book Saving Humanity; he believes that intelligent robots have the ability to exterminate human beings.

Once humans are completely surpassed by artificial intelligence, it may not be grateful to its creators; how could a cold, omnipotent being have special feelings for a group of creatures who simply don't want to work and stay in bed when the weather is bad? As Hu Jiaqi, the famous anthropologist, puts it: "The day the self-awareness of intelligent robots awakens is the day all the rules formulated by mankind are subverted." "In nature, it has always been the case that high-intelligence species look down on low-intelligence species, even cooking and boiling them and making them into food or playthings. When the IQ of intelligent robots surpasses that of humans, humans are doomed."
Mr. Hu Jiaqi has repeatedly warned: "Science and technology is a double-edged sword." "Science and technology has the ability to exterminate human beings, and that day is not far off." "Once Pandora's box is opened, the devils run out and cannot be put back." It is time to close Pandora's box: human beings must limit science and technology and develop technology rationally.