@sonhouse said
@humy
AI singularity is the time when AI exceeds human intelligence? That is to say, the most intelligent of any human, like Witten, Hawking, Einstein, Newton, and the like?
That is just part of what I mean by AI singularity. What I personally also mean by AI singularity is that it then leads to what is called a "technological singularity", specifically where the AIs not only make ever greater improvements to themselves but to all technology, not just AI, leading to technology rapidly becoming ever more advanced, many times more advanced than the technology we have now;
https://en.wikipedia.org/wiki/Technological_singularity
There is a ridiculously common amount of irrational layperson paranoia about both AI and the technological singularity, NOT supported by the facts or evidence or sound logic and NOT shared by most AI experts. I now groan in despair every time I hear the usual layperson fear of AIs taking over the world; I hear that very often.
The fact is we are NOWHERE NEAR creating an AI with a general intelligence like that of a human. Even when we do, it should be a trivial task to instruct it specifically NOT to harm humans and NOT to take over the world etc., and, given it will have no emotions and thus no personal ambitions or desires to do the contrary, it won't ever break its own programming even if it somehow magically could choose to do so.
Here is an article by professor Toby Walsh, an AI expert, giving his expert opinion on this, with which I am in general agreement;
https://www.wired.co.uk/article/elon-musk-artificial-intelligence-scaremongering
"...the problems today are not caused by super smart AI, but stupid AI. We’re letting algorithms make decisions that impact on society. And these algorithms are not very smart. Joshua Brown discovered this to his cost last year when he became the first person killed by his autonomous car. In fact, a smarter car might have seen the truck turning across the road and saved his life. ..."
...
Now, the first thing you need to know about the singularity is that it is an idea mostly believed by people not working in artificial intelligence.
...
Most people working in AI like myself have a healthy skepticism for the idea of the singularity. We know how hard it is to get even a little intelligence into a machine, let alone enough to achieve recursive self-improvement.
...
A recent survey of 50 Nobel Laureates ranked the climate, population rise, nuclear war, disease, selfishness, ignorance, terrorism, fundamentalism, and Trump as bigger threats to humanity than AI. ..."