A ‘plug in’ of extra memory by direct input into the brain

Some scientists fear that computers may develop ‘minds of their own’ and pursue goals hostile to humanity. Would a powerful futuristic AI remain docile, or ‘go rogue’? Would it understand human goals and motives and align with them? Would it learn enough ethics and common sense to ‘know’ when these should override its other motives? If it could infiltrate the internet of things, it could manipulate the rest of the world; its goals might then run contrary to human wishes, or it might even treat humans as encumbrances. An AI must have a goal, but what is really difficult to instil is common sense: an AI should not pursue its goal obsessively, and should be prepared to desist from its efforts rather than violate ethical norms.

Computers will vastly enhance mathematical skills, and perhaps even creativity. Already our smartphones substitute for routine memory storage and give near-instant access to the world’s information, and soon translation between languages will be routine. A further step, perhaps by 2050, could be to ‘plug in’ extra memory or acquire language skills by direct input into the brain, though the feasibility of this isn’t clear. If we can augment our brains with electronic implants, we might be able to download our thoughts and memories into a machine. If present technical trends proceed unimpeded, some people now living could attain immortality, at least in the limited sense that their downloaded thoughts and memories could have a life span unconstrained by their present bodies. Those who seek this kind of eternal life will, in old-style spiritualist parlance, ‘go over to the other side’. You can find more predictions in my book "On the Future: Prospects for Humanity", from which this text is taken.
