Technological positivists confidently predict it can. For an answer, we have to fast-forward to the year 3010, when a lecture from a Futurist puts an end to such speculation, as described in this extract from a remarkable soon-to-be-published novel.
Picture a small group of 300-year-old elders served by a eugenically adapted underclass called Funcs and sophisticated humanoid robots called Hubots. All controlled by 'brain in a box' cyborgs called Neuros with their central nervous systems mounted in mechanical bodies.
Now read on:
'Without going into formulas,' Katz, the Futurist, said, 'complexity is a function of time or duration. Note that time creates complexity. Therefore, a timeless state is necessary to avoid it. Consider this thought experiment...'
The screen changed to:
AI - TIME = INERTIA
ORGANIC INTELLIGENCE + TIME = FAILURE
'Artificial intelligence minus duration equals inertia. The machine has power but does nothing. Obvious enough. However, to understand the second proposition, we need to examine the diff between organics and intelligent machines. Can someone mention a diff? And, by the way, it has nothing to do with conduct or ordinary knowledge.'
A white-haired fem elder in the front row said, 'Relative consciousness.'
The Futurist nodded to her. 'Thank you, Erva. Despite our efforts, it's still impossible to program consciousness. It's easier to genetically adapt organics than adapt machines—to build pain into bots, sensory cognition, responsiveness to unexpected situations, emotions, risk, social interactions... You can't turn responses like surprise or anguish into numbers. How do you code an algorithm for dopamine? Computers have no experience—just run software. But let's assume, for now, that machine learning can eventually emulate a form of consciousness.' He waved a hand at Jason and Pitho. 'In fact, we have two augmented H10s in the audience that go some way toward this goal. So accepting this as a future possibility, we'll provisionally ignore this factor. What else?'
Pitho said, 'Entropy.'
'Exactly. For those unfamiliar with the term, in isolated systems, heat flows spontaneously from hot to cold. Things are caught in a process of natural disorganisation. For instance, broken glass doesn't revert to an unbroken state. The end state is scientifically termed thermodynamic equilibrium. Machines, likewise, are subject to entropy. Only living matter can be said to have "negative entropy". In other words, only organisms can persist, for a while at least, in the face of dissolution by absorbing nutrients, combating disease and reproducing themselves. Note also that organics are the origin of machines. So, if there are not enough organics, machines will regress.'
A Func from Combat in the fourth row spoke up. 'Not if we let bots pass the Singularity.'
'Ah! And are we all sure what that is?' He looked directly at Mark5.
Mark5 said, 'Something you can't undo?'
'For the benefit of our young friend here, Singularity spans two aspects—the Palladium Council's embargo against machine reproduction, and the possibility that AI can outstrip OI, or Organic Intelligence.'
The greasy-haired Hosie2 raised an arm. 'Hey man. You say, man, that, even if we let them build themselves, they'll tank...?'
'Good question. Let's examine it. Firstly, the Precepts forbid machine autonomy. It doesn't Conform. But let's say, for the sake of argument, that we permit bots to reproduce themselves. It still goes nowhere. Because they'll soon have maint issues.'
'Why?' the Combat Func asked.
'Firstly, for a machine to self-replicate it must extract the materials necessary and process them, estab a supply chain, set up and maintain assembly plants, secure a 100 per cent reliable energy source, estab a fail-safe temp-controlled environment, and much else besides. Then there'll be hardware failures. And software inevitably becomes corrupted because complex inert systems eventually degrade. Even if bots self-reproduced, they couldn't survive. Because logical inorganic intelligence inevitably develops defects.'
'Not if the coding's perfect...'
'It can't be.'
Jason raised a hand and said in his finely modulated voice, 'You're saying that organics avoid entropy but their intelligence is flawed. And that machines have perfect logic but eventually have to degrade. So if the Precepts rectify human flaws then machines could be rectified as well.'
'But do they?' The Futurist gesture-controlled the screen. The Precepts appeared superimposed on their familiar waving-flag background:
CONFORMITY UTILITY RIGOUR
'These famous Precepts are what?'
Mark5 decided to pitch in. 'The way to avoid human flaws.'
Katz said, 'Thank you, young man, for that patriotic sentiment. I'm sure years of tutoring by the admirable Pitho have schooled you in Safespeak. But you can speak openly here.'
'Not Safespeak. I agree with the Precepts.'
'All right. But don't do it blindly. The Precepts are a brave attempt to avoid the worst organic flaws. But as organics are intrinsically flawed, organic-initiated rectification is impossible. So it follows that the Precepts are flawed, because they were formulated by organic intelligence. For instance, the first precept is Conformity. However, human non-conformity originally produced machine intelligence! Therefore human non-conformity has Utility—a direct contradiction of Conformity. So the Precepts fail. Do you see the implication?'
Jason raised his hand. 'That removing human flaws would destroy the progressive refinement of machines?'
'Exactly! Exactly!' Katz stared triumphantly around the room. 'You heard it here—from the mouth of an AI!'
Hosie2 looked pleased. He'd just been told he was indispensable. 'So hey-de-hey! OI is here to stay.'
'Is it? We're about to think that through.'
The Neuro boomed, 'You neglect the third possibility.'
Katz raised an eyebrow.
'Combining organics and machines.'
'Thank you for that insight, General. But, with respect, is that possible or even advisable? Let's examine it.'
'Am I here or not?' the Neuro roared.
A nervous blink from the Futurist. 'Once again, there's a third possibility. Internal this time.'
'Which is what?'
The delicate cough. 'Perhaps Erva would care to respond?' He made encouraging gestures to the white-haired elder at the front.
She reluctantly stood and Katz ushered her behind the lectern, announcing, 'For those who haven't yet met her, Erva is from Culture Control and is Curator of Historical Archives.'
She took his place, frowning.
Like all elders in the room, she looked about eighty. Good-natured, intelligent face. Thin body, slightly stooped, with shoulders held defensively forward. The impression of smallish sunken breasts. He felt an immediate affinity with her.
She said in a small voice, 'I'm not sure this is a good idea.'
'Remem you have immunity,' Katz said from the side. 'Any idea raised in this room is quarantined from censure.'
'Except we're creatures with mems and resentments.'
'Get on with it,' the Neuro blared.
She stared at no one in particular as if staring into herself. Her shoulders relaxed and she stood straighter than before. 'We have a body, a mind and feelings. The mind is used for comparisons. The feelings for evaluation. And the body for the sense of self. The three together constitute the transitional form we call a human. With the three working in unison, we can assess our environment effectively. As it says in an ancient text, "When two or three are together, there am I also." But, sadly, this is seldom the case. The mind randomly associates. The feeling part is generally occupied with the most trivial or basest emotions and makes the body tense. In other words, we consist of reactions. As individuals, we don't exist.'
'Ratshit,' Sarg called from the back.
She went on as if he had never spoken. 'So we live in violence, self-assertion and fear. Everyone competing with the other. All wanting to be first.'
'Separatist ratshit,' Sarg said again.
She ignored him.
The Neuro raised a long mechanical arm for silence. 'What you say, according to my mind-chip, sounds like a pastiche of Hume and Schopenhauer. I'm still waiting to hear about this famous third possibility.'
She looked at it directly. 'You asked if you're here or not.'
'My answer would be, partly.'
'You're here, but diminished by two thirds. You've sacrificed your body and feelings. You are mind only.'
'I have a body,' the thing spat in its uninflected monotone.
'You have a mechanical contrivance. But not the understanding that only physical sensation can bring.'
'Wrong. I have haptics and more feedback channels than you can count.'
'It's not the same. Not physical. Just electro/neuro connections.'
'She's right,' Sarg crowed from the back of the hall. 'You're two thirds dead, you stupid arse.'
The Neuro turned like a turret from the waist with a distinctive servo whine to face him. 'I don't listen to jeers from sexual psychopaths.'
The Futurist thanked the fem called Erva who gratefully sat down. 'An excellent exchange and I thank you all for your comments. But now it's time to focus again because there's a further premise to consider...'
He resumed his place in front. The screen now showed:
AI + SINGULARITY - OI = FAILURE
'Which is what we've just discussed. So where does that leave us? Anyone?'
No one spoke.
'Very well. Now look at this.'
The screen changed to:
AI - SINGULARITY + OI = ?
'Can anyone introduce the final term?'
Jason said, 'This represents what we have now.'
'Correct. So does it fail or not?'
Again, no response. The audience looked confused.
A quizzical look from the Futurist. 'Interesting that after such an enthusiastic exchange we have silence again. Professor Pitho? Can you contribute anything here?'
His dear Conditioner leaned a little forward. 'Perhaps.'
'I would say,' she said slowly, 'that it fails.'
'Based on… Can we have your logic…?'
'One: Historical evidence. All civilisations have failed. Why? Because civilisations mirror organisms—not in scale but in trajectories. The analogy is almost complete. Two: Failure in the system is end... end... endemic because both OI and, consequently, AI are flawed. Therefore, Singularity or not, entropy eventually succeeds.'
The Neuro rose from its chair. It was as tall as Jason and physically bizarre with a prominent delta sign inscribed on its torso. It had spindly legs and powerful overlong arms that almost touched the floor. 'I wish to contribute something.'
The Futurist moved aside and the knuckle-dragging contraption lurched behind the lectern.
'I agree the Precepts are flawed,' it said in its foghorn voice, 'because organics are flawed. And organics are flawed because the attributes our Curator of Historical Archives so prizes have the opposite effect to the one she mentioned. They actually distort organic mentation. The isolated intellect is not the trouble. It's the connection with physical drives and emotions. With those removed, the mind becomes impartial and the Precepts redundant. Everyone then Conforms.'
Shock and dismay pulsed through the hall.
'Capital offence,' the Neuro roared.
'Not here,' Sarg chuckled. 'Immune area.'
'You... dare... laugh at... me?'
'Hate it, don't you? Sententious klutz!'
The Neuro slammed a steel fist down on the lectern, which barely had time to beep a warning before it splintered and collapsed.
'Impartial? What a joke!' Sarg cackled again. 'So why are you breaking up furniture, you pretentious git? I'll tell you why. Because you're so full of yourself, you can't handle anything that upsets your colossal ego.'
The top-heavy bot swayed on its stork-like legs. 'You'll regret this.'
'You can't touch me, grille-face.'
The Neuro sat again so violently that the chair gave a warning beep as well.
The Futurist stepped around the debris. 'May I say how delighted I am to have such a spirited exchange? Congratulations to all. Now, can we please refocus?'
The Neuro dropped his voice-register to a sepulchral tone. 'I'm still waiting to hear about this famous third possibility.'
The Futurist stared around the room. 'Can anyone progress this?'
Jason raised an arm again.
'Trying to make human conduct more like machines fails. Trying to make machines more like organics fails. Trying to combine organics and machines...'
Katz waved his hands enthusiastically. 'Now we're getting there.'
'...also fails because both OI and AI contain flaws.'
'Enough!' the Neuro thundered.
Jason continued undeterred. 'In fact, if there are too few organics to modify the systems, system stability fails because there'll be no way to maintain what we have.'
'Thank you for that summary. So where does that leave us?'
Sarg guffawed again. 'Leaves that prickless brainbox in deep shit.'
The Neuro swivelled again to face him and extended an accusing arm. It had to lean backward or the arm's weight would have toppled it forward. 'Your turn will come. Your turn will come. And soon.'
Katz said calmly, 'I'm asking for a conclusion. Anyone?'
No one commented.
Finally Pitho spoke again. 'That what we have now is best practice. The best we can do.'
The Futurist nodded. 'Which brings us back to... '
The words on the screen reverted to the first premise:
'...everything fails. And we are left with uncertainties.'