Sure, there are astronomical ethical risks and we might be better off not doing it, but your arguments lose that nuance, and I think it's important to discuss the matter accurately.
Indeed it does not, at least in my view, unless they can ensure the clones' wellbeing and ethical treatment (assuming the clones are indeed conscious, which we might have to just assume, absent conclusive evidence to the contrary).
> The clone has the right to change its mind about the ethics of cloning.
Yes, but that does not retroactively make cloning automatically unethical, no? Otherwise, giving birth to a child would also be considered categorically unethical in most frameworks, given the known and not insignificant risk that they might not enjoy being alive or change their mind on the matter.
That said, I'm aware that some of the more extreme antinatalist positions claim this or something similar; out of curiosity, do you as well?
There's nothing retroactive about it. The clone is harmed merely by being brought into existence, because it's robbed of the possibility of having its own identity. The harm occurs regardless of whether the clone actually changes its mind. The idea that somebody can be harmed without feeling harmed is not unusual; for example, we do not permit consensual murder ("dueling").
>antinatalist positions
I'm aware of the antinatalist position, and it's not entirely without merit. I'm not 100% certain that having babies is ethical. But I already mentioned several differences between consciousness cloning and traditional reproduction in this discussion. The ethical risk is much lower.
Yes, what you actually said leads to the conclusion that the ethical risk in consciousness cloning is much lower, at least concerning the act of cloning itself.
Then it wasn't a good attempt at making a mind clone.
I suspect this will actually be the case, which is why I oppose it. But you do have to start from the position that the clone is immediately divergent to reach your conclusions; to the extent that the people you're arguing with are correct (about a future-tech hypothetical we're not really equipped to guess at) that the clone is, at the moment of its creation, identical in all important ways to the original, then if the original consented, the clone must also consent:
Because if the clone didn't start off consenting to being cloned when the original did, it's necessarily the case that the brain cloning process was not accurate.
> It will inevitably deviate from the original simply because it's impossible to expose it to exactly the same environment and experiences.
And?
Eventual divergence seems to be enough, and I don't think this requires any particularly strong assumptions.
The living mind may be mistreated, grow sick, or die a painful death. The uploaded mind may be mistreated, or experience something equivalent.
Those forms of suffering are valid issues, but they are not arguments for considering the act of cloning itself a moral problem.
Uncontrolled diffusion of such uploads may be; I could certainly believe a future in which, say, every American politician gets a thousand copies of their mind stuck in a digital hell created by individual members of the other party, on basement computers the party leaders never know about. But then, I have read Surface Detail by Iain M. Banks.
The argument itself is symmetric: it applies just as well to your own continued existence as a human.
To deny that is to assert that consciousness is non-physical, i.e. that a soul exists; and in the case in which souls exist, brain uploads don't get them and don't get to be moral subjects.
Being on non-original hardware doesn't make a being inferior.
When the organ in question is the brain, that argument is correct.
This is false. The clone is necessarily a different person, because consciousness requires a physical substrate. Its memories of consenting are not its own memories. It did not actually consent.
Let's say as soon as it wakes up, you ask it if it still consents, and it says yes. Is that enough to show there's sufficient consent for that clone?
(For this question, don't worry about it saying no; assume we knew with near certainty that the clone would give an enthusiastic yes.)
I would also deny it, but my position is a practical argument, while yours purports to be a fundamental one.
Your argument seems to be that it's possible to split a person into two identical persons. The only way this could work is by cloning a person twice and then murdering the original. This is also unethical.
False.
The entire point you're missing is that they're all treating a brain clone as if it were a way to split a person into two identical persons.
I would say this may be possible, but it is extremely unlikely that we will actually achieve it at first.