AB,
The circular reasoning referred to in the homunculus link is entirely derived from the need for continuous "cause and effect" events as seen in observed material behaviour.
Sort of – though what it actually does is explain that you cannot solve a (supposed) problem by merely relocating it. If for some reason you think the naturalistic paradigm for decision-making is impossible, then just asserting something else to do the decision-making instead answers nothing.
What I continue to point out is the impossibility for all our perceived conscious control to be entirely derived from material reactions over which we can have no conscious control.
No, you don’t “point that out” at all – you declare it and assert it, but no matter how many times you’ve been asked you’ll never, ever justify that claim with an argument.
Do you honestly believe that my ability to deduce the formula for the length of a spiral could have been formed by subconscious brain activity before it popped into my conscious awareness?
I see no reason not to, but what I believe about this is neither here nor there.
You’re the one who tells us you’re not convinced by it, and you’re the one who jumps straight from “I’m not convinced” to “therefore it’s impossible”, so it’s your burden of proof to find an argument to take you from the former to the latter.
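(As an aside, and purely for concreteness: assuming the spiral AB means is an Archimedean spiral, $r = a\theta$, which is my guess rather than anything he has stated, the formula he would have had to deduce is the standard arc-length integral

$$L = \int_0^{\Theta} \sqrt{r^2 + \left(\tfrac{dr}{d\theta}\right)^2}\,d\theta = a\int_0^{\Theta} \sqrt{1+\theta^2}\,d\theta = \frac{a}{2}\left(\Theta\sqrt{1+\Theta^2} + \ln\!\left(\Theta + \sqrt{1+\Theta^2}\right)\right).$$

None of which, of course, tells us anything about whether the deducing was done by conscious or subconscious processes.)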
This involves far more than the bland assumption of personal incredulity.
But as personal incredulity is all you’ve ever managed to produce here (“I’m not convinced”), why should anyone take that claim seriously? What is this supposed “far more” of which you speak? In the unlikely event that you want to try at least to answer that, try to remember too that as soon as you frame your reply as a question about what other people think, you’ve negated the effort.
Just as God is the ultimate source of all that exists,…
Fallacy of reification. What “God” would that be that you’ve just inserted into your claim?
… I am the ultimate source of my consciously driven directives. No need for endless chains of "invisible men" or turtles.
What do you mean by “I” here – the automaton Alan Burns you think there to be? The invisible little homunculus at the controls you think does your decision-making for you and then pulls some magic levers to make “you” do as you’re told? What?
And, yet again, if you do still want to cling to your little-man-at-the-controls notion, how would you then propose to avoid the homunculus fallacy?