This doesn't match my experience at all. I present to you part of a formal language over an AST, with no cover functions in sight:
p⍪←i ⋄ t k n pos end(⊣⍪I)←⊂i ⍝ node insertion
i←i[⍋p[i←⍸(t[p]=Z)∧p≠⍳≢p]] ⍝ select sibling groups
msk←~t[p]∊F G T ⋄ rz←p I@{msk[⍵]}⍣≡⍳≢p ⍝ associate lexical boundaries
(n∊-sym⍳,¨'⎕⍞')∧(≠p)<{⍵∨⍵[p]}⍣≡(t∊E B) ⍝ find expressions tainted by user input
These are all cribbed from the Co-dfns[0] compiler and related musings. The key insight here is that what would be API functions or DSL words are just APL expressions on carefully designed data. To pull this off, all the design work that would go into creating an API goes into designing said data to make such expressions possible. In fact, when you see the above in real code, they are all variations on a theme, tailored to the specific needs of the immediate sub-problem. With library functions, such needs tend to accrete extra functions and function parameters onto our library methods over time, making them harder to understand and visually noisier in the code.
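To make the data design concrete, here is a toy sketch of the kind of representation those expressions assume: the AST lives in a few parallel vectors, one parent pointer and one type code per node. The names and codes below are illustrative, not Co-dfns' actual ones.
⎕IO←0 ⋄ Z E V←0 1 2 ⍝ index origin 0; hypothetical node-type codes
p←0 0 1 1 0 4 4 ⍝ parent vector: node 0 is the root (its own parent)
t←Z E V V E V V ⍝ one type code per node, aligned with p
⍸(t[p]=Z)∧p≠⍳≢p ⍝ "top-level nodes": parent is root-typed, node isn't self-parented → 1 4
Because the whole tree is a couple of flat vectors, every query over it is an ordinary array expression, and changing the representation means touching a handful of one-liners rather than an API surface.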
To my eyes, the crux is that our formal language is _discovered_, not handed down from God. As I'm sure you're excruciatingly aware, that discovery process means we benefit from the flexibility to quickly iterate on the _entire architecture_ of our code; otherwise we end up with baked-in obsolete assumptions and the corresponding piles of workarounds.
In my experience, the Iversonian languages provide architectural expressibility and iterability _par excellence_.
With the code snippets, I tried to show how expressing customer-facing concepts doesn't require more code than detailed, internal concepts. Notice how the semantics captured by each example get progressively "larger" and closer to the frontend.
The phenomenon of developers no longer understanding what their code is for after a few months is well known, and it usually concerns code that was written quickly, almost exclusively out of programming-language primitives, with very few symbols carrying domain meaning. This is much easier to end up with when the primitives are highly abstract, like those in the Iverson languages.
There's a related phenomenon where developers apply 'do not repeat yourself'/DRY too slavishly, and interconnectedness grows too quickly in a code base that will inevitably become quite large. It is hard to resist this impulse, and in my amateur experience it is even harder in Iverson languages. Maybe I'm wrong and this doesn't happen in practice for some reason, but I've never come across articles about how to avoid it when working in e.g. APL or J, so either the problem doesn't manifest or the professionals in these languages don't have solutions. Or I just didn't read the right material, which I'm sure you'll correct if that's the case.
I agree that solutions can come out very elegant and that exploratory programming can be very interesting, at least in J, the flavour I know best (in part thanks to the APK), and judging from recorded live programming and lectures. However, what I've heard from people with experience of TakeCare informs my conclusions above. It will be interesting to see whether CGM manages to recruit enough developers to keep it going. They're already advertising along the lines of 'do you have some years of experience? have you ever been interested in Python, R or Haskell? come experience real magic in APL with us', so it seems their current employees don't manage to attract enough developers by word of mouth.
The data rarely changes, but you have to put a name on it, and those names depend on policies. That's the issue with most standard programming languages. In functional languages and in APL, you don't name your data, you just document its shape[0] (a short sketch follows the footnote below). Then when your policies are known, you just write them using the functions that can act on each data type (lists, sets, hashes, primitives, functions, ...). A policy change just means a little bit of reshuffling.
[0]: In the parent example, CustomerConceptNo{127,211,3} are the same data, but with various transformations applied and with different methods to use. In functional languages, you will only have a customer data blob (probably coming from some DB). Then a chain of functions pipes out CustomerConceptNo{127,211,3} when they are actually needed (generally in the interface). But they will be composed of the same data structures as the original blob, so all your base functions do not automatically become obsolete.
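A minimal sketch of that in APL terms (field names and the policy are made up for illustration): the raw customer data stays as plain columns, and each 'customer concept' is just an expression over them rather than a freshly named type.
ids←101 102 103 104 ⍝ raw customer columns, straight from the DB
age←17 34 62 29
spend←0 250 1200 80
premium←(age≥18)∧spend>100 ⍝ today's policy, written directly over the columns
ids/⍨premium ⍝ → 102 103
When the policy changes, you edit the mask expression; the columns and the generic primitives acting on them stay valid.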
Funny you mention Common 'do what the hell you want lol' Lisp in the same breath as Clojure and APL.
If I ran a one- or two-person shop and didn't expect to have to grow and shrink the team with consultants at short notice, I might use CL or Pharo.