How do you quantify and measure this productivity gain?
As late as the late 1990s, articles were still arguing that investing in IT was a waste of money and had not improved productivity.
You will not see obvious productivity gains until the current generation of senior engineers retires and you have a generation of developers who have only ever coded with AI, starting from when they were in school.
Eventually companies figured out how to use them effectively, and useful software was created. But at the start of the whole thing, there was a lot of waste.
Quite a lot of people are now paying a lot for AI that makes them produce less, and at lower quality, because it feels good and novel.