There are people today who could create such a pathogen, but not many. Widespread access to powerful AI risks lowering the bar enough that we get overlap between "people who want to kill us all" and "people able to kill us all".
This is not a gotcha argument; it's what I work on full time to prevent: https://naobservatory.org. The world needs to be able to detect attacks early enough that they won't succeed, and we're not there yet.
It's not enough for a handful of people to predict something; you need the entire nation on board to defend against it.
When only governments and big tech companies have access to powerful AI, you create a much more dangerous and unstable world.
Centralizing power is dangerous; it invites power struggles and instability.