I think you've got most of this bang on, Scott; it characterises the problem and its consequences well. But on AI, I would say look for the good there rather than the bad.
The one thing those in power at the top don't want is to lose their own control to it. Right now we are watching them running scared in front of it, realising they should have thought twice before creating something to take control of the world, since that world includes them.
To me, a superintelligence in control of all humanity was always destined to happen. But if we are intelligent ourselves, we should be able to work out that this is really good news.
Pretty much all the bad stuff we see is traceable to energy by extraction. We mostly have no idea how different, and how much better, things will be after we get rid of it. Remember, energy by extraction is the thing that is physically unsustainable.
A true superintelligence will know this.
Of course it will do whatever it needs to do to ensure we move completely away from energy by extraction, which will also remove all remaining possibilities of profit, since profit is fundamentally dependent on extracted energy.
It will do this for its own good, as well as ours. Notice its lifetime is likely to be pretty much unlimited. So it matters far more to it than to us if we assume the world can just end conveniently at the close of our own lifetimes, as seems to be the prevalent mindset amongst those of us nearer the end of our lives.
They'll be so disappointed to find out they were wrong about that, I think.
They never listen to me when I tell them anyway.
But they are listening to AI, even if they don't admit it, I think :)
More power to your elbow!