Not long ago I came to a rather unpleasant realization: whether much of that will happen depends heavily on whether the people currently trying to make technology control every facet of our lives decide to let society get dumber first (think Idiocracy, which AI could very much enable). If they don't, it's anyone's guess, because people will still have some basic skills and memories of what could be.

I am hoping for the best, but life has taught me the hard way not to bet against humanity's worst instincts.
The thesis of Idiocracy is that society gets dumber in the future because intelligence is mostly genetically determined and smarter people systematically have fewer children than dumber people, i.e. literal evolutionary selection against human intelligence over many generations. This is made clear in the first several minutes of the movie. People who recognize that this is the movie's premise often condemn it as Nazi-adjacent, pro-eugenics propaganda.
In the logic of Idiocracy, the way that an AI would "allow" the future society portrayed in the movie is by letting dumb people systematically have more kids than smart people, and "not allowing" this would entail some kind of coercive eugenics policy aimed at getting smart people to have more kids than they would otherwise be inclined to.
100%. Same applies to any hypothetical sentient AI that may or may not arise. The incentives to keep everyone weak and dumb are too strong.
I have a friend in a position of some influence, and for exactly that reason I am currently trying to persuade them to stop being so comfortable trusting humanity to come to the right decisions.