What can we quantum-learn in the age of noisy quantum computation? Both more and less than you think. Noise limits our ability to mitigate errors: error mitigation refers to near-term schemes in which the errors that arise in a quantum computation are dealt with in classical post-processing. I present a unifying framework for error mitigation, together with an analysis that strongly limits the degree to which quantum noise can be effectively `undone' at larger system sizes, and which shows that current error mitigation schemes are more or less as good as they can be. After presenting this negative result, I'll switch to discussing how noise can be a friendly foe: non-unital noise, unlike its unital counterpart, surprisingly results in the absence of barren plateaus in quantum machine learning.
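To make the notion of error mitigation concrete, here is a minimal Python sketch of zero-noise extrapolation, one of the standard schemes the abstract alludes to. The names `run_circuit`, `noisy_expectation`, and the exponential-damping toy noise model are illustrative assumptions, not part of the talk; the point is only that one measures an observable at several amplified noise levels and extrapolates back to zero noise classically.

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_noise_extrapolation(run_circuit, scales=(1.0, 2.0, 3.0)):
    """Richardson-style zero-noise extrapolation (illustrative sketch).

    `run_circuit(scale)` is assumed to return the noisy expectation value
    of some observable with the circuit's noise amplified by `scale`
    (e.g. via gate folding). A polynomial fit through the measured values
    is evaluated at scale 0 to estimate the noiseless expectation value.
    """
    values = [run_circuit(s) for s in scales]
    coeffs = np.polyfit(scales, values, deg=len(scales) - 1)
    return np.polyval(coeffs, 0.0)

def noisy_expectation(scale, true_value=1.0, noise_rate=0.1):
    # Toy model: depolarizing-like noise damps the true expectation value
    # exponentially in the noise scale, plus a small amount of shot noise.
    return true_value * np.exp(-noise_rate * scale) + rng.normal(0.0, 1e-3)

print(zero_noise_extrapolation(noisy_expectation))  # close to 1.0
```

The toy model also hints at the obstruction the talk's negative result formalizes: the signal being extrapolated decays exponentially with circuit size and noise strength, so the sampling overhead of any such post-processing scheme grows rapidly with the system size.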