Various simple transport models of the electron temperature in a confined plasma are reducible to the quasilinear equation $\rho(x)u_t = [c(x)(u^n)_x]_x + A(x)u^s$, $-1 < x < 1$, $u(\pm 1) = 0$, where $u$ is the temperature, $\rho(x)$ the density, and $c(x) = g[\rho(x)]$ the density-dependent part of the thermal diffusion. Both $\rho(x)$ and $c(x)$ may vanish at the plasma edge, rendering the problem singular. The temporal behavior depends critically on the boundedness of $R = \int_{-1}^{+1} c^{-1}(x)\,dx$. If $R < \infty$, then in the absence of heat sources ($A \equiv 0$) every initially given state $u(x,0)$ evolves toward an algebraically decaying, universal space-time separable solution, whose existence and uniqueness are proved. The method developed in this work may be used to show the equilibration of the solution in the presence of a heat source of the form $A(x)u^s$, $s < n$, $A(x) > 0$. On the other hand, if $R = \infty$ and $A \equiv 0$, the system becomes isothermal: $u \to \bar{u} = \int_{-1}^{+1} u(x,0)\,\rho(x)\,dx \Big/ \int_{-1}^{+1} \rho(x)\,dx > 0$. In such a case the addition of heat sources causes a thermal explosion.
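The origin of the algebraic decay can be sketched by a standard separation-of-variables argument (a sketch only, assuming $n > 1$ and $A \equiv 0$; the eigenvalue $\lambda$ and profile $\theta(x)$ are illustrative names introduced here, not notation from the original):

```latex
% Separable ansatz for the sourceless equation \rho u_t = [c(u^n)_x]_x:
\begin{align*}
u(x,t) &= T(t)\,\theta(x)
\;\Longrightarrow\;
\rho\,\theta\,T' = T^{n}\bigl[c\,(\theta^{n})_x\bigr]_x
\;\Longrightarrow\;
\frac{T'}{T^{n}} = \frac{\bigl[c\,(\theta^{n})_x\bigr]_x}{\rho\,\theta} = -\lambda,\\
T(t) &= \bigl[T(0)^{1-n} + (n-1)\lambda t\bigr]^{-1/(n-1)}
\;\sim\; \bigl[(n-1)\lambda t\bigr]^{-1/(n-1)}, \qquad n > 1,
\end{align*}
% while the spatial profile solves the singular nonlinear eigenvalue problem
\[
\bigl[c(x)\,(\theta^{n})'\bigr]' + \lambda\,\rho(x)\,\theta = 0, \qquad \theta(\pm 1) = 0.
\]
```

The separation constant $\lambda$ must be positive for a decaying solution, and the eigenvalue problem for $\theta$ is singular precisely because $\rho$ and $c$ may vanish at $x = \pm 1$; the universality claim of the abstract is that every initial state relaxes onto this one separable mode.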