Renormalization is a technique frequently used in quantum physics to deal with the divergences, the infinities, that arise in calculations at high energies. It comes into play when particular calculations do not make sense on their own and the theory's parameters must instead be tied to measured quantities; in essence, it characterizes the behaviour of these systems with high accuracy by anchoring the theory to experimental data.

Because most quantum field theories (QFTs) describe interacting (coupled) degrees of freedom, their calculations tend to produce infinite values at ultra-high momentum scales, where fluctuations of arbitrarily short wavelength still contribute. Electromagnetic interactions and particle creation/annihilation processes, for instance, generate exactly this kind of divergent contribution. In practice the problem shows up in perturbation theory: individual terms of the expansion contain integrals over momentum that grow without bound, so they cannot be evaluated directly. Once these singularities appear, a naive computation breaks down, and without a systematic way to absorb them the theory loses its predictive consistency. I hope this makes it clear.
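To make the divergence concrete, here is a minimal numerical sketch (assuming Python with NumPy/SciPy; the toy integrand, mass values, and cutoff choices are illustrative, not a full QFT calculation). It evaluates a simplified one-loop integral of the phi^4 type with a momentum cutoff Lambda, showing that the raw integral keeps growing as the cutoff is raised, while the difference between evaluations at two reference masses settles to a finite, cutoff-independent value. That subtraction is, schematically, the move renormalization makes.

```python
import numpy as np
from scipy.integrate import quad

# Toy one-loop integral from a Euclidean phi^4-type theory at zero external momentum:
#   I(Lambda, m) = integral_0^Lambda d^4k/(2*pi)^4 * 1/(k^2 + m^2)^2
#                = 1/(8*pi^2) * integral_0^Lambda k^3 dk / (k^2 + m^2)^2
# (the 1/(8*pi^2) factor comes from the 4D angular integration).
def loop_integral(cutoff, m):
    integrand = lambda k: k**3 / (k**2 + m**2)**2
    value, _ = quad(integrand, 0.0, cutoff, limit=200)
    return value / (8.0 * np.pi**2)

m_phys = 1.0   # mass in the propagators (illustrative units)
m_ref = 10.0   # arbitrary reference (subtraction) point

for cutoff in (1e2, 1e4, 1e6, 1e8):
    bare = loop_integral(cutoff, m_phys)
    # Subtracting the same integral evaluated at a reference scale removes the
    # cutoff dependence: the difference approaches a finite limit as Lambda grows.
    renormalized = bare - loop_integral(cutoff, m_ref)
    print(f"Lambda = {cutoff:8.0e}   I = {bare:.4f}   I - I_ref = {renormalized:.6f}")
```

Running this, the raw value I keeps increasing roughly like the logarithm of the cutoff, while the subtracted combination converges to a fixed number. Only such cutoff-independent combinations are matched to experiment, which is how the infinities are absorbed rather than predicted.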