A chemical bomb can level a building while a nuclear bomb can destroy a city. How can we get a handle on this difference in energy, even in just a rough way?
Take the hydrogen atom: a positively charged proton bound to a negatively charged electron in its lowest energy level, called the ground state. That is, the electron sits as close to the proton as quantum mechanics allows; if it were further away, in a higher energy level, it could jump down to the ground state (possibly via a series of intermediate jumps, depending on which level it started in) and emit a photon. How much energy is required to rip this electron out of the ground state, leaving a free proton and a free electron? The answer is 13.6 eV, which we'll take as a rough measure of the energy available in chemical reactions, since chemistry is, at the end of the day, all about shuffling around the outermost electrons of atoms.
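That 13.6 eV isn't a number you have to take on faith. As a sketch, here is the standard Bohr-model formula for the hydrogen ground-state energy computed from fundamental constants (the constant values below are rounded CODATA figures):

```python
# Hydrogen ground-state (ionization) energy from the Bohr/Rydberg formula:
#   E = m_e * e^4 / (8 * eps0^2 * h^2)
m_e  = 9.109e-31   # electron mass, kg
e    = 1.602e-19   # elementary charge, C
eps0 = 8.854e-12   # vacuum permittivity, F/m
h    = 6.626e-34   # Planck constant, J*s

E_joules = m_e * e**4 / (8 * eps0**2 * h**2)
E_eV = E_joules / e   # convert joules to eV
print(f"{E_eV:.1f} eV")  # 13.6 eV
```

Four rounded constants in, and the famous 13.6 eV comes out.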
(We use units of eV here; an eV, or electron volt, is the kinetic energy a free electron gains when accelerated across a potential difference of one volt. It is equal to about 1.6 × 10⁻¹⁹ joules, where a joule is of course the energy required to exert a force of one newton over one meter.)
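To get a feel for how tiny an eV is on human scales, a one-line conversion helps:

```python
EV_TO_JOULE = 1.602e-19   # 1 eV in joules

# Ionizing a single hydrogen atom costs 13.6 eV:
print(13.6 * EV_TO_JOULE, "J")   # ~2.2e-18 J

# So one joule could, in principle, ionize this many atoms:
print(1 / (13.6 * EV_TO_JOULE))  # ~4.6e17
```

A single joule, roughly the energy of lifting an apple one meter, is enough to ionize hundreds of quadrillions of hydrogen atoms.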
How much energy is there in a typical nuclear reaction? In a star like our Sun, protons are fused into helium nuclei via proton-proton (pp) chains. We’ll deal with the most common here, the pp I chain.
A series of nuclear reactions converts four reactant protons into a helium nucleus (an alpha particle), plus two positrons, two photons, and two neutrinos. The total energy released is about 26.7 MeV, most of it ultimately emerging as photons (a small fraction is carried off by the neutrinos). That's 26,700,000 eV, which you can compare to the 13.6 eV hydrogen ionization energy.
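The comparison is a one-line division, using the two figures from the text:

```python
E_chem_eV = 13.6       # hydrogen ionization energy, eV
E_nuc_eV  = 26.7e6     # pp I chain energy release, 26.7 MeV in eV

ratio = E_nuc_eV / E_chem_eV
print(f"nuclear/chemical ~ {ratio:.1e}")  # ~2.0e6
```

Per reaction, fusion delivers roughly two million times the energy of our benchmark chemical process.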
We see clearly that a typical nuclear reaction releases roughly a million times more energy than a typical chemical one, which is why there is so much promise, and so much peril, in nuclear technology.