I have a scene in Half Past Midnight in which my protagonist first sees the orbital nuclear burst that heralds the beginning of the D-day war. I wanted some basis of comparison for the apparent diameter of the explosion, and I'm hoping someone sees this post and has some insight as to whether or not my logic is flawed. Here's how I look at it:
My plot assumes that someone has managed to disguise a few ten-megaton warheads as part of the payload on some civilian photography satellites. The orbital range of these satellites is anywhere from 300 to 600 miles above sea level; for the purposes of the novel, I have assumed a 320-mile altitude. I have also found that a high-altitude burst in the ten-megaton range (typically considered to be anything above thirty kilometers, or 18.64 miles) yields a fireball roughly two and a half miles in diameter. I've given myself a little leeway here and estimated that with essentially no atmosphere (and somewhat lower gravity), the blast may have some additional girth. For the sake of simplicity, I've added half a mile and assumed an even three-mile diameter.
Since I need a visual reference of some sort, and my manuscript already mentions that the explosion is in the sky "about halfway between the morning sun and the horizon", the sun seems to be the logical reference point. The sun is 864,327 miles in diameter and is 93,000,000 miles away. Basic arithmetic tells me that dividing 93,000,000 miles by 864,327 miles gives a distance-to-diameter ratio of 107.598 (a pure number, not miles), and dividing 320 miles (the altitude of the proposed nuclear explosion) by the three-mile diameter of the blast yields a ratio of 106.667.
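In case it helps anyone checking my numbers, the comparison can be written as an angular-diameter calculation. This little Python sketch just uses the figures above, and it assumes the viewing distance to the blast equals its 320-mile altitude (i.e., the burst is directly overhead):

```python
import math

def angular_diameter_deg(diameter_miles, distance_miles):
    """Apparent angular diameter, in degrees, of an object seen face-on."""
    return math.degrees(2 * math.atan((diameter_miles / 2) / distance_miles))

# Figures from the post:
sun_deg = angular_diameter_deg(864_327, 93_000_000)  # the sun
blast_deg = angular_diameter_deg(3, 320)             # 3-mile fireball, 320 miles away

print(f"Sun:   {sun_deg:.3f} degrees across")
print(f"Blast: {blast_deg:.3f} degrees across")
```

Running this gives roughly 0.53 degrees for both, with the blast a hair larger, which matches the ratio comparison.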
To me, this seems to indicate that the three-mile-diameter blast at an altitude of 320 miles would appear almost exactly the same size as the sun if the two were viewed side by side. My gut tells me I've overlooked something, but I can't figure out what it is. Can anyone confirm or correct my calculations and/or logic?