Pentagon explosion image: AI generated fake photo of 'explosion' near Pentagon explained - how to spot hoaxes

The social media post briefly rattled the stock market and was rapidly picked up by news sources outside the US

On Monday morning (22 May), a picture of black smoke rising next to a bureaucratic-looking building went viral on social media amid claims that the image showed an explosion next to the Pentagon.

The distinctive five-sided building is located near Washington, D.C. and serves as the headquarters of the United States Department of Defense, which is responsible for coordinating and overseeing the military activities of the United States.


The post briefly rattled the stock market and was rapidly picked up by news sources outside the US before officials clarified that no blast had actually occurred and that the photo was a hoax.

The viral image, according to experts, had all the hallmarks of a fake produced by an AI tool. Police and fire officials confirmed that the image is not real, and that there was no incident at the US Department of Defense's offices.

An AI-generated image that shows a plume of smoke next to a Pentagon building. Note that the 'photo' is not real (Image: Unknown)

Despite this, the picture was circulated by international news publications, including RT, a media channel funded by the Russian government that was originally known as Russia Today. “Reports of an explosion near the Pentagon in Washington DC,” it tweeted to its more than three million followers. That post has since been removed.

The faked image appeared to spread on social media shortly after the US stock market opened for trading at 9.30am. With so many people taking the picture at face value, it was enough to cause a commotion in the financial world. Investors tend to react swiftly to unexpected events, particularly those related to national security or geopolitical instability.


Such an event would be seen as a major security concern and could create uncertainty and panic among investors. As fear increases, so does the probability of a sell-off, driving stock prices down. Of course, they needn't have worried.

How can you spot a fake image?

According to misinformation experts, the fake image was probably made with the help of generative artificial intelligence programmes, which have recently enabled an increase in the number of realistic-looking but frequently flawed visuals on the internet.

Hany Farid, a computer science professor at the University of California, Berkeley, pointed to inconsistencies in the building, the fence and the area surrounding the Pentagon in the AI image as typical flaws found in AI-generated photos.

“Specifically, the grass and concrete fade into each other, the fence is irregular, there is a strange black pole that is protruding out of the front of the sidewalk but is also part of the fence,” he told AP. “The windows in the building are inconsistent with photos of the Pentagon that you can find online.”


But differentiating between an AI-generated image and a real one can sometimes be challenging, as AI algorithms have become increasingly advanced in generating highly realistic and convincing images.

AI-generated images may exhibit certain visual imperfections that are less likely to be present in real photographs. These can manifest as unusual textures, smudges, or distortions in specific areas of the image, because AI algorithms often struggle to accurately replicate the fine details and subtle irregularities found in real-world scenes.

They may also contain unrealistic or surreal elements, such as bizarre combinations of objects, improbable lighting conditions, or physically impossible scenes. Elements like these can serve as a clue that the image is artificially generated.

As AI technology advances, however, these indicators may become less reliable. AI systems are continuously improving their ability to generate realistic images, and it may become increasingly challenging to distinguish AI-generated images from real ones based on visual cues alone.
