While realistic detector simulations are an essential component of particle physics experiments, current methods are computationally inefficient, requiring significant resources to produce, store, and distribute simulation data. In this work, we propose the Intra-Event Aware GAN (IEA-GAN), a deep generative model that enables faster and more resource-efficient simulation. We demonstrate its use in generating sensor-dependent images for the Pixel Vertex Detector (PXD) at the Belle II Experiment, the sub-detector with the highest spatial resolution. We show that the domain-specific relational inductive bias introduced by our Relational Reasoning Module allows one to approximate the concept of a collision event in the detector simulation. We also propose a Uniformity loss that maximizes the information entropy of the IEA-GAN discriminator's knowledge, and an Intra-Event Aware loss with which the generator imitates the discriminator's dyadic class-to-class knowledge. We show that IEA-GAN not only captures the fine-grained semantic and statistical similarity among the images but also discovers correlations among them, yielding a significant improvement in image fidelity and diversity compared to previous state-of-the-art models.
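The abstract does not specify the exact form of the Uniformity loss; as a rough illustration of the underlying idea, the following is a minimal, stdlib-only sketch of an entropy-maximizing penalty on a discriminator's class predictions. The function name `uniformity_loss` and the batch-averaged-softmax formulation are assumptions for illustration, not the paper's actual implementation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def uniformity_loss(logit_batch):
    """Negative entropy of the batch-averaged class distribution.

    Minimizing this loss maximizes the entropy of the average
    prediction, pushing the discriminator toward spreading its
    'knowledge' uniformly across classes (e.g., PXD sensors).
    """
    probs = [softmax(row) for row in logit_batch]
    n, k = len(probs), len(probs[0])
    mean = [sum(p[j] for p in probs) / n for j in range(k)]
    entropy = -sum(q * math.log(q + 1e-12) for q in mean)
    return -entropy
```

Under this sketch, a batch whose averaged predictions are uniform attains the minimal loss of -log(K) for K classes, while a batch collapsed onto one class is penalized with a loss near zero.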