Memory-Augmented Reinforcement Learning for Hierarchical Graph Optimization of Dynamic Bills of Materials in Sustainable MRI Product Families
Abdelaziz Guelfane
Abstract
Medical imaging devices exhibit complex hierarchical Bills of Materials (BOMs) whose composition evolves over time due to supply disruptions, design refreshes, and regulatory changes. We address dynamic BOM optimization for MRI product families under climate and cost objectives. We propose a memory-augmented reinforcement learning (RL) framework that operates over a hierarchically clustered dependency graph of parts and assemblies. The agent uses an external memory to encode temporal intra-node dynamics and the long-horizon consequences of merge/split/reassignment actions. On real and synthetic BOMs, our approach improves part reuse and reduces lifecycle carbon footprint compared to strong baselines (heuristics, flat GNN+RL, and no-memory ablations). We report relative gains of over 20\% in part reuse ratio and approximately 30\% reductions in lifecycle-assessment (LCA) CO$_2$ emissions under 18\% disruption scenarios. Our results indicate that hierarchical structure and temporal memory are key for robust, climate-aware product family optimization in healthcare manufacturing.
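To make the described decision loop concrete, the sketch below illustrates, under stated assumptions, how an agent might act on a hierarchical BOM graph with merge/split/reassignment actions while maintaining an external per-node memory of temporal features. All names (`BOMNode`, `ExternalMemory`, `select_action`) and features (per-part carbon, reuse count) are illustrative assumptions, not the paper's implementation; the learned GNN+RL policy is replaced by a toy heuristic placeholder.

```python
# Minimal sketch (not the authors' code) of a memory-augmented decision step
# over a hierarchically clustered BOM graph, assuming simple per-node features.
from dataclasses import dataclass, field
from typing import Dict, List, Optional
import random

ACTIONS = ("merge", "split", "reassign")  # action space named in the abstract

@dataclass
class BOMNode:
    node_id: str
    parent: Optional[str]      # hierarchical cluster / assembly the part belongs to
    carbon_kg: float           # lifecycle CO2 attributed to the part (assumed feature)
    reuse_count: int = 0       # number of product variants reusing the part (assumed feature)

@dataclass
class ExternalMemory:
    """Per-node memory slots retaining a short history of node features."""
    slots: Dict[str, List[List[float]]] = field(default_factory=dict)
    horizon: int = 8           # number of past steps retained per node

    def write(self, node: BOMNode) -> None:
        hist = self.slots.setdefault(node.node_id, [])
        hist.append([node.carbon_kg, float(node.reuse_count)])
        del hist[:-self.horizon]            # keep only the most recent observations

    def read(self, node_id: str) -> List[float]:
        hist = self.slots.get(node_id, [])
        if not hist:
            return [0.0, 0.0]
        n = len(hist)
        # temporal summary: mean over the retained window
        return [sum(step[i] for step in hist) / n for i in range(2)]

def select_action(node: BOMNode, memory: ExternalMemory) -> str:
    """Toy policy combining current features with the memory read; a placeholder
    for the learned memory-augmented GNN+RL policy described in the abstract."""
    carbon_trend, reuse_trend = memory.read(node.node_id)
    if node.carbon_kg > carbon_trend and node.reuse_count < reuse_trend:
        return "reassign"                   # rising footprint, low reuse: move to a shared cluster
    return random.choice(ACTIONS)           # exploration fallback

# One environment step: observe a node, write it to memory, act.
memory = ExternalMemory()
node = BOMNode(node_id="coil_assembly_07", parent="gradient_subsystem", carbon_kg=120.0)
memory.write(node)
print(select_action(node, memory))
```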