Algorithm unfolding networks have achieved great success in compressive sensing image reconstruction, yet they have not fully exploited the deep learning potential of image channel and spatial information, and reconstruction accuracy and cost control still leave room for improvement. To achieve rapid image sampling and accurate reconstruction from limited sampled data, this paper proposes a Dual-Domain Learning Network based on the unfolded proximal gradient algorithm, termed PGD-DDLN. The network unfolds the two-step update iteration of the proximal gradient algorithm into a deep network architecture and incorporates a dual-domain learning process over the channel and spatial information of the image. Extensive experiments demonstrate that PGD-DDLN achieves state-of-the-art results in both quantitative metrics and visual quality.
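The two-step update that such unfolding networks unroll is the classical proximal gradient (ISTA-style) iteration: a gradient-descent step on the data-fidelity term followed by a proximal step on the regularizer. A minimal sketch of that iteration for a compressive sensing problem is below, assuming an ℓ1 regularizer with a soft-thresholding prox (the unfolded network would replace this fixed prox with a learned module; the function names here are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient_cs(A, y, lam=0.1, step=None, iters=200):
    """Two-step proximal gradient for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    An unfolding network fixes `iters` to a small number of stages and
    learns the step size and the proximal mapping per stage.
    """
    if step is None:
        # Use 1/L with L = ||A||_2^2 (Lipschitz constant of the gradient).
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = x - step * A.T @ (A @ x - y)   # step 1: gradient descent on fidelity
        x = soft_threshold(r, step * lam)  # step 2: proximal (denoising) update
    return x
```

In the unfolded-network view, each loop iteration becomes one network stage, with the thresholding step replaced by a learned channel/spatial module.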
For nonsmooth nonconvex-strongly quasi-concave saddle point problems, this paper establishes a Bregman proximal gradient ascent-descent algorithm based on the Bregman distance. For the Bregman proximal gradient ascent iteration, a difference inequality for the inner maximization objective is derived, which yields an inequality relating successive proximal gradient ascent iterates. For the nonconvex nonsmooth problem, a perturbed gradient-like descent sequence is introduced to establish subsequential convergence of the algorithm; when the objective function is semi-algebraic, global convergence is obtained.
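A schematic form of such an ascent-descent scheme, written here under the assumption that the problem is $\min_x \max_y\, f(x,y) + g(x)$ with $g$ nonsmooth, $D_\psi$ a Bregman distance generated by a kernel $\psi$ (the exact problem structure and kernels are those of the paper; this is only the generic update pattern):

$$
D_\psi(u, v) = \psi(u) - \psi(v) - \langle \nabla \psi(v),\, u - v \rangle,
$$

$$
y^{k+1} = \arg\max_y \Big\{ \langle \nabla_y f(x^k, y^k),\, y - y^k \rangle - \tfrac{1}{\eta}\, D_\psi(y, y^k) \Big\},
$$

$$
x^{k+1} = \arg\min_x \Big\{ g(x) + \langle \nabla_x f(x^k, y^{k+1}),\, x - x^k \rangle + \tfrac{1}{\tau}\, D_\varphi(x, x^k) \Big\},
$$

i.e., a Bregman proximal gradient ascent step in $y$ followed by a Bregman proximal gradient descent step in $x$, with step sizes $\eta, \tau > 0$.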