The parallel efficient global optimization (EGO) algorithm was developed to leverage the rapid advancements in high-performance computing. However, the conventional parallel EGO algorithm based on ...
Abstract: Selecting an appropriate step size is critical in Gradient Descent algorithms used to train Neural Networks for Deep Learning tasks. A small value of the step size leads to slow convergence, ...
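The step-size trade-off described in the abstract can be illustrated with a minimal sketch (not taken from the paper, assuming plain fixed-step gradient descent on the toy objective f(x) = x², whose gradient is 2x):

```python
def gradient_descent(step_size, x0=5.0, iters=50):
    """Run fixed-step gradient descent on f(x) = x**2 starting from x0.

    The minimum is at x = 0; each update is x <- x - step_size * f'(x).
    """
    x = x0
    for _ in range(iters):
        x -= step_size * 2 * x  # gradient of x**2 is 2x
    return x

# A very small step converges slowly: after 50 iterations x is still
# far from the minimum. A moderate step converges quickly. A step at
# or above 1.0 overshoots so badly on this function that the iterates
# oscillate with growing magnitude and diverge.
slow = gradient_descent(0.01)      # converges slowly
fast = gradient_descent(0.4)       # essentially at the minimum
diverged = gradient_descent(1.1)   # oscillates and blows up
```

On this quadratic the update multiplies x by (1 − 2·step_size) each iteration, so convergence requires that factor to have magnitude below 1, which is the one-dimensional version of the stability limit the abstract alludes to.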