Dongpo Xu, Yili Xia, and Danilo P. Mandic, "Optimization in Quaternion Dynamic Systems: Gradient, Hessian, and Learning Algorithms," IEEE Transactions on Neural Networks and Learning Systems, vol. 27, no. 2, pp. 249-261, 2016, DOI: 10.1109/TNNLS.2015.2440473
Abstract: The optimization of real scalar functions of quaternion variables, such as the mean square error or array output power, underpins many practical applications. Solutions typically require the calculation of the gradient and Hessian. However, real functions of quaternion variables are essentially nonanalytic, which is prohibitive to the development of quaternion-valued learning systems. To address this issue, we propose new definitions of the quaternion gradient and Hessian based on the novel generalized Hamilton-real (GHR) calculus, thus making possible the efficient derivation of general optimization algorithms directly in the quaternion field, rather than through the isomorphism with the real domain, as is current practice. In addition, unlike existing quaternion gradients, the GHR calculus admits the product and chain rules, and provides a one-to-one correspondence of the novel quaternion gradient and Hessian with their real counterparts. Properties of the quaternion gradient and Hessian relevant to numerical applications are also introduced, opening a new avenue of research in quaternion optimization and greatly simplifying the derivation of learning algorithms. The proposed GHR calculus is shown to yield the same generic algorithm forms as the corresponding real- and complex-valued algorithms. The advantages of the proposed framework are illustrated through simulations in quaternion signal processing and neural networks.
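To make the abstract's setting concrete, the sketch below shows the two ingredients it refers to: the noncommutative Hamilton product (the reason real functions of quaternion variables are nonanalytic in the usual sense) and a quaternion LMS-style update of the generic form w ← w + μ e x̄, which mirrors the complex LMS rule the abstract says the GHR framework recovers. This is an illustrative toy, not the paper's algorithm: the Quaternion class, the step size mu, the input samples, and the omission of the GHR gradient's scaling conventions are all assumptions made here for a minimal runnable demo.

```python
import math
from dataclasses import dataclass


@dataclass(frozen=True)
class Quaternion:
    """Quaternion q = w + x*i + y*j + z*k (illustrative helper class)."""
    w: float
    x: float
    y: float
    z: float

    def __add__(self, o):
        return Quaternion(self.w + o.w, self.x + o.x, self.y + o.y, self.z + o.z)

    def __sub__(self, o):
        return Quaternion(self.w - o.w, self.x - o.x, self.y - o.y, self.z - o.z)

    def __mul__(self, o):
        # Hamilton product: noncommutative, since i*j = k but j*i = -k.
        return Quaternion(
            self.w * o.w - self.x * o.x - self.y * o.y - self.z * o.z,
            self.w * o.x + self.x * o.w + self.y * o.z - self.z * o.y,
            self.w * o.y - self.x * o.z + self.y * o.w + self.z * o.x,
            self.w * o.z + self.x * o.y - self.y * o.x + self.z * o.w,
        )

    def conj(self):
        return Quaternion(self.w, -self.x, -self.y, -self.z)

    def scale(self, s: float):
        return Quaternion(s * self.w, s * self.x, s * self.y, s * self.z)

    def norm(self) -> float:
        return math.sqrt(self.w ** 2 + self.x ** 2 + self.y ** 2 + self.z ** 2)


# Noncommutativity of the quaternion product.
i, j = Quaternion(0, 1, 0, 0), Quaternion(0, 0, 1, 0)
assert i * j == Quaternion(0, 0, 0, 1)   # i*j = k
assert j * i == Quaternion(0, 0, 0, -1)  # j*i = -k

# LMS-style identification of a single quaternion weight from the
# model d = w_true * x, with the hypothetical update w <- w + mu*e*conj(x).
# Since conj(x)*x = |x|^2 is real, the weight error contracts by a factor
# (1 - mu*|x|^2) per step, so the loop converges for 0 < mu*|x|^2 < 2.
w_true = Quaternion(0.5, -1.0, 2.0, 0.3)
w = Quaternion(0.0, 0.0, 0.0, 0.0)
mu = 0.2  # assumed step size; |x|^2 = 2 for both samples below
inputs = [Quaternion(1, 0, 1, 0), Quaternion(0, 1, 0, -1)]
for _ in range(100):
    for xq in inputs:
        d = w_true * xq
        e = d - w * xq
        w = w + (e * xq.conj()).scale(mu)

print((w - w_true).norm())  # weight error driven toward 0
```

Note how the scalar contraction argument only works because conj(x)*x is real and therefore commutes with everything; in a genuinely noncommutative expression the order of factors would matter, which is exactly the complication the GHR product and chain rules are designed to handle.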
Source: Email from ieeetnnls_AT_gmail.com, 2 Feb. 2016, http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=7124503