This work is available in electronic version only.
This book proposes a transverse mathematical perspective on deep machine learning in artificial intelligence. To that end, it develops a framework of generalized transformations, called multiserial and hyperserial decompositions, which unifies standard and recent data representation spaces. The generalization consists of integrating expressions of several variants of convolutional neural networks and of wavelet filter banks within a single analytical framework.
The integrated expressions are derived recursively, from the downstream to the upstream layers, so as to exhibit the sequences of features returned at the nodes of a network model architecture. The derivation of these expressions is inspired by the framework of M-band convolution filter banks. Inter-layer and inter-node expressions are provided, and the activation sequences of convolutional neural networks are described mathematically by suitable algebraic path representations.
The topics covered include mathematical optimization, generalized functions and functional analysis (with a focus on convolution integrals), probability entropy, statistical models, and convolutional neural compositions.
1. The Minimum You Need to Know About Optimization
2. The Essentials to Know in Functional Analysis
3. Probability Entropy and Neural Statistical Parameterizations
4. Convolutional Neural Networks