Machine Learning and Computer Vision
Coupled map lattices are discrete-time, continuous-state recurrent networks, with a single-channel convolution step (termed "coupling" in the CML literature) followed by a nonlinear map as the activation function. Their complex pattern-formation dynamics and phase regimes have been studied extensively in computational statistical mechanics.
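A minimal sketch of one CML update, assuming a one-dimensional ring lattice with nearest-neighbor diffusive coupling and the logistic map as the nonlinearity; the function names and parameter values are illustrative, not taken from the papers:

```python
import numpy as np

def logistic(x, r):
    """Logistic map used as the local nonlinearity / activation function."""
    return r * x * (1.0 - x)

def cml_step(state, coupling, r):
    """One coupled map lattice update on a 1-D ring: a single-channel
    convolution (diffusive coupling with the two nearest neighbors)
    followed by the logistic map."""
    left = np.roll(state, 1)
    right = np.roll(state, -1)
    mixed = (1.0 - coupling) * state + 0.5 * coupling * (left + right)
    return logistic(mixed, r)

# Example: iterate a small lattice for a few steps.
rng = np.random.default_rng(0)
x = rng.random(64)
for _ in range(10):
    x = cml_step(x, coupling=0.3, r=3.8)
```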
In work performed at the University of Texas at Austin from 1993 to 2001, I studied their use in computer vision and pattern recognition, and as a model of canonical column computation, in which excitatory-inhibitory (E:I) networks at each site, together with lateral interactions, produce similarity mappings and rotation-invariant representations of objects by discovering time-varying parameters of the activation function and couplings.
Machine Learning using spatio-temporal recurrent convolutional networks with logistic map activation and time-varying network coupling and nonlinearity parameters.
This paper attempts to explain the network performance as a modulation of effective dimensionality, in addition to reviewing the basic operation described in the other papers.
DeMaris, D. Dimension Change, Coarse Grained Coding and Pattern Recognition in Spatio-Temporal Nonlinear Systems, Journal of Integrative Neuroscience, Vol. 2, No. 1, 2003, pp. 71-102
Several developments now common in machine learning, including metric learning, contrastive embedding, non-monotonic activation functions, and learning of activation-function parameters, were already present in this work. They were not particularly highlighted, as the contemporary framing stressed dimensionality changes, and I was most interested in showing a novel role for cross-frequency coupling and feedback connections to upstream recurrent layers in biological systems: specifically, that feedback connections may modulate the representational state in inference computations.
Note: The literature on coupled map lattices uses "synchronization" to refer to the degree of focus (sparse, low-entropy, multimodal) or dispersion (high-entropy distributions) in the distribution over lattice sites, in contrast to the usual engineering and neuroscience sense of phase synchronization or correlation in a time series. Synchronization in CMLs is a continuous value: in the high-coupling limit all sites are totally synchronized (correlated). However, when the papers discuss neural processing, the intent is that processing intervals (i.e. the coupling and lateral information transfer), state changes, and slow dynamical changes in the two-stage learning process are phase-synchronized.
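A minimal sketch of this usage, assuming the degree of synchronization is read off from the entropy of the histogram of site values at one time step; the bin count and function name are illustrative:

```python
import numpy as np

def site_entropy(state, bins=16):
    """Entropy of the distribution of values across lattice sites at one
    time step: low entropy ~ focused/synchronized lattice, high entropy ~
    dispersed lattice (the CML sense of synchronization, not phase
    synchronization of a time series)."""
    hist, _ = np.histogram(state, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Fully synchronized lattice -> 0 bits; values spread uniformly -> ~log2(bins).
print(site_entropy(np.full(64, 0.5)))       # 0.0
print(site_entropy(np.linspace(0, 1, 64)))  # about 4.0 for 16 bins
```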
This paper is a concise description of the algorithm in the dissertation. The idea of synchronization opponents was my attempt at something like an information bottleneck principle, achieved by effective-dimensionality changes in time-varying networks, with cross-frequency coupling playing a control role. The objective function minimizes within-class distance while maximizing the sum of distances to previously learned classes; the work is thus an early example of metric learning in the neural network world (a minimal sketch of the objective follows the citation below).
DeMaris, D. Synchronization Opponent Systems: Attractor Basin Transient Statistics as a Population Code for Object Representation, Neurocomputing, 38-40, 2001, pp. 547-554
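A minimal sketch of that objective, assuming each object class is summarized by histograms over lattice partitions (one per training view) and distances are symmetrized KL divergences; the names and the particular symmetrization are illustrative assumptions, not the papers' exact formulation:

```python
import numpy as np

def sym_kl(p, q, eps=1e-12):
    """Symmetrized KL divergence between two partition histograms."""
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def metric_objective(view_histograms, learned_class_histograms):
    """Lower is better: keep the new class's view histograms close to their
    mean (within-class distance) while pushing that mean away from the
    histograms of previously learned classes."""
    mean_h = np.mean(view_histograms, axis=0)
    within = sum(sym_kl(h, mean_h) for h in view_histograms)
    between = sum(sym_kl(mean_h, other) for other in learned_class_histograms)
    return within - between
```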
Building on the 1995 paper, I use genetic algorithms that attempt to learn a parametrically generated metric space that properly orders several parametric curves. While not successful, a parameterization is found for each curve with a unique, small set of coupling and bifurcation (activation function) parameters. A second experiment uses a competitive ensemble of recurrent networks for recognizing objects in depth. Each network is trained with few-shot learning to have an invariant distribution over its training views, while maximizing KL distance to previously learned objects. The number of partitions used in the space is on the order of the number of learned objects. Projecting the representations to 2D shows that objects with similar topology (bends of paperclip objects) cluster together. Learning is accomplished without BPTT, only by exploring a six-parameter space with a genetic algorithm (a sketch of such a search follows the citation below). Each site in the network is treated as a model excitatory-inhibitory network whose modulated parameters change the state flows between partitions, resulting in alternating convolutional and fully connected layers in the unrolled network.
DeMaris, D. Synchronization Opponent Networks: Dynamics, Computation, and Coding For Similarity and Object Recognition. Dissertation, University of Texas at Austin, 2001
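A minimal sketch of that search, assuming a generic genetic-algorithm loop over six coupling and bifurcation parameters scored by an objective such as metric_objective above; population size, mutation scale, and parameter bounds are illustrative choices:

```python
import numpy as np

def ga_search(fitness, n_params=6, bounds=(0.0, 4.0), pop=40, gens=100, seed=0):
    """Evolve a population of parameter vectors; no backpropagation through
    time, only repeated evaluation of the recurrent network's fitness."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    population = rng.uniform(lo, hi, size=(pop, n_params))
    for _ in range(gens):
        scores = np.array([fitness(p) for p in population])
        elite = population[np.argsort(scores)[: pop // 2]]  # lower is better
        children = elite + rng.normal(0.0, 0.05 * (hi - lo), size=elite.shape)
        population = np.clip(np.vstack([elite, children]), lo, hi)
    scores = np.array([fitness(p) for p in population])
    return population[np.argmin(scores)]
```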
This paper reviews relationships between percept dwell times and the size of the cube stimulus, and proposes a model of three recurrent layers, with one layer being an unstable coding of near and far space that emerges from input-driven network changes.
DeMaris, D. Attention, Depth Gestalts, and Chaos in the Perception of Ambiguous Figures, in Levine, D. and Brown, V. R. (Eds.), Oscillations in Neural Systems, Lawrence Erlbaum Associates, 2000, pp. 239-25
The following paper examines unsupervised learning, or dimensionality reduction, using a two-stage process moving from critical (near-chaotic, or edge-of-chaos) dynamics to convergent dynamics, with the representation being the distribution over partitions in the input space. It can be considered a similarity-preserving hash or dimensionality reduction (a minimal sketch follows the citation below).
DeMaris, D. Computing Shape Similarity with Chaotic Reaction Diffusion Spectra, Proc. World Congress on Neural Networks, Vol. 1, pp. 270-273, 1995
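A minimal sketch of the two-stage idea, assuming the same diffusive-coupling/logistic-map update as the earlier sketch, an edge-of-chaos stage followed by a convergent stage, and the final representation taken as a histogram over coarse partitions of site values; stage lengths and parameter values are illustrative:

```python
import numpy as np

def cml_step(state, coupling, r):
    """Nearest-neighbor diffusive coupling followed by the logistic map
    (same update as in the earlier sketch)."""
    mixed = (1.0 - coupling) * state + 0.5 * coupling * (
        np.roll(state, 1) + np.roll(state, -1))
    return r * mixed * (1.0 - mixed)

def shape_signature(signal, chaotic_steps=30, convergent_steps=30, bins=8):
    """Two-stage run: near-chaotic dynamics mix the input, then convergent
    dynamics settle it; the histogram of final site values over coarse
    partitions serves as a similarity-preserving signature."""
    x = np.clip(np.asarray(signal, dtype=float), 0.0, 1.0)
    for _ in range(chaotic_steps):
        x = cml_step(x, coupling=0.2, r=3.9)   # edge-of-chaos stage
    for _ in range(convergent_steps):
        x = cml_step(x, coupling=0.6, r=3.2)   # convergent stage
    hist, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()
```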
Machine Learning and Semiconductor Design and Processing
F. de Morsier, N. Casati, D. L. DeMaris, M. Gabrani, A. Gotovos, R. A. Krause. Fast detection of novel problematic patterns based on dictionary learning and boundary detection of failure regions, Optical Microlithography XXVII, 9052, 90520J, 2014
F. de Morsier, D. DeMaris, M. Gabrani, N. Casati. Fast detection of novel problematic patterns based on dictionary learning and prediction of their lithographic difficulty, Optical Microlithography XXVII, 9052, 905211, 2014
Nathalie Casati, Maria Gabrani, R. Viswanathan, Z. Bayraktar, D. L. DeMaris, A. Y. Abdo, J. Oberschmidt, R. A. Krause. Automated sample plan selection for OPC modeling (accepted, SPIE Advanced Lithography 2014)
D. DeMaris, M. Gabrani, et al. Fast source independent estimation of lithographic difficulty supporting large scale source optimization, SPIE Optical Lithography, Vol. 8325, 2012
K. Lai, M. Gabrani, D. DeMaris, et al. Design specific joint optimization of masks and sources on a very large scale, Proc. SPIE Advanced Lithography, Vol. 7973, 2011