Abstracts for Ph.D. Dissertations

____________________________________________________

Dynamic Representation of Musical Structure
Ph.D. Dissertation by Edward Wilson Large
1994

No abstract.

____________________________________________________

The Evolution of Communication in Adaptive Agents
Ph.D. Dissertation by Gregory M. Saunders
Adviser: Jordan B. Pollack, 1994

The field of adaptive behavior holds that higher-level cognitive skills arise from the more primitive ability of an agent to adapt to its environment. Although many behaviors have been studied in this bottom-up fashion (e.g., obstacle avoidance, wandering, environment exploration, food collection, planning, predator avoidance, locomotion, action selection, and flocking), relatively few people have studied communication as adaptive behavior. In this dissertation, I explore how communication can be understood as an adaptation by agents to their environment.

I begin by examining behavior-based methods for agent evolution and propose a connectionist version of subsumption which supports learning. I reject this as a method of evolving communication, however, because of certain assumptions the approach requires. I then turn to evolutionary programming, a population-based search technique, and show how it can be used to evolve agents with far fewer assumptions.

With this background, I move on to the evolution of communication. I begin with a set of independent agents, instantiated as recurrent neural networks. After arguing against systems which use discrete symbols to evolve communication, I supply my agents with a number of continuous communication channels. The agents use these channels to initiate real-valued signals which propagate through the environment, decaying over distance and perhaps being perturbed by environmental noise. Initially, the agents' signals appear random; over time, structure emerges as the agents learn to communicate task-specific information about their environment. I demonstrate how different communication schemes can evolve for a task, and then identify a commonality between the schemes in terms of the information passed between agents. From this I discuss what it means to communicate, and describe how a semantics emerges in the agents' signals relative to their domain.

____________________________________________________

Determinism, Nondeterminism, Alternation, and Counting
Ph.D. Dissertation by Sanjay Gupta, The Ohio State University, 1994
Adviser: Professor Timothy J. Long

The goal of complexity theory is to determine the amount of computational resources needed to solve various problems. Structural complexity theory attempts to determine the computational resources needed to solve problems in various complexity classes and to determine the relationships among those classes. This approach to complexity theory is less ad hoc than considering individual problems in isolation. In this dissertation we consider three sets of results, all related to increasing our understanding of the structure of various complexity classes and the relationships among them:

1. Separations of time complexity classes.
2. Closure properties of several function classes.
3. Tools for embedding one complexity class in another.

____________________________________________________

Ph.D. Dissertation by John Kolen

This dissertation addresses the issues surrounding the computational capabilities of recurrent neural networks.
My results apply not only to simple recurrent networks, Jordan networks, and higher order recurrent networks, but also to many other networks implemented as input-parameterized iterated functions. The following reasons have driven my efforts to understand their computational capabilities. First, the question of knowledge content arises whenever we attempt to understand how a given network produces its behavior. Second, knowing the range of what is computable by a recurrent network can guide its intelligent application. Finally, this knowledge may also help us develop new training strategies which bias the network towards desirable solutions.

While we already know that recurrent networks can perform complex computation by simulating machine tapes and stacks, one problem still remains: someone designed each universal-computing network by hand. We know the function decomposition because the designer can tell us what they intended each part to do. Unfortunately, weak learning methods, like back-propagation, that discover operable network weights cannot explain the internal functionality of the final product. Thus, we are forced to determine the recurrent network's computation process externally, by observing its structure and behavior.

To this end, I identify three facets of recurrent networks that directly affect their emergent computational descriptions: system dynamics, input modulation of state dynamics, and output generation. System dynamics, the mapping of current state to next state, have traditionally been considered the source of complex behavior. Input modulation occurs as a finite set of input vectors and induces iterated function system-like behavior in the recurrent network. This selection creates state space representations for information processing states which display recursive structure. I show that the mechanism producing discrete outputs has dramatic effects on the resulting system complexity by imposing information processing regularities on the output stream strong enough to manipulate both the complexion (number of states) and the generative class of the observed computation. As for new training methods, I outline a method of network training called entrainment learning which offers a novel explanation of the transmission of grammatical behavior structures between agents.

_____________________________________________

Utilization Imbalance in Wormhole Routed Networks
Master's Thesis by Roshan M. Rao
Autumn 1994 (Adviser: D. N. Jayasimha)

Wormhole routing is a popular switching technique for interprocessor communication in multiprocessor systems. It has been experimentally observed that wormhole routed networks saturate at low to moderate loads. Adaptive routing algorithms with multiple virtual channels do little to improve this saturation behavior, because the restrictions they place on the choice of channels to achieve deadlock freedom create an imbalance in channel utilization. This imbalance drives a few links to full utilization first, and the entire network saturates even though most links have yet to carry their full capacity of traffic. In this thesis, we study this problem analytically, define a way to measure imbalance, and use the results of this analysis to present an output selection policy that alleviates the imbalance. Reducing imbalance has the twin benefits of improving the saturation behavior of the network and reducing message latencies. We substantiate this claim through simulation by applying the policy to two routing algorithms.
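
The abstract does not spell out the imbalance measure or the output selection policy. As a minimal sketch of the kind of quantities involved, the following assumes a coefficient-of-variation measure over per-channel utilizations and a "least-utilized permissible channel" rule; both are illustrative stand-ins, not the definitions used in the thesis.

    # Illustrative sketch only: the imbalance measure and selection policy
    # below (coefficient of variation; least-utilized permissible channel)
    # are assumptions for exposition, not the thesis's actual definitions.
    from statistics import mean, pstdev

    def utilization_imbalance(utilizations):
        """Coefficient of variation of per-channel utilizations (0 = balanced)."""
        mu = mean(utilizations)
        return pstdev(utilizations) / mu if mu > 0 else 0.0

    def select_output_channel(permissible, utilization):
        """Among the channels the routing algorithm permits (its deadlock-free
        choices), pick the one with the lowest observed utilization."""
        return min(permissible, key=lambda ch: utilization[ch])

    # One heavily loaded channel dominates the measure; the policy steers
    # new messages toward the lightly loaded permissible channels.
    util = {"c0": 0.95, "c1": 0.40, "c2": 0.35, "c3": 0.30}
    print(round(utilization_imbalance(util.values()), 2))    # 0.52
    print(select_output_channel(["c0", "c2", "c3"], util))   # c3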