Joint Seminar Series List of Speakers

These seminars are run jointly between the Non-Standard Computation Group in Computer Science and the Intelligent Systems Group in Electronics.

Speaker Title and Abstract Date and Location
Jon Timmis Immunising Automated Teller Machines: This talk will present an immune-inspired adaptable error detection (AED) framework for Automated Teller Machines (ATMs). The framework has two levels: one local to a single ATM, the other network-wide. It employs vaccination and adaptability analogies from the immune system. To discriminate between normal and erroneous states, an immune-inspired one-class supervised algorithm was employed, which supports continual learning and adaptation. The effectiveness of the proposed approach was confirmed in terms of classification performance and impact on availability. The overall results were encouraging, as the downtime of ATMs can be reduced by anticipating failures before they occur.

PDF of talk.

3rd November, 2006. 1315-1415 V/123
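The talk does not give the detection algorithm in code, but the general shape of one-class error detection can be illustrated with a small sketch: train only on readings gathered during fault-free operation, then flag readings that fall outside the learned normal band. The readings and threshold `k` below are hypothetical, not taken from the AED framework itself.

```python
import statistics

# Hypothetical "normal" readings of an ATM health metric, recorded
# during fault-free operation only (one-class training data).
normal_readings = [0.98, 1.02, 1.00, 0.97, 1.03, 0.99, 1.01]
mu = statistics.mean(normal_readings)
sigma = statistics.stdev(normal_readings)

def is_anomalous(reading, k=3.0):
    """Flag readings more than k standard deviations from the normal mean."""
    return abs(reading - mu) > k * sigma

print(is_anomalous(1.00))  # False: within the normal band
print(is_anomalous(1.60))  # True: far outside, a likely erroneous state
```

A real framework would, as the abstract notes, also adapt the model continually as new fault-free data arrives.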
Mic Lones Designing Evolutionary Algorithms for Regulatory Motif Discovery: Regulatory motifs capture the patterns of DNA bases responsible for controlling when and where a gene is expressed. Identification of regulatory motifs is both an important problem, since it underlies efforts to understand and reconstruct regulatory networks, and a particularly hard problem, made difficult by the low signal-to-noise ratio resulting from the poor conservation and relatively short length of regulatory motifs. Evolutionary algorithms offer potential advantages over conventional approaches to motif discovery, particularly in terms of global search, representational flexibility and concurrent discovery of multiple solutions. Nevertheless, standard evolutionary algorithms are beset by problems of diversity maintenance, and this makes it difficult both to preserve multiple solutions and to avoid convergence to fit yet biologically meaningless solutions. In this seminar, I will introduce a novel evolutionary algorithm which is able to discover regulatory motifs in relatively long promoter sequences and discover multiple motifs concurrently. I will also discuss the concept of population clustering as an approach to diversity management, and present recent results on using motif-rule co-evolution to discover higher-order regulatory motifs.

PDF of talk.

17th November, 2006. 1315-1415 CS103
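As a rough illustration of what a motif-discovery evolutionary algorithm must evaluate (the sequences and candidate motif below are invented, and real fitness functions are considerably more sophisticated), a candidate motif with wildcard positions can be scored by how many promoter sequences it matches:

```python
import random

random.seed(3)
BASES = "ACGT"

def matches(motif, window):
    """A motif matches a window if each position agrees or is the wildcard N."""
    return all(m == "N" or m == b for m, b in zip(motif, window))

def fitness(motif, sequences):
    """Count sequences containing at least one match -- a crude stand-in for
    the signal a motif-discovery EA would try to maximise."""
    k = len(motif)
    return sum(
        any(matches(motif, seq[i:i + k]) for i in range(len(seq) - k + 1))
        for seq in sequences
    )

# Hypothetical promoter fragments: 40 random bases plus a planted "TATA" signal.
seqs = ["".join(random.choice(BASES) for _ in range(40)) + "TATA" for _ in range(5)]
print(fitness("TANA", seqs))  # 5: the wildcard motif matches every planted site
```

The low signal-to-noise ratio the abstract mentions shows up here directly: short motifs like this also match random background sequence by chance, which is what makes naive fitness measures misleading.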
Dan Franks Developing a Methodology for Social Network Sampling: Understanding the complex structures of social networks is essential to studies of social behaviour. Social structure has wide ecological and evolutionary implications for important processes such as mate choice, social learning, cooperation, foraging, and disease transmission. It is therefore crucial that scientists use the appropriate tools and methods to study the properties of animal social networks. Despite the wide applicability and high profile of social network research in ecology, there is no established quantitative methodology to guide researchers in efficient and unbiased sampling of social networks. Ecologists attempt to capture network properties by recording observations of a select sample of animals and their social interactions. Their aim is to understand properties of the real network by constructing a sample network whose structure represents that of the real network. Then, the sample network can be analysed using network theory. The assumption here is that the sample network is structurally equivalent to the real network. For the scientific investigation to have any validity, this assumption must be met. Most theory on collecting network data comes from social science literature, where social contacts in human populations are easier to track than those in animal populations. This presents a problem: how can we be confident that the sampled network reliably represents the real-world network? 1st Dec. 2006 1315-1415 P/X/01
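The core assumption the talk questions - that a sampled network is structurally equivalent to the real one - can be probed with a toy experiment. The network and sampling scheme below are invented for illustration: observe a random subset of individuals, build the induced subnetwork, and compare a simple structural property (mean degree) against the full network.

```python
import random

random.seed(1)

# Hypothetical "real" social network: 30 individuals with random ties.
nodes = list(range(30))
adj = {n: set() for n in nodes}
for _ in range(60):
    a, b = random.sample(nodes, 2)
    adj[a].add(b)
    adj[b].add(a)

def mean_degree(graph, subset):
    """Mean ties per individual, counting only ties within `subset`."""
    subset = set(subset)
    return sum(len(graph[n] & subset) for n in subset) / len(subset)

# Observe only 10 individuals, as a field study might.
sample = random.sample(nodes, 10)
print("real network:   ", mean_degree(adj, nodes))
print("sampled network:", mean_degree(adj, sample))
```

Induced-subgraph sampling of this kind systematically misses ties to unobserved individuals and so underestimates degree - exactly the sort of bias a principled sampling methodology has to quantify.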
Janet Clegg A new crossover technique in Cartesian Genetic Programming: Genetic Programming (GP) was introduced by Koza using a tree representation, and Koza's suggested crossover technique was to randomly swap branches of the parent tree structures. Many researchers have since found that this type of crossover does not help convergence, and sometimes even hinders it; they have therefore used GP with mutation only, omitting crossover altogether. Julian Miller introduced Cartesian Genetic Programming (CGP), which uses a directed graph representation instead of the tree structure; but, as in traditional GP, CGP has normally been used with mutation only and no crossover. The new crossover technique introduced here is based on a slight adaptation of the CGP representation. It has been tested on some simple regression problems and has been found to speed up convergence and reduce the computational effort required to solve these problems.

PPT of talk.

2nd Feb. 2007. 1315-1415 P/T/005
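The abstract does not spell out the adapted representation; one common way such a crossover is realised - and this sketch is an assumption for illustration, not necessarily Clegg's published operator - is to encode the genotype as real values in [0, 1] and blend the two parents arithmetically:

```python
import random

random.seed(0)

def arithmetic_crossover(parent1, parent2):
    """Blend two real-valued genotypes; each offspring gene stays in [0, 1]
    because it is a convex combination of the corresponding parent genes."""
    r = random.random()
    child1 = [(1 - r) * a + r * b for a, b in zip(parent1, parent2)]
    child2 = [r * a + (1 - r) * b for a, b in zip(parent1, parent2)]
    return child1, child2

# Two hypothetical genotypes of length 12: in a real CGP system each gene
# would later be decoded into a node's function choice or input connection.
p1 = [random.random() for _ in range(12)]
p2 = [random.random() for _ in range(12)]
c1, c2 = arithmetic_crossover(p1, p2)
```

Unlike subtree swapping, this kind of blending changes the genotype smoothly, which is one plausible reason a real-valued crossover can aid rather than hinder convergence.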
Tim Clarke Control Engineering - Throwing Down a Gauntlet: Since Harold Black's 1927 epiphany on the Lackawanna Ferry whilst travelling to work at the Bell Telephone Labs, the discovery of the effects of feedback has probably had as significant an effect on our lives as the later invention, by John Bardeen, Walter H. Brattain and William Shockley, of the transistor (1948). Society pushes us hard to make technological advances: we want more, and we want it faster, better, cheaper, safer, smaller, cleaner, lighter, and so on. I will show you how and why Control Engineering is a major source of stimulus for technological innovation, and what the current perceived challenges are - challenges that have surprisingly high relevance to interests within IS and NSC. 16th Feb. 2007. 1315-1415 CS103
Gianluca Tempesti Growing processor arrays: how and why? The idea of applying developmental and growth processes to very large arrays of processing elements is fascinating: the possibility of designing circuits that behave like biological organisms seems like a great concept. However, many issues crop up when the implementation of such circuits is considered, issues related to the practical and efficient use of these features. In this seminar, I will try to raise some of these issues, some of which are not necessarily obvious at first sight, and (for some of them) propose some ideas and solutions.

PDF of talk.

2nd March 2007. 1315-1415 P/T/005
Charles Robinson Decentralised Sensor Data Management for Autonomous Systems: Sensing systems play a very important role in many aspects of everyday life, including their use in appliances in our homes, monitoring and supervision systems for manufacturing and processing plants, and navigation systems on aircraft and ships. They have also been used extensively in deep sea and outer space exploration. Such systems often have a selection of sensors for collecting a variety of information, which is usually fed back to some central processor. Decentralised systems, where processing is distributed, have advantages over the more traditional systems in terms of, for example, robustness, timeliness and fault tolerance. A multi-agent system provides a suitable framework for managing operations, such as distributed data handling, needed on such systems. The purpose of my research is to bring together the knowledge in relevant but disparate fields, and to develop a software structure which manages a decentralised multi-sensor system that is capable of fusing gathered data for use in subsequent decision making. In summary, we are looking for strategies that are decentralised and scalable, where local decisions and the exchange of information can be made without a central decision maker. It would be a great feat of engineering to produce software that, following installation, learned about its resources and was then capable of monitoring and manipulating its environment in an intelligent and useful manner. 16th March 2007. 1315-1415 P/T/006
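A minimal sketch of one such decentralised strategy is randomised gossip averaging, in which nodes fuse their estimates purely through local pairwise exchanges, with no central fusion node. The node count and readings below are hypothetical, and this is only one of many strategies the research area considers.

```python
import random

random.seed(2)

# Five hypothetical sensor nodes, each starting with a noisy local
# estimate of the same underlying quantity (nominal value 10).
estimates = [10 + random.uniform(-1, 1) for _ in range(5)]
target = sum(estimates) / len(estimates)

def gossip_step(values, i, j):
    """One decentralised exchange: neighbours i and j simply average
    their current estimates; no central decision maker is involved."""
    avg = (values[i] + values[j]) / 2
    values[i] = values[j] = avg

# Repeated random pairwise exchanges drive every node towards the
# network-wide mean, giving scalable, centre-free data fusion.
for _ in range(500):
    i, j = random.sample(range(5), 2)
    gossip_step(estimates, i, j)
```

Because each exchange preserves the sum of the estimates, every node converges on the same fused value that a central processor would have computed, yet the scheme degrades gracefully if individual nodes fail.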
Prof Susan Stepney The Neglected Pillar of Material Computation: Biological organisms and processes are often touted as information processing systems, and then analysed in computational terms. But their properties differ in many important ways from our "classical" mathematical-logical computational system formalisms, and the way these are implemented "in silico". In particular, they have an extra important feature: their operation is deeply entwined with the physical and chemical properties of the substrates of which they are composed. Those properties both impose constraints on, and provide capabilities to, the computations being performed. Here I discuss the "missing pillar", of "in materio" computation, that is needed to complement classical computational models, before we can understand biological information processing in full.

PDF of talk.

27th July, 2007. 1315-1415 CS122
Prof Alister Burr Unconventional computing paradigms based on belief propagation: Following on from Susan Stepney's talk on Material Computation, this talk will suggest a paradigm appropriate to the properties of a rather familiar material for computation: the silicon PN junction. The paradigm is provided by what is variously known as the sum-product algorithm, message passing or belief propagation, which provides a means to solve general probabilistic inference problems, such as those arising in image recognition, speech processing, natural language processing, or perhaps even within biological systems - and it is fundamentally different from classical computation. The talk will outline the algorithm, discuss applications, and describe how it neatly exploits the material properties of a silicon substrate - namely the exponential behaviour of the current-voltage characteristic of the PN junction.

PDF of talk.

10th August, 2007. 1315-1415 CS122
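To make the algorithm concrete, here is a minimal sum-product example on a three-variable chain A - B - C with two pairwise factors. The factor tables are invented for illustration: messages flow from the leaf variables towards B, and their product gives B's marginal belief.

```python
# Toy factor graph: binary variables A - B - C joined by two pairwise
# factors. f[x][y] is the (invented) compatibility of the first variable
# taking value x and the second taking value y.
f_ab = [[0.9, 0.1], [0.2, 0.8]]
f_bc = [[0.7, 0.3], [0.4, 0.6]]

def message(factor, incoming):
    """Sum-product message through a pairwise factor: sum out the first
    variable, weighting by the incoming message."""
    return [sum(factor[x][y] * incoming[x] for x in range(2)) for y in range(2)]

def transpose(f):
    return [[f[y][x] for y in range(2)] for x in range(2)]

def normalise(vec):
    total = sum(vec)
    return [v / total for v in vec]

# Leaf variables A and C send uniform messages towards B; multiplying
# the two incoming messages gives B's normalised marginal.
msg_from_a = message(f_ab, [1.0, 1.0])
msg_from_c = message(transpose(f_bc), [1.0, 1.0])
belief_b = normalise([p * q for p, q in zip(msg_from_a, msg_from_c)])
print(belief_b)  # approximately [0.55, 0.45]
```

The multiplications and summations in these messages are what the abstract suggests map naturally onto the exponential current-voltage behaviour of the PN junction.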


Last year's seminars. Seminar Organiser: Jon Timmis