Friday, 28 January 2000, 12:30 pm, CS202J
 
 

Closed Loop Machine Learning applied to Functional Genomics

Stephen Muggleton
Department of Computer Science
University of York

Abstract:

Machine Learning systems that produce human-comprehensible hypotheses from data are being increasingly used for knowledge discovery within both business and science. These systems are typically open loop, with no direct link between the Machine Learning system and the collection of data. The alternative of Closed Loop Machine Learning systems is discussed in this talk. This idea is related to that of "active learning", in which the machine learning system actively selects experiments to discriminate between contending hypotheses. In Closed Loop Machine Learning the system not only selects but also carries out experiments in the learning domain. In particular, the EPSRC Closed Loop Machine Learning project will test the following two hypotheses.

Hypothesis 1: Closed Loop Machine Learning efficiently converges to accurate hypotheses.

Hypothesis 2: Closed Loop Machine Learning systems can be physically realised using robotics and successfully applied to a discovery task in functional genomics.

Initial results of the first year of the project will be presented together with plans for the subsequent two years.



 Friday, 04 February 2000, 12:30 pm, CS202J
 
 

Importance Sampling for Stochastic Logic Programs

James Cussens
Department of Computer Science
University of York

Abstract:

Stochastic logic programs (SLPs) are an "upgrade" of stochastic context-free grammars (SCFGs) where probabilities are defined on refutations as the product of the labels on the clauses used in each refutation. Inference, sampling and learning are all more complex with SLPs than with SCFGs, because of the possibility of unification failures and subsequent backtracking.
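The product-of-labels idea carries over directly from SCFGs: the probability of a derivation is the product of the probabilities on the rules it uses. A minimal sampler for an invented toy SCFG illustrates this (a real SLP would additionally have to handle unification failure and renormalisation, which is exactly where the extra complexity comes from):

```python
import random

# Toy SCFG (invented for this sketch): nonterminal -> list of
# (probability, expansion) rules; symbols absent from the grammar are terminals.
GRAMMAR = {
    "S": [(0.7, ("a", "S")), (0.3, ("b",))],
}

def sample(symbol, rng):
    """Sample a derivation from `symbol`; return (terminals, derivation prob).
    The probability is the product of the labels on the rules used --
    the same product that SLPs attach to refutations."""
    if symbol not in GRAMMAR:
        return [symbol], 1.0
    r, acc = rng.random(), 0.0
    p, expansion = GRAMMAR[symbol][-1]        # fallback against rounding error
    for rule_p, rule_exp in GRAMMAR[symbol]:
        acc += rule_p
        if r <= acc:
            p, expansion = rule_p, rule_exp
            break
    out, prob = [], p
    for sym in expansion:
        sub, sub_p = sample(sym, rng)
        out.extend(sub)
        prob *= sub_p
    return out, prob
```

Every sampled string of the form a...ab has derivation probability 0.7^k * 0.3, where k is the number of a's; importance sampling becomes relevant when, unlike here, some derivations fail and the naive sampler wastes work.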

In this talk I will be (1) giving an overview of some of the existing work on SLPs and related models and (2) describing some ongoing and unfinished work on using "importance sampling" to generate weighted samples from an SLP.



 Friday, 18 February 2000, 12:30pm, CS202J
 
 

    On the Completion of Inverse Entailment for Mutual Recursion and its Application to Self Recursion

Koichi Furukawa
Department of Computer Science, University of York
and Graduate School of Media and Governance, Keio University

Abstract:

Muggleton gave an algorithm to augment his original MSH to make it complete. However, his algorithm failed to keep the augmented MSH sound. In our previous study, we gave another method to achieve completion of the MSH for definite-clause background knowledge, ground-atom positive examples and mutually recursive hypotheses. That algorithm aimed to achieve both completeness and soundness of the MSH. However, we found that it has four drawbacks: 1. it fails to find an appropriate literal to be added when more than one mutual recursion is required in proving the given positive example; 2. it cannot treat cases other than linearly recursive hypotheses; 3. it includes an entailment test which prevents the algorithm from terminating; and 4. it excludes the treatment of self recursion. In this paper, we propose a new algorithm for definite-clause background knowledge, ground-atom positive examples and connected mutually recursive hypotheses which avoids all four of the above defects. It computes a set of maximally specific hypotheses (MaxSH), at least one of which is always subsumed by any sound hypothesis.



Friday, 03 March 2000, 12:30pm, CS202J
 
 

Camera Planning

Jonathan Pickering
Department of Computer Science
University of York

Abstract:

Three-dimensional computer graphics are rendered using a virtual camera located in a model world. Currently these cameras are set up by user input or by simple object-tracking algorithms, approaches that suffer from user disorientation and lack of generality respectively. An alternative approach is to define properties required in a two-dimensional image and use this specification to drive a search for a camera set-up. This declarative approach also has the advantage of providing an interface to other packages that may need to generate 3D graphics.

This seminar will show that camera planning can be treated as a real-valued constrained optimisation problem. Previous attempts at the problem will be described and contrasted. A new approach based on transforming an image description language into constraint and objective functions will be put forward. The constraints are used to reduce the search space before the objectives are minimised using a genetic algorithm.
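The constraints-then-objective pipeline can be sketched in one dimension (all names and numbers below are invented for illustration, and a plain random search stands in for the genetic algorithm): constraints derived from the image description prune the space of candidate camera parameters, and the objective is then minimised over what remains.

```python
import random

# A 1D stand-in for camera placement: the only parameter is the
# camera's distance d to the subject.
def feasible(d):
    # constraint from the image description: stay between 2 and 10 units away
    return 2.0 <= d <= 10.0

def objective(d):
    # objective: the subject's apparent size (~ 1/d) should be about 0.3
    return (1.0 / d - 0.3) ** 2

rng = random.Random(42)
candidates = [rng.uniform(0.1, 20.0) for _ in range(2000)]
pruned = [d for d in candidates if feasible(d)]   # constraints shrink the space
best = min(pruned, key=objective)                 # then the objective is minimised
```

The best candidate lands near d = 1/0.3, i.e. about 3.3 units, well inside the feasible interval.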



Thursday 24 March 2000, 12:30pm, CS202J
 
 

A probabilistic exemplar-based model

Sunil Vadera
Department of Computer Science
University of Salford

Abstract:

A central problem in case based reasoning (CBR) is how to store and retrieve cases.

One approach to this problem is to use exemplar based models, where only the prototypical cases are stored. However, the development of an exemplar based model (EBM) requires the solution of several problems:

(i) how can an EBM be represented?
(ii) given a new case, how can a suitable exemplar be retrieved?
(iii) what makes a good exemplar?
(iv) how can an EBM be learned incrementally?

This seminar presents a new model, called a probabilistic exemplar based model, that addresses these questions. The model utilizes Bayesian networks to develop a suitable representation and uses probabilistic propagation for assessing and retrieving exemplars when a new case is presented. The model learns incrementally by revising the exemplars retained and by updating the conditional probabilities required by the Bayesian network. Results of evaluating the model on three datasets and relationships with other approaches will also be presented.

The seminar will begin with an introduction to CBR and Bayesian networks.

Speaker's homepage: www.salford.ac.uk/isrc/vadera



Friday, 05 May 2000, 12:30pm, CS202J
 
 

Representing and Executing Group Tasks in a Battlefield Simulation

Jeremy Baxter
DERA Malvern

Abstract:

Work on representing the behaviours of vehicles within a virtual-reality battlefield simulation will be presented, and the need for ways of representing and co-ordinating group tasks explained. A framework for supporting group behaviours, based on theories from Multi-Agent Systems, will be described. The framework allows group tasks to be recursively broken down into co-ordinated sub-group tasks and executed by vehicles within a simulation. The framework supports both the planning and execution of group tasks. The way the framework supports the detection of and recovery from collective failures will be demonstrated. We believe that this type of framework is necessary to support the robust execution of multi-vehicle actions within an environment as dynamic as the simulated battlefield. Further work intended to investigate the effects of using the framework will be briefly discussed.



Wednesday, 10 May 2000, 2:00pm, CS103
 
 

Solving NP-Complete Problems: Non-Boolean Satisfiability

Alan M. Frisch
Department of Computer Science
University of York

 Abstract:

State-of-the-art solvers for some NP-complete problems have developed to the point where they can solve problem instances that are large enough to be of practical importance.  Using the Boolean satisfiability problem (SAT) as an example, the first part of this talk comprises a tutorial on two successful solvers and on methods for transforming certain other NP-complete problems to SAT.

Many of the problems on which satisfiability procedures have been effective are non-Boolean in that they are most naturally formulated in terms of variables with domain sizes greater than two. Boolean satisfiability procedures have been applied to these non-Boolean problems by reformulating the problem with Boolean variables. An alternative, previously unexplored approach is to generalise satisfiability procedures to handle non-Boolean formulas directly. The second part of this talk presents our recent research that compares these two approaches.
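The standard "direct" reformulation replaces one variable of domain size d with d Boolean variables plus clauses forcing exactly one of them to be true. A short generator for this textbook encoding (not necessarily the exact encoding compared in the talk):

```python
from itertools import combinations

def direct_encoding(first_var, domain_size):
    """Direct ('one-hot') Boolean reformulation of one non-Boolean variable:
    Boolean variable first_var + v is true iff the variable takes value v.
    Returns CNF clauses as DIMACS-style lists of signed integers."""
    lits = [first_var + v for v in range(domain_size)]
    clauses = [lits[:]]                                      # at least one value
    clauses += [[-a, -b] for a, b in combinations(lits, 2)]  # at most one value
    return clauses
```

For domain size d this produces 1 + d(d-1)/2 clauses; the quadratic at-most-one part is one reason a procedure that handles non-Boolean variables natively can be attractive.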



Friday, 12 May 2000, 12:30, CS202J

TALK 1
-----------

Relative orientation of 2D points: a quantitative, constraint-based model
extending the qualitative Double-Cross calculus

Amar Isli
University of Hamburg

Abstract:

Models of Qualitative Spatial Reasoning (QSR), built according to the "make only as many distinctions as necessary" belief, partition the universe of values into a finite number of regions; elements lying in the same region are qualitatively equal, i.e., indistinguishable. The purpose of this work is not to tackle the foundational arguments of QSR, in which I strongly believe; rather, I also believe that some application domains need a representational model based on a partition of the universe of values that is as fine-grained as possible: that is to say, a quantitative model. The talk presents such a model, a constraint-based model of ternary relations on triples of 2D points; a relation of the model describes the relative, quantitative orientation of its three arguments. Contrary to qualitative models, for which the reasoning task (involving, in the case of ternary relations, the operations of converse, rotation, intersection, and composition) reduces mainly to table look-ups, the reasoning process of a quantitative system must use effective procedures dealing with arithmetic operations (such as adding two real numbers) in order to compute any of the four algebraic operations applied to relations of the model.
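The quantitative measurement underlying such a ternary relation can be sketched as a single signed angle (my illustration only; the talk's relations and their four algebraic operations are richer than this one number):

```python
import math

def orientation(a, b, c):
    """Signed angle (radians, in (-pi, pi]) between the directed line a->b and
    the direction b->c, for 2D points given as (x, y) pairs. The qualitative
    Double-Cross calculus keeps only the coarse region this angle falls in
    (left/right, front/back); a quantitative model keeps the angle itself."""
    d = (math.atan2(c[1] - b[1], c[0] - b[0])
         - math.atan2(b[1] - a[1], b[0] - a[0]))
    while d <= -math.pi:      # normalise into (-pi, pi]
        d += 2.0 * math.pi
    while d > math.pi:
        d -= 2.0 * math.pi
    return d
```

A positive angle is a left turn, a negative one a right turn, and zero means c lies straight ahead on the line through a and b; composing such relations requires real arithmetic rather than a table look-up.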
 

TALK 2:
------------

Dynamic Flexible Constraint Satisfaction and its Application to AI Planning

Ian Miguel
University of Edinburgh

Abstract:

It has become increasingly clear that the original formulation of a static constraint satisfaction problem (CSP) with hard, imperative constraints is insufficient to model many real problems. These shortcomings have recently been addressed via separate extensions known as dynamic CSP and flexible CSP respectively, but little has been done to combine these approaches in order to bring to bear the benefits of both in solving more complex problems. A new integrated algorithm, Flexible Local Changes (FLC), will be presented for solving dynamic flexible constraint satisfaction problems (DFCSPs). The application of DFCSP techniques, in conjunction with focused constraint discovery, to AI Planning will then be discussed. Using flexible constraints as a foundation, classical AI Planning can be extended to incorporate preference-based flexible information. Plan synthesis in this framework can be reduced to solving a hierarchy of DFCSPs. Significant efficiency gains are then possible via the discovery of constraints which are used to guide the FLC algorithm and to effectively prune the search space. Experimental results will be presented, showing the utility of both FLC and the flexible planning algorithm.
 



Friday, 16 June 2000, 12:30, CS202J
 
 

Can Machine Learning techniques help with the Dual Priority Conjecture?

Alan Burns
Department of Computer Science
University of York

Abstract:

Within real-time scheduling, the two classic results for an idealised set of processes are: the 100% utilisation bound for Earliest Deadline First (EDF) scheduling, and the 69% utilisation bound for Rate Monotonic (RM) scheduling. In 1995 I conjectured that Dual Priority (DP) scheduling, which is very similar to RM, can also achieve 100% utilisation. No convincing proof of this conjecture exists, but neither does any counterexample. As a large number of examples can be produced (i.e. the integer parameters that lead to 100% utilisation), could ML techniques help to find the proof?
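The two classic bounds are easy to make concrete: EDF schedules any task set with total utilisation up to 1, while RM is only guaranteed up to the Liu-and-Layland bound n(2^(1/n) - 1), which falls towards ln 2 (about 0.693, the quoted 69%) as the number of tasks grows:

```python
import math

def rm_bound(n):
    """Liu-and-Layland worst-case utilisation bound for RM with n tasks."""
    return n * (2.0 ** (1.0 / n) - 1.0)

def utilisation(tasks):
    """Total utilisation of a set of (computation_time, period) tasks."""
    return sum(c / t for c, t in tasks)

# EDF is guaranteed up to utilisation 1.0; RM only up to rm_bound(n).
# The DP conjecture is that, like EDF, Dual Priority reaches 1.0.
```

For example, the task set {(1, 2), (1, 4), (1, 4)} has utilisation exactly 1.0: schedulable by EDF, above the RM guarantee, and exactly the kind of instance the conjecture concerns.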



Wednesday, 28 June 2000, 4pm, CS202J
 
 

Combinatorics of Refinement

Flaviu Marginean
Department of Computer Science
University of York

Abstract:

Starting with Shapiro's seminal work in the early eighties, refinement has proved pivotal in realising induction in a logic programming context. Classical unary refinement employs operators that make small steps through the subsumption lattice and are non-adaptive. Depending on the operator, this may result in one or more of the following shortcomings: slow convergence to the target, incomplete pruning, redundancy, and severe predetermined restrictions on how the search can proceed. This talk will track my attempts to address these deficiencies. I will attempt to show that adaptive binary refinement has the potential to overcome all four of these disadvantages. I will conclude with further and related work.
 



Tuesday, 18 July 2000, 12:30, CS202J

TALK 1:
------------

Searching the Subsumption Lattice by a Genetic Algorithm

Alireza Tamaddoni-Nezhad
Department of Computer Science
University of York

Abstract:

This talk aims to present a framework for combining Genetic Algorithms with ILP methods. This framework includes a novel binary representation for encoding first-order clauses and the relevant genetic operators. It is shown that the proposed representation encodes a subsumption lattice in a complete and compact way. It is also shown that the proposed genetic operators are meaningful and can be interpreted in ILP terms such as lgg (least general generalisation) and mgi (most general instance). These operators can be used to explore a subsumption lattice efficiently by performing bitwise operations (e.g. AND/OR). An implementation of the proposed framework is used to combine the Inverse Entailment of CProgol with a genetic search.
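The flavour of the lattice operators can be shown with plain integers (an illustration only, not the talk's actual encoding): if each bit records whether a given candidate literal is present in a clause, then bitwise AND keeps the shared literals, a generalisation of both clauses analogous to lgg, while bitwise OR keeps their union, a common specialisation analogous to mgi.

```python
# Bit i of an encoded clause = "literal i of the bottom clause is present"
# (a set-based caricature of the representation described in the abstract).
def lgg_like(clause1, clause2):
    return clause1 & clause2   # literals common to both: a common generalisation

def mgi_like(clause1, clause2):
    return clause1 | clause2   # all literals of either: a common specialisation
```

For instance, 0b10110 and 0b11010 meet at 0b10010 and join at 0b11110; a genetic algorithm can apply such operations to whole populations of encoded clauses very cheaply.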

TALK 2:
------------

Ontology-based Constructive Induction

Daniel Kudenko
Department of Computer Science
University of York

Abstract:

To date most constructive induction methods rely on a set of pre-defined features in order to generate a new feature representation or extend the existing one. Nevertheless, an inadequate existing feature set may not be significantly improved by feature generation, construction, or selection methods. We argue that going directly from domain expertise to a feature set is often not a natural process and may lead to an inadequate representation, and recommend the use of ontologies as an intermediate step.


Friday, 6 October 2000, 12:45, CS202J

Stochastic Constraint Programming

Toby Walsh
Department of Computer Science
University of York

Abstract:

Many real-world problems contain uncertainty. Data about events in the past may not be known exactly due to errors in measuring or difficulties in sampling, whilst data about events in the future may simply not be known with certainty. To deal with such situations, I propose an extension of constraint programming called "stochastic constraint programming". In this framework, we distinguish between decision variables, which we are free to set, and stochastic (or observed) variables, which follow some probability distribution. Stochastic constraint programming shows promise for modelling a variety of decision problems involving uncertainty, such as production planning and staff rostering.
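The decision/stochastic split can be seen in a toy chance-constrained problem (the scenario and numbers are invented for this sketch): choose a production level x, the decision variable, to cover a stochastic demand d so that the constraint x >= d holds with probability at least 0.8, minimising x.

```python
from fractions import Fraction

# Stochastic (observed) variable: demand d, uniform over {1, 2, 3, 4}.
demand_dist = {d: Fraction(1, 4) for d in (1, 2, 3, 4)}

def satisfaction_prob(x):
    """Probability, over the demand distribution, that x >= d holds."""
    return sum(p for d, p in demand_dist.items() if x >= d)

# Decision variable: production level x in 0..5; keep the values meeting the
# chance constraint P(x >= d) >= 0.8, then minimise.
feasible = [x for x in range(0, 6) if satisfaction_prob(x) >= Fraction(4, 5)]
best = min(feasible)
```

Here x = 3 covers demand only with probability 3/4, so the smallest feasible level is x = 4; a real solver would search the decision variables while evaluating constraints against the distributions rather than enumerating.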


Wednesday, 18 October 2000, 2:00, CS103          (Departmental Seminar)

The Art of Modelling

Barbara Smith
Department of Computer Science
University of Leeds

Abstract:

A constraint satisfaction problem (CSP) consists of a set of variables each of which must be assigned one of a finite set of values, subject to constraints.  CSPs can represent a great variety of problems, and the transformation into a CSP is often more straightforward than in integer linear programming, for instance, since the constraints allowed are much more general. Thus, finding a correct model, in the sense that solutions to the CSP will give solutions to the original problem, may be easy.  However, finding an effective model, which will allow solutions to be found reasonably quickly, is often difficult.

Several issues to be considered in modelling will be discussed.

Hence although constraint programming offers in principle a straightforward way of solving appropriate problems, in practice modelling remains an art.


Friday, 20 October 2000, 12:30, CS202J

Experiments with McCarthy's Appearance and Reality Problem

Stephen Muggleton
Department of Computer Science
University of York

Abstract:

In 1999 John McCarthy placed a challenge to the Machine Learning community on his web pages. The problem involves a circle of letters with associated buttons that can be pushed. The challenge is to develop (or apply) a machine learning program which carries out a set of button pushes and then provides an explanatory model for the behaviours of the letters. A partially successful attempt on this problem using the Inductive Logic Programming system Progol will be described.


Friday, 3 November 2000, 12:30, CS202J

NP chunking with transformation lists

Enrique Alfonseca
Department of Computer Science
University of York

Abstract:

NP chunking consists of detecting simple, non-recursive noun phrases. This task can be useful, for example, as a preprocessing stage before a full parser, or for other tasks such as Information Extraction, Retrieval or Summarisation. Several approaches have been proposed, none of which has obtained accuracy much better than around 93 percent. In this talk I will present a possible solution for this task that attains accuracy similar to that of previous approaches, and discuss possible ways to improve it.
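A transformation-list approach in the Brill style (the rules and tag sets below are invented for illustration, not the talk's learned list) first assigns each token an initial IOB-style chunk tag from its part of speech, then applies an ordered list of context-sensitive transformations that patch the initial guesses:

```python
def initial_tags(pos_tags):
    """Naive baseline: determiners, adjectives and nouns are inside an NP."""
    return ["I" if p in ("DT", "JJ", "NN") else "O" for p in pos_tags]

# Each transformation: (POS at this position, tag at previous position, new tag).
RULES = [
    ("DT", "I", "B"),   # a determiner following NP material begins a new chunk
]

def apply_transformations(pos_tags, tags):
    """Apply the ordered transformation list to patch the initial tags."""
    tags = list(tags)
    for pos_trigger, prev_tag, new_tag in RULES:
        for i in range(1, len(tags)):
            if pos_tags[i] == pos_trigger and tags[i - 1] == prev_tag:
                tags[i] = new_tag
    return tags
```

On "the cat the dog" (POS DT NN DT NN) the baseline merges everything into one chunk, and the single rule splits it into two: I I B I. Learning consists of greedily picking, at each step, the transformation that most reduces errors on the training data.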


Friday, 17 November 2000, 12:30, CS202J
 
 

Evolution and Learning in a Multi-Agent Environment

Dimitar Kazakov
Department of Computer Science
University of York

Abstract:

Motto for the seminar:
"The people building physical robots learned nothing." Marvin Minsky, as quoted in HAL's Legacy, MIT Press, 1996.
URL: http://mitpress.mit.edu/e-books/Hal/contents.html

Keywords: Multi-agent systems, learning, evolution, real-time and time-critical control

Part A: The seminar will present ongoing work in the development of a generic environment for the simulation of evolution and learning in a multi-agent setting. The environment contains objects of a suitable programming language (Java) representing members of several animal species. Each of these agents will have a physical projection on a 2D grid (also containing plants/food and obstacles), and a mind to store and manipulate information represented in logic (Progol). The agents collect sensory experience through a range of senses; these experiences are stored in the mind for further processing, and the agents' effectors can respond to mental commands. There are stationary elements in the environment such as food and obstacles.

Once fully developed, the Alife environment could be used:

  1. to collect sensory data from several agents sharing the same environment;
  2. to simulate the evolution and social behaviour of agents collecting and processing sensory information in a changing environment in which several agents compete and/or collaborate for the use of limited food resources;
  3. as Progol has been specifically designed to allow symbolic communication between agents, the setting can be easily extended to allow for both physical interaction and communication between agents to be studied from an evolutionary perspective.

Issues related to the use of adaptive vs hard-wired behaviour in real-time systems will be raised and discussion encouraged.

Part B: The second part of the talk will present how the above-mentioned environment is being used to assess the role of a hypothetically inherited feature (gene) promoting altruism between relatives as a factor for survival.

The approach will simulate an environment in which two species, A and B, randomly seek food, competing for the same food resources. Food consumption results in an increased individual energy level. Falling below a certain energy level means death. An encounter of two individuals of the same species results in the creation of an offspring if (1) their energy levels are sufficient and (2) they meet each other's criteria based on present and past energy levels. The initial energy level of the offspring is subtracted from that of the parents.

Some of the individuals of species A will be carriers of a gene forcing them to share their energy with the individuals they meet, in proportion to the degree of kinship (shared genes). For example, identical twins would always make their energy levels equal. The gene will be passed with a certain probability from parent to child.
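One natural reading of this sharing rule (a sketch; the simulation's exact formula may differ) is that each party moves a fraction r of the way towards the pair's mean energy, where r in [0, 1] is the degree of kinship, so that identical twins (r = 1) always end up with equal levels:

```python
def share_energy(e1, e2, kinship):
    """Energy sharing in proportion to kinship: each individual moves a
    fraction `kinship` of the way towards the pair's mean energy.
    kinship = 1.0 (identical twins) equalises the levels; kinship = 0.0
    (unrelated) leaves them unchanged."""
    mean = (e1 + e2) / 2.0
    return (e1 + kinship * (mean - e1), e2 + kinship * (mean - e2))
```

For energies 10 and 2, twins end at (6, 6), half-kin (r = 0.5) at (8, 4), and unrelated individuals keep (10, 2); note the rule conserves total energy, so it costs the carrier without costing the population.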

Evaluation will assess the progressive change of the following two parameters:

  1. the ratio between individuals of species A and B;
  2. the proportion of gene carriers within species A.


Wednesday, 22 November 2000, 2:00, CS103

Distinguished Lecturer in Knowledge Representation and Reasoning

Understanding Complexity: Recent Developments and Directions

Bart Selman
Department of Computer Science
Cornell University

Abstract:

Recently, we have made considerable progress towards a better understanding of the nature of hard computational problems. In particular, by exploiting  connections between combinatorial search problems and models from statistical physics, we now have methods that enable a  much finer-grained characterization of computational complexity than the standard worst-case complexity measures. These findings provide insights into new algorithmic strategies based on randomization and distributed algorithm portfolios. I will survey the recent progress in this area and I will discuss implications for the design of more robust computational systems.



Wednesday, 6 December 2000, 12:30, CS202J
 

A Knowledge-Biased Approach to Information Agents

Leon Sterling
Department of Computer Science and Software Engineering
University of Melbourne
Australia

Abstract:

The talk will consist of two halves. The first half will give an overview of research in the Intelligent Agents Lab at the University of Melbourne. The second half will give an overview of specific attempts to build information agents in the domains of sports scores, classified ads (including flats and cars), and legal cases.



Friday, 8 December 2000, 12:30, CS202J

Knowledge Discovery in Hybrid Knowledge Representation Systems

Stefan Schlobach
Department of Computer Science
King's College, London

Abstract:

The Group of Logic and Computation at King's College London is currently implementing hybrid knowledge representation systems based on description logics. These systems are called hybrid because conceptual knowledge about properties of classes of objects and world knowledge about individuals are separated into the so-called T(erminological)-Box and A(ssertional)-Box. The knowledge is represented using Description Logics, which are extensions of propositional logic with relations and modalities over these relations; a well-defined formal semantics is given and optimised decision tools are available.

In the talk I will present a method based on logical algorithms (tableaux, resolution) for discovering terminological information from assertions. I will show that our method has close similarities to the decision-matrix-based rough set data mining method first presented by Skowron. In fact, we show that the matrix method is a special case of our method for a very restricted logical language. The significance of our approach lies in the possibility of extending methods similar to common rough set data mining methods to much more expressive languages, and in the inclusion of expert knowledge in the data mining process.


Last updated on 10 March 2011