First Workshop of the
NICTA Technical Focus Group
on Optimization.

Optimization Day

Kensington Laboratory
223 Anzac Parade, Sydney

May 30, 2006


Index:

Overview

Schedule

Location

Registration

Organizer

Overview

In a highly competitive world, only the best is good enough. Best might be one of many things: the cheapest, quickest, safest, least wasteful, .... Optimization techniques are therefore useful to government and business alike. Optimization techniques cover the continuous (e.g. how can my robot arm paint this car as quickly as possible?) and the discrete (e.g. how many planes do I need to fly this schedule?).

Within NICTA, there is a wide range of programs and projects using and developing optimization techniques. This one-day workshop will bring together many of the researchers involved, to facilitate the exchange of ideas, to encourage interaction, and to identify new areas for synergy. In addition, a few selected friends from outside NICTA will be invited, to showcase a little of what is going on within NICTA in this area.


Schedule

9.00-10.00. Invited talk. Prof. Mark Wallace (Monash and NICTA Fellow)
3 Applications of Optimization in Transportation (and 3 benefits of constraint programming)
Constraint programming offers a toolbox of problem-solving techniques that enables us to separate problem definition from problem solving, and to experiment with the best mixture of techniques for the problem at hand. This is very useful for (typical) applications where the business problem and process change, even while the solution is under development. We discuss three applications - in logistics, transport scheduling, and emergency dispatch - and show how they were solved using a rapid application development methodology that goes hand-in-hand with constraint programming.

10.00-10.30. Rapid Stochastic Gradient Descent. Simon Guenter.
The incorporation of online learning capabilities into real-time computing systems has been hampered by a lack of efficient, scalable optimization algorithms for this purpose: second-order methods are too expensive for large, nonlinear models, conjugate gradient does not tolerate the noise inherent in online learning, and simple gradient descent, evolutionary algorithms, etc., are unacceptably slow to converge. We are addressing this problem by developing new ways to accelerate stochastic gradient descent, using second-order gradient information obtained through the efficient computation of Hessian-vector products.
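
As a concrete illustration of the last step, Hessian-vector products can be obtained without ever forming the Hessian. The sketch below uses the simple central-difference approximation Hv ~ (g(w + eps*v) - g(w - eps*v)) / (2*eps) on an illustrative squared-loss linear model; both the model and the loss_grad name are assumptions for the example, not the talk's actual system (which may use Pearlmutter's exact R-operator instead).

```python
import numpy as np

def loss_grad(w, x, y):
    """Gradient of the squared loss (w.x - y)^2 for a linear model.
    An illustrative stand-in for the model's gradient oracle."""
    return 2.0 * x * (np.dot(w, x) - y)

def hessian_vector_product(w, v, x, y, eps=1e-6):
    """Approximate H @ v by central differences of the gradient:
    Hv ~= (g(w + eps*v) - g(w - eps*v)) / (2*eps).
    Costs two gradient evaluations and never materialises H."""
    g_plus = loss_grad(w + eps * v, x, y)
    g_minus = loss_grad(w - eps * v, x, y)
    return (g_plus - g_minus) / (2.0 * eps)
```

The product can then drive second-order acceleration of stochastic gradient descent at a cost comparable to an ordinary gradient step.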

10.30-11.00. Coffee.

11.00-11.30. The Cost of Search. Phil Kilby.
Whilst waiting for a systematic search procedure like branch and bound to finish, you might ask yourself a number of questions. Is my search procedure just about to come back with an answer, or has it taken a wrong turn? Should I go for coffee and expect to find the answer on my return? Is it worth leaving this to run overnight, or should I just quit as this search is unlikely ever to finish? To help answer such questions, we propose some new online methods for estimating the size of a backtracking search tree.
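
For background, the classic offline baseline that such online estimators build on is Knuth's (1975) random-probing estimator: walk one random root-to-leaf path, multiplying the branching factors seen, and average over many probes. A minimal sketch, where the children callback is a hypothetical interface rather than anything from the talk:

```python
import random

def estimate_tree_size(root, children, probes=100, rng=None):
    """Knuth-style estimate of a search tree's node count.
    Each probe follows one random root-to-leaf path; the running
    product of branching factors estimates the number of nodes at
    each depth, and their sum over the path is an unbiased
    estimate of the total tree size."""
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(probes):
        prod, count, node = 1, 1, root
        kids = children(node)
        while kids:
            prod *= len(kids)   # estimated node count at this depth
            count += prod
            node = rng.choice(kids)
            kids = children(node)
        total += count
    return total / probes
```

On a perfectly balanced tree every probe returns the exact size; on real backtrack trees the variance can be large, which is precisely what motivates better online estimates.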

11.30-12.00. Optimization of Clustering Problems. Wayne Pullan.
Clustering problems occur in a number of domains. This talk describes a Population Based Search (PBS) heuristic which has been successfully applied to clustering problems in both computational chemistry (atomic clusters) and graphs (the p-median problem). The PBS heuristic uses a genetic-algorithm-based metaheuristic, built primarily on cut-and-paste crossover operators, to generate new starting points for a deterministic local search. For larger atomic cluster and p-median instances, PBS is able to effectively utilise any number of computer processors.
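
To make the local-search half concrete, a standard move for the p-median problem is the swap (interchange) step: exchange one open median for one closed site whenever that lowers the assignment cost. The sketch below shows only this deterministic local search, not the PBS crossover machinery that seeds it; the distance matrix in the usage example is illustrative.

```python
def pmedian_cost(dist, medians):
    """Total distance from each client to its nearest open median."""
    return sum(min(dist[i][m] for m in medians) for i in range(len(dist)))

def swap_local_search(dist, medians):
    """First-improvement swap search: repeatedly exchange an open
    median for a closed site while that lowers the total cost."""
    medians = set(medians)
    n = len(dist)
    improved = True
    while improved:
        improved = False
        for out in list(medians):
            for inn in range(n):
                if inn in medians:
                    continue
                cand = (medians - {out}) | {inn}
                if pmedian_cost(dist, cand) < pmedian_cost(dist, medians):
                    medians, improved = cand, True
                    break
            if improved:
                break
    return medians
```

Because each swap is accepted only on strict improvement, the search terminates at a local optimum; in a PBS-style scheme the crossover operators would then generate fresh starting points for further descents.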

12.00-12.30. Convex Optimization in Machine Learning: Opportunities and Problems. Alex Smola.
I will give a brief overview of convex optimization problems in the areas of regression, classification, and estimation using undirected graphical models. This allows us to provide an efficient unified framework for many estimation problems. Due to their size, the problems are often difficult to solve exactly. I will discuss methods for solving them approximately.

12.30-13.00. Layered Learning (L2). Bernhard Hengst.
Markov Decision Problems (MDPs) can model many optimization problems but in practice suffer from what Bellman has called the "curse of dimensionality". When problem domains are constrained or have structure, it may nevertheless be possible to decompose problems and still find good tractable solutions. In this talk I will explore the decomposition of MDPs into subtask hierarchies and discuss how this affects optimality. The L2 (layered learning) project is researching the possibility of autonomous machines modelling complex environments by building and learning subtask hierarchies layer by layer.
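
For reference, the flat baseline that such hierarchies decompose is ordinary value iteration, whose cost grows with the full joint state space - Bellman's curse of dimensionality. A minimal sketch on a hypothetical two-state MDP, not any model from the L2 project:

```python
def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Flat value iteration: V(s) <- max_a [ R[s][a] +
    gamma * sum_t P[s][a][t] * V(t) ], iterated to convergence.
    P[s][a][t] is the transition probability, R[s][a] the reward."""
    n = len(R)
    V = [0.0] * n
    while True:
        V_new = [max(R[s][a] + gamma * sum(P[s][a][t] * V[t] for t in range(n))
                     for a in range(len(R[s])))
                 for s in range(n)]
        if max(abs(a - b) for a, b in zip(V, V_new)) < tol:
            return V_new
        V = V_new
```

Subtask hierarchies attack the same fixed point by solving smaller MDPs over abstracted states and composing their policies, trading some optimality for tractability.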

13.00-14.00. Lunch.


Location

The workshop will take place in the Level 1 seminar room of NICTA's Kensington Neville Roach Laboratory (otherwise known as L5), 223 Anzac Parade, Sydney.

Registration

To participate in the Optimization Day, please email Toby Walsh (toby.walsh@nicta.com.au) as soon as possible. If you wish to give a talk, please send a title at the same time.

Organizer

Professor Toby Walsh, National ICT Australia and University of NSW.



tw@cse.unsw.edu.au