Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
8794; Tue, 09 Nov 93 16:23:43 EST
Received: from noc4.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
TCP; Tue, 09 Nov 93 16:23:23 EST
Received: from CATTELL.PSYCH.UPENN.EDU by noc4.dccs.upenn.edu
id AA03858; Tue, 9 Nov 93 16:03:59 -0500
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
id AA05479; Tue, 9 Nov 93 15:17:29 EST
Posted-Date: Tue, 09 Nov 93 15:16:44 EST
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #12 (book, jobs, misc, benchmarks)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Tue, 09 Nov 93 15:16:44 EST
Message-Id: <5460.752876204@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Tuesday, 9 Nov 1993
Volume 12 : Issue 12
Today's Topics:
Santa Fe Time Series Competition book out
Hidden layer representations
Contact request
Post-doc at Purdue
Benchmarks - Summary of Responses
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Santa Fe Time Series Competition book out
From: weigend@sabai.cs.colorado.edu
Date: Fri, 22 Oct 93 01:37:55 -0700
Announcing book on the results of the Santa Fe Time Series Competition:
____________________________________________________________________
Title: TIME SERIES PREDICTION:
Forecasting the Future and Understanding the Past.
Editors: Andreas S. Weigend and Neil A. Gershenfeld
Publisher: Addison-Wesley, September 1993.
Paperback ISBN 0-201-62602-0 US$32.25 (672 pages)
Hardcover ISBN 0-201-62601-2 US$49.50 (672 pages)
The rest of this message gives some background,
ordering information, and the table of contents.
____________________________________________________________________
Most observational disciplines, such as physics, biology, and finance,
try to infer properties of an unfamiliar system from the analysis of a measured
time record of its behavior. There are many mature techniques associated with
traditional time series analysis. However, during the last decade, several new
and innovative approaches have emerged (such as neural networks and time-delay
embedding), promising insights not available with these standard methods.
Unfortunately, the realization of this promise has been difficult.
Adequate benchmarks have been lacking, and much of the literature has been
fragmentary and anecdotal.
This volume addresses these shortcomings by presenting the results of a
careful comparison of different methods for time series prediction and
characterization. This breadth and depth were achieved through the Santa Fe
Time Series Prediction and Analysis Competition, which brought together an
international group of time series experts from a wide variety of fields to
analyze data from the following common data sets:
- A physics laboratory experiment (NH3 laser)
- Physiological data from a patient with sleep apnea
- Tick-by-tick currency exchange rate data
- A computer-generated series designed specifically for the Competition
- Astrophysical data from a variable white dwarf star
- J. S. Bach's last (unfinished) fugue from "Die Kunst der Fuge."
In bringing together the results of this unique competition, this volume serves
as a much-needed survey of the latest techniques in time series analysis.
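To make the time-delay embedding mentioned above concrete: the embedding turns a scalar series into short vectors of lagged values, which any regression method can then map to a later sample. A minimal sketch in Python/NumPy (the dimension m, lag tau, and toy series are arbitrary illustrative choices, not taken from the book):

```python
import numpy as np

def delay_embed(x, m=3, tau=1):
    """Stack (x[t], x[t+tau], ..., x[t+(m-1)*tau]) as rows, so each
    row is a delay vector usable as a predictor input."""
    x = np.asarray(x)
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

# Toy series; for prediction one would pair each row of X with a
# later sample of x as the target.
x = np.sin(0.3 * np.arange(100))
X = delay_embed(x, m=3, tau=2)
print(X.shape)  # (96, 3)
```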
Andreas Weigend received his Ph.D. from Stanford University
and was a postdoc at Xerox PARC. He is Assistant Professor in
the Computer Science Department and at the Institute of
Cognitive Science at the University of Colorado at Boulder.
Neil Gershenfeld received his Ph.D. from Cornell University
and was a Junior Fellow at Harvard University. He is Assistant
Professor at the Media Lab at MIT.
____________________________________________________________________
Order it through your bookstore, or directly from the publisher by
- calling the Addison-Wesley Order Department at 1-800-358-4566,
- faxing 1-800-333-3328,
- emailing <marcuss@world.std.com>, or
- writing to Advanced Book Marketing
Addison-Wesley Publishing
One Jacob Way
Reading, MA 01867, USA.
VISA, Mastercard, American Express, and checks are accepted. When you
prepay by check, Addison-Wesley pays shipping and handling charges. If
payment does not accompany your order, shipping charges will be added
to your invoice. Addison-Wesley is required to remit sales tax to the
following states: AZ, AR, CA, CO, CT, FL, GA, IL, IN, LA, ME, MA, MI,
MN, NY, NC, OH, PA, RI, SD, TN, TX, UT, VT, WA, WV, WI.
_____________________________________________________________________
TABLE OF CONTENTS
xv Preface
Andreas S. Weigend and Neil A. Gershenfeld
1 The Future of Time Series: Learning and Understanding
Neil A. Gershenfeld and Andreas S. Weigend
Section I. DESCRIPTION OF THE DATA SETS__________________________________
73 Lorenz-Like Chaos in NH3-FIR Lasers
Udo Huebner, Carl-Otto Weiss, Neal Broadus Abraham, and Dingyuan Tang
105 Multi-Channel Physiological Data: Description and Analysis
David R. Rigney, Ary L. Goldberger, Wendell C. Ocasio, Yuhei Ichimaru,
George B. Moody, and Roger G. Mark
131 Foreign Currency Dealing: A Brief Introduction
Jean Y. Lequarre
139 Whole Earth Telescope Observations of the White Dwarf Star (PG1159-035)
J. Christopher Clemens
151 Baroque Forecasting: On Completing J.S. Bach's Last Fugue
Matthew Dirst and Andreas S. Weigend
Section II. TIME SERIES PREDICTION________________________________________
175 Time Series Prediction by Using Delay Coordinate Embedding
Tim Sauer
195 Time Series Prediction by Using a Connectionist Network with Internal Delay
Lines
Eric A. Wan
219 Simple Architectures on Fast Machines: Practical Issues in Nonlinear Time
Series Prediction
Xiru Zhang and Jim Hutchinson
243 Neural Net Architectures for Temporal Sequence Processing
Michael C. Mozer
265 Forecasting Probability Densities by Using Hidden Markov Models with Mixed
States
Andrew M. Fraser and Alexis Dimitriadis
283 Time Series Prediction by Using the Method of Analogues
Eric J. Kostelich and Daniel P. Lathrop
297 Modeling Time Series by Using Multivariate Adaptive Regression Splines
(MARS)
P.A.W. Lewis, B.K. Ray, and J.G. Stevens
319 Visual Fitting and Extrapolation
George G. Lendaris and Andrew M. Fraser
323 Does a Meeting in Santa Fe Imply Chaos?
Leonard A. Smith
Section III. TIME SERIES ANALYSIS AND CHARACTERIZATION___________________
347 Exploring the Continuum Between Deterministic and Stochastic Modeling
Martin C. Casdagli and Andreas S. Weigend
367 Estimating Generalized Dimensions and Choosing Time Delays: A Fast
Algorithm
Fernando J. Pineda and John C. Sommerer
387 Identifying and Quantifying Chaos by Using Information-Theoretic
Functionals
Milan Palus
415 A Geometrical Statistic for Detecting Deterministic Dynamics
Daniel T. Kaplan
429 Detecting Nonlinearity in Data with Long Coherence Times
James Theiler, Paul S. Linsay, and David M. Rubin
457 Nonlinear Diagnostics and Simple Trading Rules for High-Frequency Foreign
Exchange Rates
Blake LeBaron
475 Noise Reduction by Local Reconstruction of the Dynamics
Holger Kantz
Section IV. PRACTICE AND PROMISE_________________________________________
493 Large-Scale Linear Methods for Interpolation, Realization, and
Reconstruction of Noisy, Irregularly Sampled Data
William H. Press and George B. Rybicki
513 Complex Dynamics in Physiology and Medicine
Leon Glass and Daniel T. Kaplan
529 Forecasting in Economics
Clive W.J. Granger
539 Finite-Dimensional Spatial Disorder: Description and Analysis
V.S. Afraimovich, M.I. Rabinovich, and A.L. Zheleznyak
557 Spatio-Temporal Patterns: Observations and Analysis
Harry L. Swinney
569 Appendix: Accessing the Server
571 Bibliography (800 references)
631 Index
------------------------------
Subject: Hidden layer representations
From: garry_k <G.Kearney@greenwich.ac.uk>
Date: Fri, 22 Oct 93 12:49:43 +0000
As a newcomer to NNs I would value direction in the area of how we can
discover real-world representations in the layers of the net. I would
appreciate some guidance as to reading matter on this. I understand
that PCA and cluster analysis are usually involved. Are these the only
methods? What applications derive from discovering these
representations? Thanks.
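For a concrete starting point, here is a rough Python/NumPy sketch of the approach the poster asks about: run PCA on the matrix of hidden-unit activations (one row per input pattern), then cluster the projections. The synthetic activations, cluster count, and initialization below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "hidden layer" activations: 100 patterns x 8 hidden units,
# built with two underlying groups so there is structure to find.
h = np.vstack([
    rng.normal(0.2, 0.05, size=(50, 8)),
    rng.normal(0.8, 0.05, size=(50, 8)),
])

# PCA via SVD of the mean-centered activation matrix.
centered = h - h.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:2].T          # coordinates on the first two PCs

# Crude 2-means clustering of the PC scores (the "cluster analysis" step).
centers = scores[[0, 50]].copy()      # one seed point from each half
for _ in range(10):
    labels = np.argmin(((scores[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in (0, 1)])

print(np.bincount(labels))            # sizes of the recovered clusters
```

Plotting `scores` and coloring the points by input category is the usual way to see whether the hidden layer has organized the inputs into meaningful groups.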
------------------------------
Subject: Contact request
From: Rowan Limb <rlimb@hfnet.bt.co.uk>
Date: Fri, 22 Oct 93 15:31:21 +0000
I have a copy of an abstract submitted to the International Neural Network
Conference held in Paris in July 1990 (INNC-90) but no full paper was
submitted. I would like further information on this submission and/or
a contact address (email if possible) for the authors. The details are:
Title: Associative Relational Database: Design and Implementation
Authors: Vladimir Cherkassky & Michael J Endrizzi
Dept. of Electrical Engineering & Dept. of Computer Science
University of Minnesota
Minneapolis, MN 55455 USA
Thanks in advance,
Rowan Limb
Decision Support Systems
BT Laboratories
Martlesham Heath
IPSWICH IP5 7RE
England
email: limb_p_r@bt-web.bt.co.uk
or: rlimb@hfnet.bt.co.uk
------------------------------
Subject: Post-doc at Purdue
From: Frank Doyle <fdoyle@ecn.purdue.edu>
Date: Fri, 22 Oct 93 12:12:30 -0600
Postdoctoral position available in :
NEURO-MODELING
in the Department of Chemical Engineering, Purdue University
Position for 2 years (beginning Fall 1993; salary: $25,000 per year).
Subject: Neuro-modeling of blood pressure control
This project is part of an interdisciplinary program involving
industrial and academic participants from DuPont, Purdue University,
the University of Pennsylvania, and Louisiana State University. The
program encompasses the disciplines of chemical engineering, automatic
control, and neuroscience. Active interaction with engineers in the
nonlinear control and modeling community at Purdue and DuPont, as well
as with the neuroscientists at DuPont and Penn, will be necessary for
the success of the project. A strong background in neuro-modeling is
required. The facilities at Purdue include state-of-the-art
computational workstations (HP 735s and Sun 10/41s).
The postdoctoral candidate will work on the development of models of
the control mechanisms responsible for blood pressure regulation. The
neural system under investigation is the cardiorespiratory control
system, which integrates sensory information on respiratory and
cardiovascular variables to regulate and coordinate cardiac, vascular
and respiratory activity. To better understand this system, our
program combines neurobiological research with computational modeling.
In effect, these results reverse-engineer neuronal and systems
function, which can have implications for engineering applications;
our first interest is in applications to chemical engineering. The
overall effort involves neurobiologists, chemical engineers, computer
scientists, bioengineers, and neural systems modelers. The present
position is meant to contribute to the interaction between neural
systems modeling and chemical engineering.
The neural computational-modeling work is progressing at several
levels: (1) systems-level modeling of the closed-loop
cardiorespiratory system, (2) cellular-level modeling of nonlinear
computation in Hodgkin-Huxley-style neuron models, and (3) network
modeling of networks built up from HH-style neurons, incorporating
channel kinetics and synaptic conductances to capture the mechanisms
of the baroreceptor vagal reflex. The macroscopic model will be used (in
conjunction with experimental data from the literature and from the
laboratory of Dr. Schwaber) in developing structures to represent the
control functions. The synaptic level modeling activities will be used
in developing the building blocks which achieve the control function.
The present position will focus on research goals, under the
supervision of Dr. Frank Doyle, that include the identification of
novel control and modeling techniques.
Interested candidates should send their curriculum vitae to BOTH:
Prof. Francis J. Doyle III
School of Chemical Engineering
Purdue University
West Lafayette, IN 47907-1283
(317) 497-9228
E-mail: fdoyle@ecn.purdue.edu
&
Dr. James Schwaber
Neural Computation Group
E.I. DuPont deNemours & Co., Inc.
P.O. Box 80352
Wilmington, DE 19880-0352
(302) 695-7136
E-mail: schwaber@eplrx7.es.duPont.com
------------------------------
Subject: Benchmarks - Summary of Responses
From: stjaffe@vaxsar.vassar.edu (steve jaffe)
Date: 22 Oct 93 14:18:47 -0500
Thanks to those who responded to my request for information on collections
of benchmarks with which to test and compare various nn architectures and
algorithms. Specific thanks to Nadine Tschichold-Guerman
<nadine@ifr.ethz.ch>, John Reynolds <reynolds@cns.bu.edu>, Tim Ross
<ross@toons.aar.wpafb.af.mil>, and Peter G. Raeth
<raethpg%wrdc.dnet@wl.wpafb.af.mil>. I list below their specific
recommendations along with others I have discovered.
Most correspondents mentioned the UCI database, and it would seem to be the
largest and best-known such collection. It is, not surprisingly, also
listed in the FAQ for comp.ai.neural-nets.
=====================================
1. From "FAQ for comp.ai.neural-nets":
written by: Lutz Prechelt (email: prechelt@ira.uka.de)
(Note: the current FAQ can be obtained by ftp from
rtfm.mit.edu. Look in the anonymous ftp directory "/pub/usenet/news.answers")
Question A19: Databases for experimentation with NNs?
[are there any more?]
1. The nn-bench Benchmark collection
accessible via anonymous FTP on
"pt.cs.cmu.edu"
in directory
"/afs/cs/project/connect/bench"
or via the Andrew file system in the directory
"/afs/cs.cmu.edu/project/connect/bench"
In case of problems email contact is "nn-bench-request@cs.cmu.edu".
Data sets currently available are:
nettalk Pronunciation of English words.
parity N-input parity.
protein Prediction of secondary structure of proteins.
sonar Classification of sonar signals.
two-spirals Distinction of a twin spiral pattern.
vowel Speaker-independent recognition of vowels.
xor Traditional xor.
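Of the data sets above, parity and xor are small enough to generate on the spot; a sketch (the 0/1 encoding and the choice of N are whatever the experimenter wants, not fixed by the benchmark collection):

```python
import numpy as np
from itertools import product

def parity_dataset(n):
    """All 2^n binary input vectors, with target 1 iff an odd number
    of inputs are on; n == 2 reduces to the traditional xor task."""
    X = np.array(list(product([0, 1], repeat=n)))
    y = X.sum(axis=1) % 2
    return X, y

X, y = parity_dataset(2)
print(list(y))  # [0, 1, 1, 0]
```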
2. UCI machine learning database
accessible via anonymous FTP on
"ics.uci.edu" [128.195.1.1]
in directory
"/pub/machine-learning-databases"
3. NIST special databases of the National Institute Of Standards
And Technology:
NIST special database 2:
Structured Forms Reference Set (SFRS)
The NIST database of structured forms contains 5,590 full page images
of simulated tax forms completed using machine print. THERE IS NO REAL
TAX DATA IN THIS DATABASE. The structured forms used in this database
are 12 different forms from the 1988 IRS 1040 Package X. These
include Forms 1040, 2106, 2441, 4562, and 6251 together with Schedules
A, B, C, D, E, F and SE. Eight of these forms contain two pages or
form faces making a total of 20 form faces represented in the
database. Each image is stored in bi-level black and white raster
format. The images in this database appear to be real forms prepared
by individuals but the images have been automatically derived and
synthesized using a computer and contain no "real" tax data. The entry
field values on the forms have been automatically generated by a
computer in order to make the data available without the danger of
distributing privileged tax information. In addition to the images
the database includes 5,590 answer files, one for each image. Each
answer file contains an ASCII representation of the data found in the
entry fields on the corresponding image. Image format documentation
and example software are also provided. The uncompressed database
totals approximately 5.9 gigabytes of data.
NIST special database 3:
Binary Images of Handwritten Segmented Characters (HWSC)
Contains 313,389 isolated character images segmented from the
2,100 full-page images distributed with "NIST Special Database 1":
223,125 digits, 44,951 upper-case, and 45,313 lower-case character
images. Each character image has been centered in a separate
128 by 128 pixel region; the error rate of the segmentation and
assigned classification is less than 0.1%.
The uncompressed database totals approximately 2.75 gigabytes of
image data and includes image format documentation and example software.
NIST special database 4:
8-Bit Gray Scale Images of Fingerprint Image Groups (FIGS)
The NIST database of fingerprint images contains 2000 8-bit gray scale
fingerprint image pairs. Each image is 512 by 512 pixels with 32 rows
of white space at the bottom and classified using one of the five
following classes: A=Arch, L=Left Loop, R=Right Loop, T=Tented Arch,
W=Whorl. The database is evenly distributed over each of the five
classifications with 400 fingerprint pairs from each class. The images
are compressed using a modified JPEG lossless compression algorithm
and require approximately 636 Megabytes of storage compressed and 1.1
Gigabytes uncompressed (1.6 : 1 compression ratio). The database also
includes format documentation and example software.
More short overview:
Special Database 1 - NIST Binary Images of Printed Digits, Alphas, and Text
Special Database 2 - NIST Structured Forms Reference Set of Binary Images
Special Database 3 - NIST Binary Images of Handwritten Segmented Characters
Special Database 4 - NIST 8-bit Gray Scale Images of Fingerprint Image Groups
Special Database 6 - NIST Structured Forms Reference Set 2 of Binary Images
Special Database 7 - NIST Test Data 1: Binary Images of Handprinted Segmented
Characters
Special Software 1 - NIST Scoring Package Release 1.0
Special Database 1 - $895.00
Special Database 2 - $250.00
Special Database 3 - $895.00
Special Database 4 - $250.00
Special Database 6 - $250.00
Special Database 7 - $1,000.00
Special Software 1 - $1,150.00
The system requirements for all databases are a 5.25" CD-ROM drive
with software to read ISO-9660 format.
Contact: Darrin L. Dimmick
dld@magi.ncsl.nist.gov (301)975-4147
If you wish to order the database, please contact:
Standard Reference Data
National Institute of Standards and Technology
221/A323
Gaithersburg, MD 20899
(301)975-2208 or (301)926-0416 (FAX)
4. CEDAR CD-ROM 1: Database of Handwritten
Cities, States, ZIP Codes, Digits, and Alphabetic Characters
The Center of Excellence for Document Analysis and Recognition (CEDAR)
at the State University of New York at Buffalo announces the availability of
CEDAR CDROM 1: USPS Office of Advanced Technology
The database contains handwritten words and ZIP Codes
in high resolution grayscale (300 ppi 8-bit) as well as
binary handwritten digits and alphabetic characters (300 ppi
1-bit). This database is intended to encourage research in
off-line handwriting recognition by providing access to
handwriting samples digitized from envelopes in a working
post office.
Specifications of the database include:
+ 300 ppi 8-bit grayscale handwritten words (cities,
states, ZIP Codes)
o 5632 city words
o 4938 state words
o 9454 ZIP Codes
+ 300 ppi binary handwritten characters and digits:
o 27,837 mixed alphas and numerics segmented
from address blocks
o 21,179 digits segmented from ZIP Codes
+ every image supplied with a manually determined
truth value
+ extracted from live mail in a working U.S. Post
Office
+ word images in the test set supplied with
dictionaries of postal words that simulate partial
recognition of the corresponding ZIP Code.
+ digit images included in test set that simulate
automatic ZIP Code segmentation. Results on these
data can be projected to overall ZIP Code
recognition performance.
+ image format documentation and software included
System requirements are a 5.25" CD-ROM drive with software to read
ISO-9660 format.
For any further information, including how to order the
database, please contact:
Jonathan J. Hull, Associate Director, CEDAR, 226 Bell Hall
State University of New York at Buffalo, Buffalo, NY 14260
hull@cs.buffalo.edu (email)
==========================================
2. From John Reynolds <reynolds@cns.bu.edu>:
We've come across several benchmarks which were proposed as standards
for categorizers. More information is available on each in a couple
of papers we wrote, which were printed in Neural Networks and IEEE
Transactions on Neural Networks.
The mushroom database was introduced by Schlimmer in 1987. The
task is to tell poisonous and non-poisonous mushrooms apart. There
are 8124 training patterns, of which about 50% are poisonous and 50%
are non-poisonous. The database is available by anonymous ftp, and
is described in:
Carpenter, G.A., Grossberg, S., and Reynolds, J., (1991). ARTMAP:
Supervised real-time learning and classification of nonstationary data
by a self-organizing neural network, {\sl Neural Networks}, {\bf 4},
565--588.
Three more benchmark problems, described briefly below, are detailed
in the following article:
Carpenter, G.A., Grossberg, S., Markuzon, N., Reynolds, J., and
Rosen,D., (1992). Fuzzy ARTMAP: A neural network architecture for
incremental supervised learning of analog multidimensional maps, {\em
IEEE Transactions on Neural Networks}, {\bf 3}, 698--713.
Frey and Slate developed a benchmark machine learning task in 1991 in
which a system has to identify an input exemplar as one of 26 capital
letters A--Z. The database was derived from 20,000 different binary
pixel images. There are a wide variety of letter types represented --
different stroke styles, letter styles, and random distortions. This
database is available from the UCI Repository of Machine Learning
Databases (repository@ics.uci.edu), maintained by David Aha.
Alexis P. Wieland introduced the nested spirals problem, and it has
been used as a benchmark by Lang and Witbrock (1989). The two spirals
of the benchmark task each make three complete turns in the plane,
with 32 points per turn plus an endpoint, totalling 97 points per
spiral.
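That geometry can be generated directly; a sketch in Python/NumPy using the radius schedule of the commonly circulated CMU generator (the 6.5 and 104 constants are that generator's conventions, assumed here rather than stated in the text above):

```python
import numpy as np

def two_spirals(n=97):
    """Two interlocked spirals of n points each: three turns,
    32 points per turn plus an endpoint when n == 97."""
    i = np.arange(n)
    angle = i * np.pi / 16.0           # 32 points per full turn
    radius = 6.5 * (104 - i) / 104.0   # radius shrinks toward the center
    x = radius * np.sin(angle)
    y = radius * np.cos(angle)
    spiral_a = np.column_stack([x, y])
    spiral_b = -spiral_a               # the twin, rotated 180 degrees
    return np.vstack([spiral_a, spiral_b]), np.repeat([0, 1], n)

pts, labels = two_spirals()
print(pts.shape)  # (194, 2): 97 points per spiral
```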
The circle in the square problem, which requires a system to identify
which points of a square lie inside and which lie outside a circle
whose area equals half that of the square, was specified as a
benchmark problem for system performance evaluation in the DARPA
Artificial Neural Network Technology (ANNT) Program (Wilensky, 1990).
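A data generator for this task is short once the radius is worked out: for a unit square, pi*r^2 = 1/2 gives r = 1/sqrt(2*pi), roughly 0.399, so a centered circle fits entirely inside. A sketch (the unit square and centered circle are assumptions of this illustration; the benchmark only fixes the area ratio):

```python
import numpy as np

def circle_in_square(n, seed=0):
    """Uniform points on the unit square, labeled 1 inside the centered
    circle whose area is half the square's (r = 1/sqrt(2*pi) < 1/2)."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n, 2))
    r = 1.0 / np.sqrt(2.0 * np.pi)
    inside = ((pts - 0.5) ** 2).sum(axis=1) <= r ** 2
    return pts, inside.astype(int)

pts, y = circle_in_square(10000)
print(round(y.mean(), 2))  # close to 0.5 by construction
```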
Last time I checked, there were a variety of different learning tasks
in the UCI repository, and that would probably be worth looking into.
I hope that helps. Good luck with your search! -John
=================================================
3. From Tim Ross <ross@toons.aar.wpafb.af.mil>:
I'm sure you're aware of the uci machine learning database (ics.uci.edu)
and the logic synthesis benchmarks (gloster@mcnc.org) that are also used
as machine learning test cases.
We at Wright Lab use a set of 30 benchmark functions, each on 8 binary
inputs with a single binary output. These functions were selected to
span a variety of types (numeric, symbolic, images, ...), complexities
(measured by decomposed function cardinality, an especially robust
measure), and numbers of minority elements (i.e., the fraction of
inputs whose output is ONE). We have done/are doing experiments (using
these benchmarks) with a
BP NN, Abductory Inference Mechanism, C4.5 and an in-house method. We
are also developing a similar set of benchmark functions on larger (esp.
12 and 16) numbers of input variables. We, of course, would be happy to
see these benchmarks used elsewhere.
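The minority-elements measure mentioned above is easy to state in code; a sketch for functions on 8 binary inputs represented as a 256-entry truth table (the representation, and taking ONE as the minority value in the example, are assumptions of this illustration):

```python
import numpy as np

def minority_fraction(truth_table):
    """Fraction of the input space mapped to the rarer output value."""
    ones = np.asarray(truth_table).mean()
    return min(ones, 1.0 - ones)

# Example: 8-input AND outputs ONE on exactly 1 of the 256 input vectors.
tt_and = np.zeros(256, dtype=int)
tt_and[-1] = 1
print(minority_fraction(tt_and))  # 0.00390625 (= 1/256)
```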
=======================================
4. A set of examples comes with the distribution of the nn simulator
package "Aspirin/Migraines", available from two FTP sites:
CMU's simulator collection on "pt.cs.cmu.edu" (128.2.254.155)
in "/afs/cs/project/connect/code/am6.tar.Z",
and UCLA's cognitive science machine "ftp.cognet.ucla.edu" (128.97.50.19)
in "alexis/am6.tar.Z".
These are the examples provided:
xor: from Rumelhart and McClelland, et al,
"Parallel Distributed Processing, Vol 1: Foundations",
MIT Press, 1986, pp. 330-334.
encode: from Rumelhart and McClelland, et al,
"Parallel Distributed Processing, Vol 1: Foundations",
MIT Press, 1986, pp. 335-339.
bayes: Approximating the optimal bayes decision surface
for a gauss-gauss problem.
detect: Detecting a sine wave in noise.
iris: The classic iris database.
characters: Learning to recognize 4 characters independent
of rotation.
ring: Autoregressive network learns a decaying sinusoid
impulse response.
sequence: Autoregressive network learns to recognize
a short sequence of orthonormal vectors.
sonar: from Gorman, R. P., and Sejnowski, T. J. (1988).
"Analysis of Hidden Units in a Layered Network Trained to
Classify Sonar Targets" in Neural Networks, Vol. 1, pp. 75-89.
spiral: from Kevin J. Lang and Michael J. Witbrock, "Learning
to Tell Two Spirals Apart", in Proceedings of the 1988 Connectionist
Models Summer School, Morgan Kaufmann, 1988.
ntalk: from Sejnowski, T.J., and Rosenberg, C.R. (1987).
"Parallel networks that learn to pronounce English text" in
Complex Systems, 1, 145-168.
perf: a large network used only for performance testing.
monk: The backprop part of the MONK's paper. The MONK's problems
were the basis of a first international comparison of learning
algorithms. The results of this comparison are summarized in
"The MONK's Problems - A Performance Comparison of Different
Learning Algorithms" by S.B. Thrun, J. Bala, E. Bloedorn, I. Bratko,
B. Cestnik, J. Cheng, K. De Jong, S. Dzeroski, S.E. Fahlman,
D. Fisher, R. Hamann, K. Kaufman, S. Keller, I. Kononenko,
J. Kreuziger, R.S. Michalski, T. Mitchell, P. Pachowicz, Y. Reich,
H. Vafaie, W. Van de Welde, W. Wenzel, J. Wnek, and J. Zhang,
published as Technical Report CS-CMU-91-197, Carnegie Mellon
University, Dec. 1991.
wine: From the ``UCI Repository Of Machine Learning Databases
and Domain Theories'' (ics.uci.edu: pub/machine-learning-databases).
------------------------------
End of Neuron Digest [Volume 12 Issue 12]
*****************************************
Posted-Date: Thu, 11 Nov 93 19:34:39 EST
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #13 (conference - NIPS pt 1)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Thu, 11 Nov 93 19:34:39 EST
Message-Id: <13235.753064479@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Thursday, 11 Nov 1993
Volume 12 : Issue 13
Today's Topics:
NIPS*93 workshops
NIPS*93 workshop
NIPS*93 program
NIPS Workshop: Selective Attention
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: NIPS*93 workshops
From: "Michael C. Mozer" <mozer@dendrite.cs.colorado.edu>
Date: Mon, 06 Sep 93 20:21:07 -0700
For the curious, a list of topics for the NIPS*93 post-conference workshops
is attached. The workshops will be held in Vail, Colorado, on December 3 and
4, 1993.
For further info concerning the individual workshops, please contact the
workshop organizers, whose names and e-mail are listed below. Abstracts are
not available at present, but will be distributed prior to the workshops.
For NIPS conference and workshop registration info, please write to: NIPS*93
Registration / NIPS Foundation / PO Box 60035 / Pasadena, CA 91116-6035 USA
- ----------------
December 3, 1993
- ----------------
Complexity Issues in Neural Computation and Learning
Vwani Roychowdhury & Kai-Yeung Siu
vwani@ecn.purdue.edu
Connectionism for Music and Audition
Andreas Weigend & Dick Duda
weigend@cs.colorado.edu
Memory-based Methods for Regression and Classification
Thomas Dietterich
tgd@cs.orst.edu
Neural Networks and Formal Grammars
Simon Lucas
sml@essex.ac.uk
Neurobiology, Psychophysics, and Computational Models of Visual Attention
Ernst Niebur & Bruno Olshausen
ernst@acquine.cns.caltech.edu
Robot Learning: Exploration and Continuous Domains
David Cohn
cohn@psyche.mit.edu
Stability and Observability
Max Garzon & F. Botelho
garzonm@maxpc.msci.memst.edu
VLSI Implementations
William O. Camp, Jr.
camp@owgvm6.vnet.ibm.com
What Does the Hippocampus Compute?
Mark Gluck & Bruce McNaughton
gluck@pavlov.rutgers.edu
- ----------------
December 4, 1993
- ----------------
Catastrophic Interference in Connectionist Networks: Can it be Predicted,
Can it be Prevented?
Bob French
french@willamette.edu
Connectionist Modeling and Parallel Architectures
Joachim Diederich & Ah Chung Tsoi
joachim@fitmail.fit.qut.edu.au
Dynamic Representation Issues in Connectionist Cognitive Modeling
Jordan Pollack
pollack@cis.ohio-state.edu
Functional Models of Selective Attention and Context Dependency
Thomas Hildebrandt
thildebr@aragorn.csee.lehigh.edu
Learning in Computer Vision and Image Understanding -- An Advantage over
Classical Techniques?
Hayit Greenspan
hayit@micro.caltech.edu
Memory-based Methods for Regression and Classification
Thomas Dietterich
tgd@cs.orst.edu
Neural Network Methods for Optimization Problems
Arun Jagota
jagota@cs.buffalo.edu
Processing of Visual and Auditory Space and its Modification by Experience
Josef Rauschecker
josef@helix.nih.gov
Putting it all Together: Methods for Combining Neural Networks
Michael Perrone
mpp@cns.brown.edu
- ---------------------------------------------------------
NOTE: The assignment of workshops to dates is tentative.
- ---------------------------------------------------------
------------------------------
Subject: NIPS*93 workshop
From: Arun Jagota <jagota@cs.Buffalo.EDU>
Date: Fri, 10 Sep 93 17:08:05 -0500
CALL FOR PARTICIPATION
NIPS*93 workshop on
Neural Network Methods for Optimization Problems
There are 4-5 slots remaining for brief oral presentations of 20-30 minutes
each. To be considered, submit either (i) a title and one page abstract or
(ii) a bibliography of recent work on the topic.
Please submit materials by electronic mail to Arun Jagota
(jagota@cs.buffalo.edu) by October 5. Later submissions risk finding
no open slots remaining.
Program:
- -------
Ever since the work of Hopfield and Tank, neural networks have found
increasing use in the approximate solution of hard optimization
problems. Their successes, however, have been limited compared with
traditional methods. In this workshop we will discuss the state of
the art of neural network algorithms for optimization, examine their
weaknesses and strengths, and discuss the potential for improvement.
Second, as the algorithms arise from different areas (e.g., some from
statistical physics, others from computer science), we hope that
researchers from these disciplines will share their insights with one
another. Third, we hope to discuss theoretical issues that arise in
using neural network algorithms for optimization. Finally, we hope to
have participants discuss parallel implementation issues or case
studies.
- ---------------------
Arun Jagota
------------------------------
Subject: NIPS*93 program
From: Bartlett Mel <mel@cns.caltech.edu>
Date: Mon, 04 Oct 93 14:41:11 -0800
NIPS*93 MEETING PROGRAM and REGISTRATION REMINDER
The 1993 Neural Information Processing Systems (NIPS*93) meeting is
the seventh meeting of an inter-disciplinary conference which brings
together neuroscientists, engineers, computer scientists, cognitive
scientists, physicists, and mathematicians interested in all aspects
of neural processing and computation. There will be an afternoon of
tutorial presentations (Nov. 29), two and a half days of regular
meeting sessions (Nov. 30 - Dec. 2), and two days of focused workshops
at a nearby ski area (Dec. 3-4).
An electronic copy of the 1993 NIPS registration brochure is available
in postscript format via anonymous ftp at helper.systems.caltech.edu
in /pub/nips/NIPS_93_brochure.ps.Z. For a hardcopy of the brochure or
other information, please send a request to nips93@systems.caltech.edu
or to: NIPS Foundation, P.O. Box 60035, Pasadena, CA 91116-6035.
EARLY REGISTRATION DEADLINE (for $100 discount): Oct. 30
_________________
NIPS*93 ORAL PRESENTATIONS PROGRAM
Tues. AM: Cognitive Science
8:30 Invited Talk: Jeff Elman, UC San Diego:
From Weared to Wore: A Connectionist Account of the
History of the Past Tense
9:00 Richard O. Duda, San Jose State Univ.:
Connectionist Models for Auditory Scene Analysis
9:20 Reza Shadmehr and Ferdinando A. Mussa-Ivaldi, MIT:
Computational Elements of the Adaptive Controller of the Human Arm
9:40 Catherine Stevens and Janet Wiles, University of Queensland:
Tonal Music as a Componential Code: Learning Temporal Relationships
Between and Within Pitch and Timing Components
10:00 Poster Spotlights:
Thea B. Ghiselli-Crispa and Paul Munro, Univ. of Pittsburgh:
Emergence of Global Structure from Local Associations
Tony A. Plate, University of Toronto:
Estimating Structural Similarity by Vector Dot Products of
Holographic Reduced Representations
10:10 BREAK
Speech Recognition
10:40 Jose C. Principe, Hui-H. Hsu and Jyh-M. Kuo, Univ. of Florida:
Analysis of Short Term Neural Memory Structures for Nonlinear Prediction
11:00 Eric I. Chang and Richard P. Lippmann, MIT Lincoln Laboratory:
Figure of Merit Training for Detection and Spotting
11:20 Gregory J. Wolff, K. Venkatesh Prasad, David G. Stork and
Marcus Hennecke, Ricoh California Research Center:
Lipreading by Neural Networks: Visual Preprocessing, Learning and
Sensory Integration
11:40 Poster Spotlights:
Steve Renals, Mike Hochberg and Tony Robinson, Cambridge University:
Learning Temporal Dependencies In Large-Scale Connectionist
Speech Recognition
Ying Zhao, John Makhoul, Richard Schwartz and George
Zavaliagkos, BBN Systems and Technologies:
Segmental Neural Net Optimization for Continuous Speech Recognition
11:50 Rod Goodman, Caltech: Posner Memorial Lecture
Tues. PM: Temporal Prediction and Control
2:00 Invited Talk: Doyne Farmer, Prediction Co.:
Time Series Analysis of Nonlinear and Chaotic Time Series: State Space
Reconstruction and the Curse of Dimensionality
2:30 Kenneth M. Buckland and Peter D. Lawrence, Univ. of British Columbia:
Transition Point Dynamic Programming
2:50 Gary W. Flake, Guo-Zhen Sun, Yee-Chun Lee and Hsing-Hen Chen,
University of Maryland:
Exploiting Chaos to Control The Future
3:10 Satinder P. Singh, Andrew G. Barto, Roderic Grupen and
Christopher Connolly, University of Massachusetts:
Robust Reinforcement Learning in Motion Planning
3:30 BREAK
Theoretical Analysis
4:00 Scott Kirkpatrick, Naftali Tishby, Lidror Troyansky,
The Hebrew Univ. of Jerusalem, and Geza Gyorgi, Eotvos Univ.:
The Statistical Mechanics of K-Satisfaction
4:20 Santosh S. Venkatesh, Changfeng Wang, Univ. of Pennsylvania,
and Stephen Judd, Siemens Corporate Research:
When To Stop: On Optimal Stopping And Effective Machine Size In Learning
4:40 Wolfgang Maass, Technische Univ. Graz:
Agnostic PAC-Learning Functions on Analog Neural Nets
5:00 H.N. Mhaskar, California State Univ. and Charles A. Micchelli, IBM:
How To Choose An Activation Function
5:20 Poster Spotlights
Iris Ginzburg, Tel Aviv Univ. and Haim Sompolinsky, Hebrew Univ.:
Correlation Functions on a Large Stochastic Neural Network
Xin Wang, Qingnan Li and Edward K. Blum, USC:
Asynchronous Dynamics of Continuous-Time Neural Networks
Tal Grossman and Alan Lapedes, Los Alamos National Laboratory:
Use of Bad Training Data for Better Predictions
Wed. AM: Learning Algorithms
8:30 Invited Talk: Geoff Hinton, Univ. of Toronto:
Using the Minimum Description Length Principle to Discover Factorial
Codes
9:00 Richard S. Zemel, Salk Institute, and G. Hinton, Univ. of Toronto:
Developing Population Codes By Minimizing Description Length
9:20 Sreerupa Das and Michael C. Mozer, University of Colorado:
A Hybrid Gradient-Descent/Clustering Technique for Finite State
Machine Induction
9:40 Eric Saund, Xerox Palo Alto Research Center:
Unsupervised Learning of Mixtures of Multiple Causes in Binary Data
10:00 BREAK
10:30 A. Uzi Levin and Todd Leen, Oregon Graduate Institute:
Fast Pruning Using Principal Components
10:50 Christoph Bregler and Stephen Omohundro, ICSI:
Surface Learning with Applications to Lip Reading
11:10 Melanie Mitchell, Santa Fe Inst. and John H. Holland, Univ. Michigan:
When Will a Genetic Algorithm Outperform Hill Climbing
11:30 Oded Maron and Andrew W. Moore, MIT:
Hoeffding Races: Accelerating Model Selection Search for Classification
and Function Approximation
11:50 Poster Spotlights:
Zoubin Ghahramani and Michael I. Jordan, MIT:
Supervised Learning from Incomplete Data via an EM Approach
Mats Osterberg and Reiner Lenz, Linkoping Univ.
Unsupervised Parallel Feature Extraction from First Principles
Terence D. Sanger, LAC-USC Medical Center:
Two Iterative Algorithms for Computing the Singular Value Decomposition
from Input/Output Samples
Patrice Y. Simard and Edi Sackinger, AT&T Bell Laboratories:
Efficient Computation of Complex Distance Metrics Using Hierarchical
Filtering
Wed. PM: Neuroscience
2:00 Invited Talk: Eve Marder, Brandeis Univ.:
Dynamic Modulation of Neurons and Networks
2:30 Ojvind Bernander, Rodney Douglas and Christof Koch, Caltech:
Amplifying and Linearizing Apical Synaptic Inputs to
Cortical Pyramidal Cells
2:50 Christiane Linster and David Marsan, ESPCI,
Claudine Masson and Michel Kerzberg, CNRS:
Odor Processing in the Bee: a Preliminary Study of the
Role of Central Input to the Antennal Lobe
3:10 M.G. Maltenfort, R. E. Druzinsky, C. J. Heckman and
W. Z. Rymer, Northwestern Univ.:
Lower Boundaries of Motoneuron Desynchronization Via
Renshaw Interneurons
3:30 BREAK
Visual Processing
4:00 K. Obermayer, The Salk Institute, L. Kiorpes, NYU and
Gary G. Blasdel, Harvard Medical School:
Development of Orientation and Ocular Dominance Columns
in Infant Macaques
4:20 Yoshua Bengio, Yann Le Cun and Donnie Henderson, AT&T Bell Labs:
Globally Trained Handwritten Word Recognizer using Spatial
Representation, Spatial Displacement Neural Networks and
Hidden Markov Models
4:40 Trevor Darrell and A. P. Pentland, MIT:
Classification of Hand Gestures using a View-based
Distributed Representation
5:00 Ko Sakai and Leif H. Finkel, Univ. of Pennsylvania:
A Network Mechanism for the Determination of Shape-from-Texture
5:20 Video Poster Spotlights (to be announced)
Thurs. AM: Implementations and Applications
8:30 Invited Talk: Dan Seligson, Intel:
A Radial Basis Function Classifier with On-chip Learning
9:00 Michael A. Glover, Current Technology, Inc. and
W. Thomas Miller III, University of New Hampshire:
A Massively-Parallel SIMD Processor for Neural Network and
Machine Vision Applications
9:20 Steven S. Watkins, Paul M. Chau, and Mark Plutowski, UCSD,
Raoul Tawel and Bjorn Lambrigsten, JPL:
A Hybrid Radial Basis Function Neurocomputer
9:40 Gert Cauwenberghs, Caltech :
A Learning Analog Neural Network Chip with Continuous-Time
Recurrent Dynamics
10:00 BREAK
10:30 Invited Talk: Paul Refenes, University College London:
Neural Network Applications in the Capital Markets
11:00 Jane Bromley, Isabelle Guyon, Yann Le Cun, Eduard Sackinger
and Roopak Shah, AT&T Bell Laboratories:
Signature Verification using a "Siamese" Time Delay Neural Network
11:20 John Platt and Ralph Wolf, Synaptics, Inc.:
Postal Address Block Location Using a Convolutional Locator Network
11:40 Shumeet Baluja and Dean Pomerleau, Carnegie Mellon University:
Non-Intrusive Gaze Tracking Using Artificial Neural Networks
12:00 Adjourn to Vail for Workshops
_____________________
NIPS*93 POSTER PROGRAM
Tues. PM Posters:
Cognitive Science (CS)
CS-1 Blasig Using Backpropagation to Automatically Generate Symbolic
Classification Rules
CS-2 Munro, Ghiselli-Crispa Emergence of Global Structure from Local
Associations
CS-3 Plate Estimating structural similarity by vector dot products of
Holographic Reduced Representations
CS-4 Shultz, Elman Analyzing Cross Connected Networks
CS-5 Sperduti Encoding of Labeled Graphs by Labeling RAAM
Speech Processing (SP)
SP-1 Farrell, Mammone Speaker Recognition Using Neural Tree Networks
SP-2 Hirayama, Vatikiotis-Bateson, Kawato Inverse Dynamics of Speech Motor
Control
SP-3 Renals, Hochberg, Robinson Learning Temporal Dependencies In Large-Scale
Connectionist Speech Recognition
SP-4 Zhao, Makhoul, Schwartz, Zavaliagkos Segmental Neural Net
Optimization for Continuous Speech Recognition
Control, Navigation and Planning (CT)
CT-1 Atkeson Using Local Trajectory Optimizers To Speed Up Global
Optimization In Dynamic Programming
CT-2 Boyan, Littman A Reinforcement Learning Scheme for Packet Routing Using
a Network of Neural Networks
CT-3 Cohn Queries and Exploration Using Optimal Experiment Design
CT-4 Duff, Barto Monte Carlo Matrix Inversion and Reinforcement Learning
CT-5 Gullapalli, Barto Convergence of Indirect Adaptive Asynchronous Dynamic
Programming Algorithms
CT-6 Jaakkola, Jordan, Singh Stochastic Convergence Of Iterative DP
Algorithms
CT-7 Moore The Parti-game Algorithm for Variable Resolution Reinforcement
Learning in Multidimensional State-spaces
CT-8 Nowlan, Cacciatore Mixtures of Controllers for Jump Linear and Non-linear
Plants
CT-9 Wada, Koike, Vatikiotis-Bateson, Kawato A Computational Model for
Cursive Handwriting Based on the Minimization Principle
Learning Theory, Generalization and Complexity (LT)
LT-01 Cortes, Jackel, Solla, Vapnik, Denker Learning Curves: Asymptotic
Values and Rates of Convergence
LT-02 Fefferman Recovering A Feed-Forward Net From Its Output
LT-03 Grossman, Lapedes Use of Bad Training Data for Better Predictions
LT-04 Hassibi, Sayed, Kailath H-inf Optimality Criteria for LMS and
Backpropagation
LT-05 Hush, Horne Bounds on the complexity of recurrent neural network
implementations of finite state machines
LT-06 Ji A Bound on Generalization Error Using
Network-Parameter-Dependent Information and Its Applications
LT-07 Kowalczyk Counting function theorem for multi-layer networks
LT-08 Mangasarian, Solodov Backpropagation Convergence Via Deterministic
Nonmonotone Perturbed Minimization
LT-09 Plutowski, White Delete-1 Cross-Validation Estimates IMSE
LT-10 Schwarze, Hertz Discontinuous Generalization in Large Committee Machines
LT-11 Shapiro, Prugel-Bennett Non-Linear Statistical Analysis and
Self-Organizing Competitive Networks
LT-12 Wahba Structured Machine Learning for 'Soft' Classification, with
Smoothing Spline ANOVA Models and Stacked Tuning, Testing
and Evaluation
LT-13 Watanabe Solvable models of artificial neural networks
LT-14 Wiklicky On the Non-Existence of a Universal Learning Algorithm for
Recurrent Neural Networks
Dynamics/Statistical Analysis (DS)
DS-1 Coolen, Penney, Sherrington Coupled Dynamics of Fast Neurons and
Slow Interactions
DS-2 Garzon, Botelho Observability of neural network behavior
DS-3 Gerstner, van Hemmen How to Describe Neuronal Activity: Spikes,
Rates, or Assemblies?
DS-4 Ginzburg, Sompolinsky Correlation Functions on a Large Stochastic
Neural Network
DS-5 Leen, Orr Momentum and Optimal Stochastic Search
DS-6 Ruppin, Meilijson Optimal signalling in Attractor Neural Networks
DS-7 Wang, Li, Blum Asynchronous Dynamics of Continuous-Time Neural Networks
Recurrent Networks (RN)
RN-1 Baird, Troyer, Eeckman Grammatical Inference by Attentional Control of
Synchronization in an Oscillating Elman Net
RN-2 Bengio, Frasconi Credit Assignment through Time: Alternatives to
Backpropagation
RN-3 Kolen Fool's Gold: Extracting Finite State Machines From Recurrent
Network Dynamics
RN-4 Movellan A Reinforcement Algorithm to Learn Trajectories with Stochastic
Neural Networks
RN-5 Saunders, Angeline, Pollack Structural and behavioral evolution of
recurrent networks
Applications (AP)
AP-01 Baldi, Brunak, Chauvin, Krogh Hidden Markov Models in Molecular
Biology: Parsing the Human Genome
AP-02 Eeckman, Buhmann, Lades A Silicon Retina for Face Recognition
AP-03 Flann A Hierarchical Approach to Recognizing On-line Cursive Handwriting
AP-04 Graf, Cosatto, Ting Locating Address Blocks with a Neural Net System
AP-05 Karunanithi Identifying Fault-Prone Software Modules Using
Feed-Forward Networks: A Case Study
AP-06 Keymeulen Comparison Training for a Rescheduling Problem in Neural
Networks
AP-07 Lapedes, Steeg Use of Adaptive Networks to Find Highly Predictable
Protein Structure Classes
AP-08 Schraudolph, Dayan, Sejnowski Using the TD(lambda) Algorithm to Learn
an Evaluation Function for the Game of Go
AP-09 Smyth Probabilistic Anomaly Detection in Dynamic Systems
AP-10 Tishby, Singer Decoding Cursive Scripts
Wed. PM posters:
Learning Algorithms (LA)
LA-01 Gold, Mjolsness Clustering with a Domain-Specific Distance Metric
LA-02 Buhmann Central and Pairwise Data Clustering by Competitive Neural
Networks
LA-03 de Sa Learning Classification without Labeled Data
LA-04 Ghahramani, Jordan Supervised learning from incomplete data via an
EM approach
LA-05 Tresp, Ahmad, Neuneier Training Neural Networks with Deficient Data
LA-06 Osterberg, Lenz Unsupervised Parallel Feature Extraction from First
Principles
LA-07 Sanger Two Iterative Algorithms for Computing the Singular Value
Decomposition from Input/Output Samples
LA-08 Leen, Kambhatla Fast Non-Linear Dimension Reduction
LA-09 Schaal, Atkeson Assessing The Quality of Learned Local Models
LA-10 Simard, Sackinger Efficient computation of complex distance metrics using
hierarchical filtering
LA-11 Tishby, Ron, Singer The Power of Amnesia
LA-12 Wettscherek, Dietterich Locally Adaptive Nearest Neighbor Algorithms
LA-13 Liu Robust Parameter Estimation and Model Selection for Neural
Network Regression
LA-14 Wolpert Bayesian Backpropagation Over Functions Rather Than Weights
LA-15 Thodberg Bayesian Backprop in Action: Pruning, Ensembles, Error Bars and
Application to Spectroscopy
LA-16 Dietterich, Jain, Lathrop Dynamic Reposing for Drug Activity Prediction
LA-17 Ginzburg, Horn Combined Neural Networks For Time Series Analysis
LA-18 Graf, Simard Backpropagation without Multiplication
LA-19 Harget, Bostock A Comparative Study of the Performance of a Modified
Bumptree with Radial Basis Function Networks and the
Standard Multi-Layer Perceptron
LA-20 Najafi, Cherkassky Adaptive Knot Placement Based on Estimated
Second Derivative of Regression Surface
Constructive/Pruning Algorithms (CP)
CP-1 Fritzke Supervised Learning with Growing Cell Structures
CP-2 Hassibi, Stork, Wolff Optimal Brain Surgeon: Extensions, streamlining
and performance comparisons
CP-3 Kamimura Generation of Internal Representations by alpha-transformation
CP-4 Leerink, Jabri Constructive Learning Using Internal Representation
Conflicts
CP-5 Utans Learning in Compositional Hierarchies: Inducing the Structure of
Objects from Data
CP-6 Watanabe An Optimization Method of Layered Neural Networks Based on the
Modified Information Criterion
Neuroscience (NS)
NS-01 Bialek, Ruderman Statistics of Natural Images: Scaling in the Woods
NS-02 Boussard, Vibert Dopaminergic neuromodulation brings a dynamical
plasticity to the retina
NS-03 Doya, Selverston, Rowat A Hodgkin-Huxley Type Neuron Model that Learns
Slow Non-Spike Oscillation
NS-04 Gusik, Eaton Directional Hearing by the Mauthner System
NS-05 Horiuchi, Bishofberger, Koch Building an Analog VLSI, Saccadic Eye
Movement System
NS-06 Lewicki Bayesian Modeling and Classification of Neural Signals
NS-07 Montague, Dayan, Sejnowski Foraging in an Uncertain Environment
Using Predictive Hebbian Learning
NS-08 Rosen, Rumelhart, Knudsen A Connectionist Model of the Owl's Sound
Localization System
NS-09 Sanger Optimal Unsupervised Motor Learning Predicts the Internal
Representation of Barn Owl Head Movements
NS-10 Siegal An Analog VLSI Model Of Central Pattern Generation In The
Medicinal Leech
NS-11 Usher, Stemmler, Koch High spike rate variability as a consequence of
network amplification of local fluctuations
Visual Processing (VP)
VP-1 Ahmad Feature Densities are Required for Computing Feature
Correspondences
VP-2 Buracas, Albright Proposed function of MT neurons' receptive field
surrounds: computing shapes of objects from velocity fields
VP-3 Geiger, Diamantaras Resolving motion ambiguities
VP-4 Mjolsness Two-Dimensional Object Localization by Coarse-to-fine
Correlation Matching
VP-5 Sajda, Finkel Dual Mechanisms for Neural Binding and Segmentation and
Their Role in Cortical Integration
VP-6 Yuille, Smirnakis, Xu Bayesian Self-Organization
Implementations (IM)
IM-01 Andreou, Edwards VLSI Phase Locking Architecture for Feature Linking in
Multiple Target Tracking Systems
IM-02 Coggins, Jabri WATTLE: A Trainable Gain Analogue VLSI Neural Network
IM-03 Elfadel, Wyatt The "Softmax" Nonlinearity: Derivation Using Statistical
Mechanics and Useful Properties as a Multiterminal
Analog Circuit Element
IM-04 Muller, Kocheisen, Gunzinger High Performance Neural Net Simulation
on a Multiprocessor System with "Intelligent"
Communication
IM-05 Murray, Burr, Stork, et al. Digital Boltzmann VLSI for constraint
satisfaction and learning
IM-06 Niebur, Brettle Efficient Simulation of Biological Neural Networks on
Massively Parallel Supercomputers with Hypercube
Architecture
IM-07 Oliveira, Sangiovanni-Vincentelli Learning Complex Boolean Functions:
Algorithms and Applications
IM-08 Shibata, Kotani, Yamashita et al. Implementing Intelligence on Silicon
Using Neuron-Like Functional MOS Transistors
IM-09 Watts Event-Driven Simulation of Networks of Spiking Neurons
------------------------------
Subject: NIPS Workshop: Selective Attention
From: Thomas Hildebrandt <thildebr@aragorn.csee.lehigh.edu>
Date: Tue, 05 Oct 93 14:08:11 -0500
I wish to call your attention to a workshop on selective attention
which I will be hosting at this year's NIPS conference.
===================================================================
NIPS*93 Postconference Workshop
Functional Models of Selective Attention and Context Dependency
December 4, 1993
Intended Audience: Those applying NNs to vision and speech analysis
and pattern recognition tasks, as well as computational
neurobiologists modelling attentional mechanisms.
Organizer: Thomas H. Hildebrandt
thildebr@athos.eecs.lehigh.edu
ABSTRACT: Classification based on trainable models still fails to
achieve the current ideal of human-like performance. One identifiable
reason for this failure is the disparity between the number of
training examples needed to achieve good performance (large) and the
number of labelled samples available for training (small). On certain
tasks, humans are able to generalize well when given only one
exemplar. Clearly, a different mechanism is at work.
In human behavior, there are numerous examples of selective attention
improving a person's recognition capabilities. Models using context
or selective attention seek to improve classification performance by
modifying the behavior of a classifier based on the current (and
possibly recent) input data. Because they treat learning and
contextual adaptation as two different processes, these models solve
the memory/plasticity dilemma by incorporating both. In other words,
they differ fundamentally from models which attempt to provide
contextual adaptation by allowing all the weights in the network to
continue evolving while the system is in operation.
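The learning-versus-contextual-adaptation split described above can be
caricatured in a few lines of code. In this sketch (this digest's own
illustration, not any presenter's model), the classifier's trained weights
are frozen, and a separate fast-adapting gain vector modulates the inputs,
so an ambiguous pattern is resolved by recent context.

```python
# Illustration only (not from any workshop talk): slow learning has already
# fixed these weights; only the attention gains change at run time.
weights = {"A": [1.0, 0.0], "B": [0.0, 1.0]}

def classify(gains, x):
    """Score classes on attention-gated features with the frozen weights."""
    def score(c):
        return sum(w * g * xi for w, g, xi in zip(weights[c], gains, x))
    return max(weights, key=score)

def adapt(gains, x, rate=0.5):
    """Fast contextual process: raise the gain of recently active features."""
    return [g + rate * xi for g, xi in zip(gains, x)]

ambiguous = [1.0, 1.0]            # equal bottom-up evidence for both classes

gains = [1.0, 1.0]
for ctx in [[1.0, 0.0]] * 3:      # recent context activates feature 0
    gains = adapt(gains, ctx)
print(classify(gains, ambiguous))
```

With recent inputs activating feature 0, the ambiguous pattern is read as
class "A"; a feature-1 context flips it to "B". Note that no weight was
retrained, which is the memory/plasticity point made above.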
Schedule December 4, 1993
======== ================
7:30 - 7:35 Opening Remarks
7:35 - 8:00 Current Research in Selective Attention
Thomas H. Hildebrandt, Lehigh University
8:00 - 8:30 Context-varying Preferences and Traits in a Class of
Neural Networks
Daniel S. Levine, University of Texas at Arlington
Samuel J. Leven, For a New Social Science
8:30 - 9:00 ETS - A Formal Model of an Evolving Learning Machine
L.Goldfarb, J.Abela, V.Kamat, University of New Brunswick
9:00 - 9:30 Recognizing Handwritten Digits Using a Selective
Attention Mechanism
Ethem Alpaydin, Bogazici University, Istanbul TURKEY
9:30 - 4:30 FREE TIME
4:30 - 5:00 Context and Selective Attention in the Capital Markets
P. N. Refenes, London Business School
5:00 - 5:30 The Global Context-Sensitive Constraint Satisfaction Property
in Adaptive Perceptual Pattern Recognition
Jonathan A. Marshall, University of North Carolina
5:30 - 6:00 Neural Networks for Context Sensitive Representation
of Synonymous and Homonymic Patterns
Albert Nigrin, American University
6:00 - 6:30 Learn to Pay Attention, Young Network!
Barak A. Pearlmutter, Siemens Corp. Research Ctr.,
Princeton NJ
6:30 - 6:35 Closing Remarks
7:00 Workshop Wrap-Up (common to all sessions)
=====================================================================
The topic to be covered differs from that recently announced by Ernst
Niebur and Bruno Olshausen, in that "functional" models are not
necessarily tied to neurophysiological structures. Thanks to the
Workshop Chair, Mike Mozer, the two workshops were scheduled on
different days, so that it is possible for interested parties to
attend both.
An electronic copy of the 1993 NIPS registration brochure is available
in postscript format via anonymous ftp at helper.systems.caltech.edu
in /pub/nips/NIPS_93_brochure.ps.Z. For a hardcopy of the brochure or
other information, please send a request to nips93@systems.caltech.edu
or to: NIPS Foundation, P.O. Box 60035, Pasadena, CA 91116-6035.
Feel free to contact me for more information on the workshop.
Thomas H. Hildebrandt
Electrical Engineering & Computer Science
Lehigh University
Bethlehem, PA 18015
Work: (215) 758-4063
FAX: (215) 758-6279
thildebr@athos.eecs.lehigh.edu
------------------------------
End of Neuron Digest [Volume 12 Issue 13]
*****************************************
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #14 (conference NIPS pt 2)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Thu, 11 Nov 93 22:17:00 EST
Message-Id: <15343.753074220@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Thursday, 11 Nov 1993
Volume 12 : Issue 14
Today's Topics:
information on NIPS*93 workshop accommodations
Music and Audition at NIPS (1st day at Vail)
NIPS*93 Hybrid Systems Workshop
NIPS-93 Workshop on catastrophic interference
NIPS Workshop on Spatial Perception
Post-NIPS Workshop on Robot Learning
NIPS*93 Workshop on Stability and Observability/program
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: information on NIPS*93 workshop accommodations
From: "Michael C. Mozer" <mozer@dendrite.cs.colorado.edu>
Date: Tue, 19 Oct 93 12:40:14 -0700
The NIPS*93 brochure is a bit sketchy concerning accommodations at the
NIPS workshops, to be held at the Radisson Resort Vail December 2-4.
To make reservations at the Radisson, call (800) 648-0720. For general
information on the resort, the central number is (303) 476-4444. Reservations
can also be made by fax: (303) 476-1647. And if you would like to let the
glowing power of a live psychic answer your very personal questions, the
number is (900) 820-7131.
Note that rooms will be held for us only until the beginning of November, and
last year many participants had to sleep in the snow due to lack of foresight
in making reservations.
Concerning lift tickets: Unfortunately, the NIPS brochure was published
before we were able to obtain this year's lift ticket prices. The prices have
increased roughly $5/day over those published in the brochure. If you wish
to purchase tickets in advance, though, we ask that you send in the amounts
published in the brochure. We will collect the difference on site. (Sorry,
it's the only feasible way to do recordkeeping at this point.) Lift tickets
may also be purchased on site at an additional expense of roughly $1/day.
Very sorry for the inconvenience.
Mike Mozer
NIPS*93 Workshop Chair
------------------------------
Subject: Music and Audition at NIPS (1st day at Vail)
From: "E. large" <large@cis.ohio-state.edu>
Date: Mon, 25 Oct 93 09:43:53 -0500
Resonance and the Perception of Musical Meter
Edward W. Large and John F. Kolen
The perception of musical rhythm is traditionally described as
involving, among other things, the assignment of metrical structure to
rhythmic patterns. In our view, the perception of metrical structure
is best described as a dynamic process in which the temporal
organization of musical events entrains the listener in much the same
way that two pendulum clocks hanging on the same wall synchronize
their motions so that they tick in lock step. In this talk, we
re-assess the notion of musical meter, and show how the perception of
this sort of temporal organization can be modeled as a system of
non-linearly coupled oscillators responding to musical rhythms.
Individual oscillators phase- and frequency-lock to components of
rhythmic patterns, embodying the notion of musical pulse, or beat. The
collective behavior of a system of oscillators represents a
self-organized response to rhythmic patterns, embodying a "perception"
of metrical structure. When exposed to performed musical rhythms the
system shows the ability to simultaneously perform quantization
(categorization of temporal intervals), and assignment of metrical
structure in real time. We discuss implications for psychological
theories of temporal expectancy, "categorical" perception of temporal
intervals, and the perception of metrical structure.
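The entrainment idea in this abstract can be sketched with a single
adaptive oscillator. The update rules below are illustrative stand-ins,
not Large and Kolen's actual model: at each event onset the oscillator
nudges its phase and period toward the onset, and so locks to a steady
pulse, like the pendulum clocks in the analogy above.

```python
def entrain(onsets, period=0.5, phase=0.0, k_phase=0.5, k_period=0.25):
    """Phase- and frequency-lock one oscillator to a list of onset times
    (in seconds); returns the adapted period. Illustrative rules only."""
    next_beat = phase + period
    for t in onsets:
        # skip predicted beats that fall well before this onset
        while next_beat + period / 2 < t:
            next_beat += period
        err = t - next_beat          # > 0: onset came late, oscillator fast
        period += k_period * err     # frequency (tempo) adaptation
        next_beat += k_phase * err + period   # phase correction, then step
    return period

# A steady pulse at 0.6 s inter-onset interval; oscillator starts at 0.5 s
# and settles near the stimulus period.
onsets = [0.6 * i for i in range(1, 21)]
print(entrain(onsets))
```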
------------------------------
Subject: NIPS*93 Hybrid Systems Workshop
From: "Michael P. Perrone" <mpp@cns.brown.edu>
Date: Mon, 25 Oct 93 10:23:54 -0500
=====================================================================
NIPS*93 Postconference Workshop
December 4, 1993
Pulling It All Together: Methods for Combining Neural Networks
=====================================================================
Intended Audience:
Those interested in optimization algorithms for improving
neural network performance.
Organizer:
Michael P. Perrone, Brown University (mpp@cns.brown.edu)
Abstract:
This workshop will examine current hybrid methods for improving
neural network performance.
The past several years have seen a tremendous growth in the complexity of
the recognition, estimation and control tasks expected of neural networks.
In solving these tasks, one is faced with a large variety of learning
algorithms and a vast selection of possible network architectures. After
all the training, how does one know which is the best network? This
decision is further complicated by the fact that, even though standard
techniques such as MLPs and RBF nets are theoretically sufficient for
solving any task, they can be severely limited by problems such as
overfitting, data sparsity and local optima.
The usual solution to these problems is a winner-take-all cross-validatory
model selection. However, recent experimental and theoretical work
indicates that we can improve performance by considering methods for
combining neural networks.
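The claim that combining can beat winner-take-all selection has a simple
variance-reduction core, sketched below. Everything here is synthetic and
illustrative, not workshop code: the stand-in "trained networks" are the
target function plus independent constant biases, and the ensemble average
has lower mean squared error than the members do on average.

```python
import random

# Synthetic illustration of ensemble averaging: each "trained network" is
# the target f(x) = x^2 plus its own constant random bias.

def make_model(rng, noise=0.5):
    bias = rng.gauss(0.0, noise)
    return lambda x: x * x + bias

def mse(model, xs):
    return sum((model(x) - x * x) ** 2 for x in xs) / len(xs)

rng = random.Random(42)
xs = [i / 10 for i in range(-10, 11)]
models = [make_model(rng) for _ in range(10)]

def ensemble(x):
    """Simple ensemble average of all member predictions."""
    return sum(m(x) for m in models) / len(models)

avg_individual_mse = sum(mse(m, xs) for m in models) / len(models)
ensemble_mse = mse(ensemble, xs)
print(avg_individual_mse, ensemble_mse)
```

Here the averaged biases partially cancel, so the ensemble's MSE falls
strictly below the members' average MSE; with correlated or identical
members the gap closes, which is one reason the error-orthogonality issue
on the workshop's list matters.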
This workshop will discuss several such methods including Boosting,
Competing Experts, Ensemble Averaging, the GENSEP algorithm, Metropolis
algorithms, Stacked Generalization and Stacked Regression. The issues we
will cover in this workshop include Bayesian considerations, the role of
complexity, the role of cross-validation, integrating a priori knowledge,
error orthogonality, task decomposition, network selection techniques,
overfitting, data sparsity and local optima.
Lively audience participation is encouraged.
Schedule December 4, 1993
======== ================
7:30-7:35 Opening Remarks
7:35-7:55 M. Perrone, (Brown University)
"Averaging Methods: Theoretical Issues and Real World Examples"
7:55-8:15 J. Friedman, (Stanford)
"A New Approach to Multiple Outputs Using Stacking"
8:15-8:35 S. Nowlan, (Salk Institute)
"Competing Experts"
8:35-8:55 H. Drucker, (AT&T)
"Boosting Compared to Other Ensemble Methods"
8:55-9:30+ Discussion
9:30-4:30 FREE TIME
4:30-4:50 C. Scofield, (Nestor Inc)
"Commercial Applications: Modular Approaches to Real World Tasks"
4:50-5:10 W. Buntine, (NASA Ames Research Center)
"Averaging and Probabilistic Networks: Automating the Process"
5:10-5:30 D. Wolpert, (Santa Fe Institute)
"Inferring a Function vs. Inferring an Inference Algorithm"
5:30-5:50 H. Thodberg, (Danish Meat Research Institute)
"Error Bars on Predictions from Deviations among Committee
Members (within Bayesian Backprop)"
5:50-6:10 S. Hashem, (Purdue University)
"Merits of Combining Neural Networks: Potential Benefits and Risks"
6:10-6:30+ Discussion & Closing Remarks
7:00 Workshop Wrap-Up (common to all sessions)
=====================================================================
General NIPS information and registration:
An electronic copy of the 1993 NIPS registration brochure is available
in postscript format via anonymous ftp at helper.systems.caltech.edu
in /pub/nips/NIPS_93_brochure.ps.Z.
For a hardcopy of the brochure or other information, please send a
request to nips93@systems.caltech.edu or to:
NIPS Foundation,
P.O. Box 60035,
Pasadena, CA 91116-6035
- -----------------------------------------------------------------
Michael P. Perrone Email: mpp@cns.brown.edu
Institute for Brain and Neural Systems Tel: 401-863-3920
Brown University Fax: 401-863-3934
Providence, RI 02912
- -----------------------------------------------------------------
------------------------------
Subject: NIPS-93 Workshop on catastrophic interference
From: Bob French <french@willamette.edu>
Date: Mon, 25 Oct 93 09:07:29 -0800
NIPS-93 Workshop:
================
CATASTROPHIC INTERFERENCE IN CONNECTIONIST NETWORKS:
CAN IT BE PREDICTED, CAN IT BE PREVENTED?
Date: Saturday, December 4, 1993, at Vail, Colorado
====
Intended audience: Connectionists, cognitive scientists and
================= applications-oriented users of connectionist
networks interested in a better understanding
of:
i) when and why their networks can suddenly
and completely forget previously learned
information;
ii) how it is possible to reduce or even
eliminate this phenomenon.
Organizer: Bob French
========= Computer Science Department
Willamette University, Salem OR
french@willamette.edu
Program:
========
When connectionist networks learn new information, they can
suddenly and completely forget everything they had previously learned.
This problem is called catastrophic forgetting or catastrophic
interference. Given its demonstrated severity, it is intriguing that the
problem has to date received so little attention. When new information
must be added to an already-trained
connectionist network, it is currently taken for granted that the
network will simply cycle through all of the old data again. Since
relearning all of the old data is both psychologically implausible and
impractical for very large data sets, is it possible to do
otherwise? Can connectionist networks be developed that do not forget
catastrophically -- or perhaps that do not forget at all -- in the
presence of new information? Or is catastrophic forgetting perhaps
the inevitable price for using fully distributed representations?
Under what circumstances will a network forget or not forget?
Further, can the amount of forgetting be predicted with any
reliability? These questions are of particular interest to anyone who
intends to use connectionist networks as a memory/generalization
device.
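The phenomenon is easy to reproduce. The sketch below (Python with NumPy; the network size, learning rate, and patterns are illustrative choices, not taken from any of the workshop talks) trains a small backprop net on one pattern set, then on a second set with no rehearsal of the first, and measures how the error on the first set rises:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Net:
    """One-hidden-layer net trained by plain online backprop (MSE loss)."""
    def __init__(self, n_in=4, n_hid=8):
        self.W1 = rng.normal(0.0, 0.5, (n_hid, n_in))
        self.W2 = rng.normal(0.0, 0.5, (1, n_hid))

    def forward(self, x):
        self.h = sigmoid(self.W1 @ x)
        return sigmoid(self.W2 @ self.h)

    def train(self, X, T, epochs=2000, lr=1.0):
        for _ in range(epochs):
            for x, t in zip(X, T):
                y = self.forward(x)
                d2 = (y - t) * y * (1.0 - y)                    # output delta
                d1 = (self.W2.T @ d2) * self.h * (1.0 - self.h)  # hidden delta
                self.W2 -= lr * np.outer(d2, self.h)
                self.W1 -= lr * np.outer(d1, x)

    def error(self, X, T):
        return float(np.mean([(self.forward(x) - t) ** 2 for x, t in zip(X, T)]))

# Two disjoint pattern sets standing in for "old" and "new" information.
XA = np.array([[0,0,0,1],[0,0,1,0],[0,0,1,1],[0,1,0,0]], dtype=float)
TA = np.array([0.0, 1.0, 0.0, 1.0])
XB = np.array([[0,1,0,1],[0,1,1,0],[0,1,1,1],[1,0,0,0]], dtype=float)
TB = np.array([1.0, 0.0, 1.0, 0.0])

net = Net()
net.train(XA, TA)
err_before = net.error(XA, TA)   # small: set A has been learned
net.train(XB, TB)                # train on B only -- no rehearsal of A
err_after = net.error(XA, TA)    # typically far larger: A is "forgotten"
print(err_before, err_after)
```

Cycling through A and B together instead of sequentially avoids the effect, which is exactly the psychologically implausible rehearsal the paragraph above describes.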
This workshop will focus on:
- the theoretical reasons for catastrophic interference;
- the techniques that have been developed to eliminate
it or to reduce its severity;
- the side-effects of catastrophic interference;
- the degree to which a priori prediction of catastrophic
forgetting is or is not possible.
As connectionist networks become more and more a part of
applications packages, the problem of catastrophic interference will
have to be addressed. This workshop will bring the audience up to
date on current research on catastrophic interference.
Speakers: Stephan Lewandowsky (lewan@constellation.ecn.uoknor.edu)
======== Department of Psychology
University of Oklahoma
Phil A. Hetherington (het@blaise.psych.mcgill.ca)
Department of Psychology
McGill University
Noel Sharkey (noel@dcs.exeter.ac.uk)
Connection Science Laboratory
Dept. of Computer Science
University of Exeter, U.K.
Bob French (french@willamette.edu)
Computer Science Department
Willamette University
Morning session:
- ---------------
7:30 - 7:45 Bob French: An Introduction to the Problem of
Catastrophic Interference in Connectionist Networks
7:45 - 8:15 Stephan Lewandowsky: Catastrophic Interference: Causes,
Solutions, and Side-Effects
8:15 - 8:30 Brief discussion
8:30 - 9:00 Phil Hetherington: Sequential Learning in Connectionist
Networks: A Problem for Whom?
9:00 - 9:30 General discussion
Afternoon session
- -----------------
4:30 - 5:00 Noel Sharkey: Catastrophic Interference and
Discrimination.
5:00 - 5:15 Brief discussion
5:15 - 5:45 Bob French: Prototype Biasing and the
Problem of Prediction
5:45 - 6:30 General discussion and closing remarks
Below are the abstracts for the talks to be presented in this workshop:
CATASTROPHIC INTERFERENCE: CAUSES, SOLUTIONS, AND SIDE-EFFECTS
Stephan Lewandowsky
Department of Psychology
University of Oklahoma
I briefly review the causes for catastrophic interference in
connectionist models and summarize some existing solutions. I then
focus on possible trade-offs between resolutions to catastrophic
interference and other desirable network properties. For example, it
has been suggested that reduced interference might impair
generalization or prototype formation. I suggest that these
trade-offs occur only if interference is reduced by altering the
response surfaces of hidden units.
- --------------------------------------------------------------------------
SEQUENTIAL LEARNING IN CONNECTIONIST NETWORKS: A PROBLEM FOR WHOM?
Phil A. Hetherington
Department of Psychology
McGill University
Training networks in a strictly blocked, sequential manner normally
results in poor performance because new items overlap with old items
at the hidden unit layer. However, catastrophic interference is not a
necessary consequence of using distributed representations. First,
examination by the method of savings demonstrates that much of the
early information is still retained: Items thought lost can be
relearned within a couple of trials. Second, when items are learned
in a windowed, or overlapped fashion, less interference obtains. And
third, when items are presented in a strictly blocked, sequential
manner to a network that already possesses a relevant knowledge base,
interference may not occur at all. Thus, when modeling normal human
learning there is no catastrophic interference problem. Nor is there
a problem when modeling strictly sequential human memory experiments
with a network that has a relevant knowledge base. There is only a
problem when simple, unstructured, tabula rasa networks are expected
to model the intricacies of human memory.
- --------------------------------------------------------------------------
CATASTROPHIC INTERFERENCE AND DISCRIMINATION
Noel Sharkey
Connection Science Laboratory
Dept. of Computer Science
University of Exeter
Exeter, U.K.
Connectionist learning techniques, such as backpropagation, have
been used increasingly for modelling psychological phenomena.
However, a number of recent simulation studies have shown that when a
connectionist net is trained, using backpropagation, to memorize sets
of items in sequence and without negative exemplars, newly learned
information seriously interferes with old. Three converging methods
were employed to show why and under what circumstances such
retroactive interference arises. First, a geometrical analysis
technique, derived from perceptron research, was introduced and
employed to determine the computational and representational
properties of feedforward nets with one and two layers of weights.
This analysis showed that the elimination of interference always
resulted in a breakdown of old-new discrimination. Second, a formally
guaranteed solution to the problems of interference and discrimination
was presented as the HARM model and used to assess the relative merits
of other proposed solutions. Third, two simulation studies were
reported that assessed the effects of providing nets with experience
of the experimental task. Prior knowledge of the encoding task was
provided to the nets either by block training them in advance or by
allowing them to extract the knowledge through sequential training.
The overall conclusion was that the interference and discrimination
problems are closely related. Sequentially trained nets employing the
backpropagation learning algorithm will unavoidably suffer from either
one or the other.
- --------------------------------------------------------------------------
PROTOTYPE BIASING IN CONNECTIONIST NETWORKS
Bob French
Computer Science Dept.
Willamette University
Previously learned representations bias new representations. If
subjects are told that a newly encountered object X belongs to an
already familiar category P, they will tend to emphasize in their
representation of X features of the prototype they have for the
category P. This is the basis of prototype biasing, a technique that
appears to significantly reduce the effects of catastrophic forgetting.
The 1984 Congressional Voting Records database is used to
illustrate prototype biasing. This database contains the yes-no
voting records of Republican and Democratic members of Congress in
1984 on 16 separate issues. This database lends itself conveniently
to the use of a network having 16 "yes-no" input units, a hidden layer
and one "Republican/Democrat" output node. A "Republican" prototype
and a "Democrat" prototype are built, essentially by separately
averaging over Republican and Democrat hidden-layer representations.
These prototypes then "bias" subsequent representations of new
Democrats towards the Democrat prototype and of new Republicans
towards the Republican prototype.
Prototypes are learned by a second, separate backpropagation
network that associates teacher patterns with their respective
prototypes. Thus, ideally, when the "Republican" teacher pattern is
fed into it, it produces the "Republican" prototype on output. The
output from this network is continually fed back to the hidden layer
of the primary network and is used to bias new representations.
Also discussed in this paper are the problems involved in
predicting the severity of catastrophic forgetting.
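The biasing step itself is simple to illustrate. In the sketch below (Python with NumPy) the numbers, the 8-unit hidden layer, and the mixing weight alpha are all hypothetical; note that in the actual model the prototypes are produced by a second backprop network rather than computed directly as a mean:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hidden-layer activations (8 units) for 20 already-learned
# members of one class; the class prototype is taken here as their mean.
class_hidden = rng.uniform(0.0, 1.0, (20, 8))
prototype = class_hidden.mean(axis=0)

def bias_toward(h_new, proto, alpha=0.3):
    """Pull a new item's hidden representation part-way toward its
    class prototype (alpha is an illustrative mixing weight)."""
    return (1.0 - alpha) * h_new + alpha * proto

h_new = rng.uniform(0.0, 1.0, 8)     # representation of a newly seen item
h_biased = bias_toward(h_new, prototype)

d_before = np.linalg.norm(h_new - prototype)
d_after = np.linalg.norm(h_biased - prototype)
print(d_after < d_before)   # biased representation sits closer to prototype
```

Because the biased vector is a convex combination, its distance to the prototype shrinks by exactly the factor (1 - alpha), which is what keeps new class members clustered near old ones.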
------------------------------
Subject: NIPS Workshop on Spatial Perception
From: Terry Sejnowski <terry@helmholtz.sdsc.edu>
Date: Mon, 25 Oct 93 21:15:43 -0800
NIPS*93 WORKSHOP ANNOUNCEMENT
Title:
Processing of visual and auditory space and its modification by experience.
Intended Audience:
Researchers interested in spatial perception, sensory fusion and learning.
Organizers:
Josef P. Rauschecker Terrence Sejnowski
josef@helix.nih.gov terry@helmholtz.sdsc.edu
Program:
This workshop will address the question of how spatial information is
represented in the brain, how it is matched and compared by the visual and
auditory systems, and how early sensory experience influences the
development of these space representations.
We will discuss neurophysiological and computational data from cats,
monkeys, and owls that suggest how the convergence of different sensory
space representations may be handled by the brain. In particular, we will
look at the role of early experience and learning in establishing these
representations. Lack of visual experience affects space processing in cats
and owls differently. We will therefore discuss various kinds of plasticity
in different spatial representations.
Half the available time has been reserved for discussion and informal
presentations. We will encourage lively audience participation.
Morning Session
(7:30 - 8:30) Presentations
Predictive Hebbian learning and sensory fusion
(Terry Sejnowski)
A connectionist model of the owl's sound localization system
(Dan Rosen)
Intermodal compensatory plasticity of sensory systems
(Josef Rauschecker)
8:30 - 9:30 Discussion
Afternoon Session
(4:30 - 5:30) Presentations
Neurophysiological processing of visual and auditory space in monkeys
(Richard Andersen)
Learning map registration in the superior colliculus with predictive
Hebbian learning
(Alex Pouget)
A neural network model for the detection of heading direction from optic
flow in the cat's visual system
(Markus Lappe)
5:30 - 6:30 Discussion
=====================================================================
------------------------------
Subject: Post-NIPS Workshop on Robot Learning
From: David Cohn <cohn@psyche.mit.edu>
Date: Tue, 26 Oct 93 13:53:43 -0500
The following workshop will be held on Friday, December 3rd in Vail,
CO as one of the Post-NIPS workshops. To be added to a mailing list
for further information about the workshop, send electronic mail to
"robot-learning-request@psyche.mit.edu".
- ---------------------------------------------------------------------------
NIPS*93 Workshop: Robot Learning II: Exploration and Continuous Domains
=================
Intended Audience: Researchers interested in robot learning, exploration,
================== and active learning systems in general
Organizer: David Cohn (cohn@psyche.mit.edu)
========== Dept. of Brain and Cognitive Sciences
Massachusetts Institute of Technology
Cambridge, MA 02139
Overview:
=========
The goal of this workshop will be to provide a forum for researchers
active in the area of robot learning and related fields. Due to the
limited time available, we will focus on two major issues: efficient
exploration of a learner's state space, and learning in continuous
domains.
Robot learning is characterized by sensor noise, control error,
dynamically changing environments and the opportunity for learning by
experimentation. A number of approaches, such as Q-learning, have
shown great practical utility for learning under these difficult
conditions. However, these approaches have only been proven to
converge to a solution if all states of a system are visited
infinitely often. What has yet to be determined is whether we can
efficiently explore a state space so that we can learn without having
to visit every state an infinite number of times, and how we are to
address problems on continuous domains, where there are effectively an
infinite number of states to be visited.
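For concreteness, here is a minimal tabular Q-learning sketch (Python with NumPy; the corridor environment and all parameters are invented for illustration, not drawn from any workshop presentation). The epsilon-greedy rule is one simple way to keep visiting every state, which is the requirement the convergence proofs rest on and the reason continuous state spaces are problematic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Tabular Q-learning on a toy 1-D corridor: states 0..4, actions 0=left,
# 1=right, reward 1.0 for reaching state 4.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.2

def step(s, a):
    """Deterministic corridor dynamics with a terminal reward."""
    s2 = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
    return s2, float(s2 == n_states - 1)

for _ in range(500):                      # episodes
    s = 0
    while s != n_states - 1:
        # epsilon-greedy exploration: the occasional random action is
        # what keeps every state being visited
        if rng.random() < eps:
            a = int(rng.integers(n_actions))
        else:
            a = int(np.argmax(Q[s]))
        s2, r = step(s, a)
        Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
        s = s2

print(np.argmax(Q[:-1], axis=1))          # learned policy: right everywhere
```

With a table this works; with a continuous state space there is no table to fill in, and the question of how to explore efficiently is exactly the one the workshop takes up.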
This workshop is intended to serve as a followup to last year's
post-NIPS workshop on machine learning. The two problems to be
addressed this year were identified as two (of the many) crucial
issues facing robot learning.
The morning session of the workshop will consist of short
presentations discussing theoretical approaches to exploration and to
learning in continuous domains, followed by general discussion guided
by a moderator. The afternoon session will center on practical and/or
heuristic approaches to these problems in the same format. As time
permits, we may also attempt to create an updated "Where do we go from
here?" list, like that drawn up in last year's workshop.
Video demos will be encouraged. If feasible, we will attempt to have a
VCR set up after the workshop to allow for informal demos.
Preparatory readings from the presenters will be ready by early
November. To be placed on a list to receive continuing information about
the workshop (such as where and when the readings appear on-line), send
email to "robot-learning-request@psyche.mit.edu".
Tentative Program:
==================
December 3, 1993
Morning Session: Theoretical Approaches
- ---------------------------------------
7:30-8:30 Synopses of different approaches (20 min each):
          Andrew Moore, CMU
          "The Parti-game approach to exploration"
          Leemon Baird, USAF
          "Reinforcement learning in continuous domains"
          Juergen Schmidhuber, TUM
          "Reinforcement-directed information acquisition in
          Markov Environments"
8:30-9:30 Open discussion
Afternoon Session: Heuristic Approaches
- ---------------------------------------
4:30-5:50 Synopses of different approaches (20 min each):
          Long-Ji Lin, Siemens
          "RatBot: A mail-delivery robot"
          Stephan Schaal, MIT
          "Efficiently exploring high-dimensional spaces"
          Terry Sanger, MIT/JPL
          "Trajectory extension learning"
          Jeff Schneider, Rochester
          "Learning robot skills in high-dimensional action spaces"
5:50-6:30 Open discussion
------------------------------
Subject: NIPS*93 Workshop on Stability and Observability/program
From: GARZONM@hermes.msci.memst.edu
Date: 27 Oct 93 15:29:32 -0600
A day at NIPS*93 on
STABILITY AND OBSERVABILITY
3 December 1993 at Vail, Colorado
Intended Audience: neuroscientists, computer and cognitive
================= scientists, neurobiologists, mathematicians/
dynamical systems, electrical engineers, and
anyone interested in questions such as:
* what effects can noise, bounded precision and
uncertainty in inputs, weights and/or
transfer functions have on the i/o
behavior of a neural network?
* what is missed and what is observable in
computer simulations of the networks they
purport to simulate?
* how much architecture can be observed
in the behavior of a network-in-a-box?
* what can be done to improve and/or accelerate
convergence to stable equilibria during
learning and network updates while preserving
the intended dynamics of the process?
Organizers:
==========
Fernanda Botelho Max Garzon
botelhof@hermes.msci.memst.edu garzonm@hermes.msci.memst.edu
Mathematical Sciences Institute for Intelligent Systems
Memphis State University
Memphis, TN 38152 U.S.A.
[botelhof,garzonm]@hermes.msci.memst.edu
Program:
=======
Following is a (virtually) final schedule. Each talk is scheduled for
15 minutes, with 5 minutes in between for questions and comments. One
or two contributed talks might still be added to the schedule (and
would cut into the afternoon panel discussion).
Morning Session:
- ---------------
7:30-7:50 M. Garzon, Memphis State University, Tennessee
Introduction and Overview
7:50-8:10 S. Kak, Louisiana State University, Baton Rouge
Stability and Observability in Feedback Networks
8:10-8:30 S. Piche, Microelectronics Technology Co., Austin, Texas
Sensitivity of Neural Networks to Errors
8:30-8:50 R. Rojas, Int. Computer Science Institute UCB
and Freie Universität Berlin
Stability of Learning in Neural Networks
8:50-9:10 G. Chauvet and P. Chauvet,
Institut de Biologie Théorique, U. d'Angers, France
Stability of Purkinje Units in the Cerebellar Cortex
9:10-9:30 N. Peterfreund and Y. Baram, Technion, Israel
Trajectory Control of Convergent Networks
Afternoon Session:
- ------------------
4:30-4:50 X. Wang, U. of Southern California and UCLA
Consistencies of Stability and Bifurcation
4:50-5:10 M. Casey, UCSD, San Diego, California
Computation Dynamics in Discrete-time Recurrent Nets
5:10-5:30 M. Cohen, Boston University, Massachusetts
Synthesis of Decision Regions in Dynamical Systems
5:30-5:50 F. Botelho, Memphis State University, Tennessee
Observability of Discrete and Analog Networks
5:50-6:10 U. Levin and K. Narendra,
OGI/CSE Portland/Oregon and Yale University,
Recursive Identification Using Feedforward Nets
6:10-6:30 Panel Discussion
7:00 All-workshop wrap-up
Max Garzon (preferred) garzonm@hermes.msci.memst.edu
Math Sciences garzonm@memstvx1.memst.edu
Memphis State University Phone: (901) 678-3138/-2482
Memphis, TN 38152 USA Fax: (901) 678-2480/3299
------------------------------
End of Neuron Digest [Volume 12 Issue 14]
*****************************************
Posted-Date: Sun, 14 Nov 93 17:13:54 EST
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #15 (books, requests/queries, jobs)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Sun, 14 Nov 93 17:13:54 EST
Message-Id: <7326.753315234@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Sunday, 14 Nov 1993
Volume 12 : Issue 15
Today's Topics:
Book on connectionist modeling of commonsense reasoning
PhD program at Univ of Alabama
Cogneuro society announcement
Some information required.. thanks in advance..
Free Convolution Software (Educational Tool)
Combinatorial - Graph information
ANNs in Medicine and Signal processing?
new address and 2 questions
Thanking for replies
NN Parameters input/output control
PostDoctoral Fellowship.
NEUROSCIENCE/BIOMEDICAL ENGINEERING FACULTY POSITION BU
Postdoc Position Available
Position available
C Programmer Wanted
Post-doc position at LBL Human Genome Center
----------------------------------------------------------------------
Subject: Book on connectionist modeling of commonsense reasoning
From: rsun@athos.cs.ua.edu (Ron Sun)
Date: Fri, 22 Oct 93 14:49:47 -0600
A book (monograph) on connectionist models is available now from John
Wiley and Sons, Inc.
===================================================================
Integrating Rules and Connectionism for Robust Commonsense Reasoning
by: Ron Sun
Assistant Professor
Department of Computer Science
The University of Alabama
Tuscaloosa, AL 35487
===================================================================
Anyone interested in the book should contact John Wiley and Sons, Inc.
at 1-800-call-wil
- ------------------------------------------------------------------
A brief description is as follows:
One of the more difficult problems for artificial intelligence research
is the problem of modeling commonsense reasoning. Traditional models
have great difficulties in capturing the flexible and robust nature of
commonsense reasoning. This book attempts to tackle this problem by
adopting innovative approaches. In a nutshell, it is concerned with
understanding and modeling commonsense reasoning with a combination of
rules and similarities, under a connectionist rubric.
The book surveys the areas of reasoning, connectionist models, inheritance,
causality, rule-based systems, and similarity-based reasoning; it introduces
a new framework and a novel connectionist architecture for modeling
commonsense reasoning that synthesizes some of these areas.
Along with this framework, the book discusses a set of interrelated new
ideas regarding the modeling of commonsense reasoning, ideas highly relevant
to current artificial intelligence and cognitive science research and to the
ongoing methodological debate.
- ------------------------------------------------------------------
Table of Contents
Foreword (by David Waltz, NEC Research Institute)
Preface
1 Introduction
1.1 Overview
1.2 Commonsense Reasoning
1.3 The Problem of Common Reasoning Patterns
1.4 What is the Point?
1.5 Some Clarifications
1.6 The Organization
1.7 Summary
2 Accounting for Commonsense Reasoning: A Framework with Rules and Similarities
2.1 Overview
2.2 Examples of Reasoning
2.3 Patterns of Reasoning
2.4 Brittleness of Rule-Based Reasoning
2.5 Towards a Solution
2.6 Some Reflections on Rules and Connectionism
2.7 Summary
3 A Connectionist Architecture for Commonsense Reasoning
3.1 Overview
3.2 A Generic Architecture
3.3 Fine-Tuning --- from Constraints to Specifications
3.4 Summary
3.5 Appendix
4 Evaluations and Experiments
4.1 Overview
4.2 Accounting for the Reasoning Examples
4.3 Evaluations of the Architecture
4.4 Systematic Experiments
4.5 Choice, Focus and Context
4.6 Reasoning with Geographical Knowledge
4.7 Applications to Other Domains
4.8 Summary
4.9 Appendix: Determining Similarities and CD representations
5 More on the Architecture: Logic and Causality
5.1 Overview
5.2 Causality in General
5.3 Shoham's Causal Theory
5.4 Defining FEL
5.5 Accounting for Commonsense Causal Reasoning
5.6 Determining Weights
5.7 Summary
5.8 Appendix: Proofs For Theorems
6 More on the Architecture: Beyond Logic
6.1 Overview
6.2 Further Analysis of Inheritance
6.3 Analysis of Interaction in Representation
6.4 Knowledge Acquisition, Learning, and Adaptation
6.5 Summary
7 An Extension: Variables and Bindings
7.1 Overview
7.2 The Variable Binding Problem
7.3 First-Order FEL
7.4 Representing Variables
7.5 A Formal Treatment
7.6 Dealing with Difficult Issues
7.7 Compilation
7.8 Correctness
7.9 Summary
7.10 Appendix
8 Reviews and Comparisons
8.1 Overview
8.2 Rule-Based Reasoning
8.3 Case-Based Reasoning
8.4 Connectionism
8.5 Summary
9 Conclusions
9.1 Overview
9.2 Some Accomplishments
9.3 Lessons Learned
9.4 Existing Limitations
9.5 Future Directions
9.6 Summary
References
- ---------------------------------------------------------------------
------------------------------
Subject: PhD program at Univ of Alabama
From: rsun@athos.cs.ua.edu (Ron Sun)
Date: Fri, 22 Oct 93 14:52:25 -0600
The Ph.D program in Computer Science at the University of Alabama is now
accepting applications for 1994. Graduate assistantships and other forms
of financial support for graduate students are available. Prospective
graduate students interested in AI, neural networks, and other related
areas are especially encouraged to apply.
The Department of Computer Science at UA has identified graduate
education and research as its primary missions. The department is
conducting high-quality research in a number of areas:
Artificial Intelligence
connectionist models and neural networks
knowledge representation and common sense reasoning
cognitive modeling
learning
fuzzy logic and expert systems
Algorithms
graph algorithms and parallel computation
Database Systems
real-time databases and information mining
Human Interfaces
Software Engineering
object-oriented development, verification and validation
For detailed information and the graduate brochure, contact:
Graduate Program Admission
Department of Computer Science
The University of Alabama
Tuscaloosa, AL 35487
csdept@cs.ua.edu
- -----------------------------------------------------------------
The Computer Science Department was established in 1978 within the College
of Engineering. Degrees are awarded at the Bachelor's, Master's, and Ph.D.
levels. The program is accredited by the Computer Science Accreditation Board.
The Department has access to a wide range of computing
facilities, including the Department's own network of workstations.
The University is a member of the Alabama Supercomputer Network, with access
to the Cray X-MP/24.
Funding is available from the department, the University, and a wide
range of external sources, including the National Science Foundation,
DARPA, the US Air Force, the US Army, and so on.
The University of Alabama is a comprehensive research university
enrolling some 19,000 students. It is one of the oldest state
universities in the nation and recently celebrated its sesquicentennial.
The University offers programs in many areas, ranging from
the sciences and engineering to business administration, education and
the fine arts.
- -----------------------------------------------------------------
Dr. Ron Sun
Department of Computer Science phone: (205) 348-6363
The University of Alabama fax: (205) 348-8573
Tuscaloosa, AL 35487 rsun@cs.ua.edu
------------------------------
Subject: Cogneuro society announcement
From: Kimball Collins <kpc@ptolemy.arc.nasa.gov>
Date: Sun, 24 Oct 93 14:21:13 -0800
A tip from Mark Wessinger:
- ------- Start of forwarded message -------
************************************************************
ANNOUNCING THE FORMATION
OF A NEW SCIENTIFIC SOCIETY
AND CALL FOR ABSTRACTS
COGNITIVE NEUROSCIENCE SOCIETY
************************************************************
Inaugural Meeting
March 27-29, 1994
Fairmont Hotel
San Francisco, CA
Scheduled Symposia:
Sunday Mar 27th
Mike Gazzaniga (Chair) "Cognitive Neuroscience"
(w/ Bizzi, Konishi, Newsome)
Daniel Schacter (Chair) "Memory Systems" (w/ Nadel, Shapiro, Squire,
Tulving)
Monday Mar 28th
Mike Posner (Chair) "Plasticity and Acquisition of Expertise" (w/ Neville,
Curran, Petersen, McCandliss)
Terrence Sejnowski "Adaptive Cortical Representations" (w/ Merzenich,
Ramachandran, Montague)
Tuesday Mar 29th
George R. Mangun "Mechanisms and Models of Attention" (w/ Treisman,
Hillyard, Reuter-Lorenz, Corbetta)
Richard Andersen "Modeling Methods for Neural Algorithms" (w/ Koch, Mead,
Douglas)
Abstracts for Poster Presentations must be received
by January 1, 1994
Meeting Administration/Reg Fee $25: Poster Abstract Fee $15
Hotel Reservation (meeting hotel only)
Fairmont Hotel
950 Mason Street
San Francisco, CA 94108
Tel. (800) 527-4727 (reservations)
Tel. (415) 772-5000
FAX (415) 772-5013
(Room rate for double or single $125 for meeting)
For more information write:
Cognitive Neuroscience Society
Attn: Flo Batt
Center for Neuroscience, University of California
Davis, CA 95616 USA
FAX (916)757-8827
email: fabatt@ucdavis.edu
or see:
Journal of Cognitive Neuroscience, Fall Issue 1993 for
Abstract form and information.
****************************************************
- ------- End of forwarded message -------
------------------------------
Subject: Some information required.. thanks in advance..
From: SANYAL@tifrvax.tifr.res.in
Date: Fri, 29 Oct 93 10:25:00 +0430
HELP REQUIRED
I am looking for complete details of the following proceedings:
1. Proceedings of the First International Workshop on Frontiers in
Handwriting Recognition, held in Montreal, Canada.
2. Proceedings of the Second International Workshop on Frontiers in
Handwriting Recognition, held in Bonas, France.
3. Proceedings of the Third International Workshop on Frontiers in
Handwriting Recognition, held in Buffalo, NY, USA.
We are interested in procuring these proceedings and therefore require
details such as: publisher, editor name, price, and other relevant
information.
Thanks in advance.
With warm regards,
S Sanyal.
From:
Dr S Sanyal,
CSC Group,
TIFR,
Bombay - 400005, INDIA.
email: sanyal@tifrvax.tifr.res.in
OR
sanyal@tifrvax.bitnet
------------------------------
Subject: Free Convolution Software (Educational Tool)
From: kk@ee.umr.edu
Date: Fri, 29 Oct 93 12:23:40 -0600
Contributed by: Kurt Kosbar <kk@ee.umr.edu>
FREE EDUCATIONAL SOFTWARE PACKAGE
P. C. CONVOLUTION
P.C. Convolution is an educational software package that graphically
demonstrates the convolution operation. It runs on IBM PC type computers
using DOS 4.0 or later. It is currently being used at over 70 schools
throughout the world in departments of Electrical Engineering, Physics,
Mathematics, Chemical Engineering, Chemistry, Crystallography, Geography,
Geophysics, Earth Science, Acoustics, Phonetics & Linguistics, Biology,
Astronomy, Ophthalmology, Communication Sciences, Business, Aeronautics,
Biomechanics, Hydrology and Experimental Psychology.
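The operation the package animates is ordinary discrete convolution: flip one sequence, slide it past the other, and sum the products at each overlap. A minimal reference implementation (Python with NumPy; this code is not part of the package itself):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
h = np.array([0.0, 1.0, 0.5])

def conv(x, h):
    """Full discrete convolution: y[n] = sum_k x[k] * h[n - k]."""
    y = np.zeros(len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(x)):
            if 0 <= n - k < len(h):
                y[n] += x[k] * h[n - k]
    return y

print(conv(x, h))            # [0.0, 1.0, 2.5, 4.0, 1.5]
print(np.convolve(x, h))     # same result from NumPy's built-in
```

The output has length len(x) + len(h) - 1, one sample for every shift at which the two sequences overlap.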
Anyone may download a demonstration version of this software via anonymous
ftp from 131.151.4.11 (file name /pub/pc_conv.zip)
University instructors may obtain a free, fully operational version by
contacting Dr. Kurt Kosbar at the address listed below.
Dr. Kurt Kosbar
117 Electrical Engineering Building, University of Missouri - Rolla
Rolla, Missouri, USA 65401, phone: (314) 341-4894
e-mail: kk@ee.umr.edu
------------------------------
Subject: Combinatorial - Graph information
From: ARDESHIR <CIVAB@VAXA.HERIOT-WATT.AC.UK>
Date: Thu, 04 Nov 93 10:32:00 +0000
Some time ago I sent a request asking for references relating to graph
partitioning, matching and combinatorial optimization using neural nets.
I wish to sincerely thank those few who responded, in particular M.
Ohlsson, C. Peterson, B. Söderberg, T. Grossman, A. Jagota and R. Lister.
Using the information received and my own research, I have compiled the
following references; I can provide ftp addresses for only some of them.
The following publications may be collected via ftp.
- --- ftp address: ftp.cs.buffalo.edu -> /users/jagota ----------
Arun Jagota, Efficiently Approximating Max-Clique in a Hopfield-style
Network, 1992.
Arun Jagota, Optimization by Reduction to Maximum Clique, 1993
Arun Jagota, A New Hopfield-style Network for Content-addressable Memories,
1990.
Arun Jagota, The Hopfield-style Network as a Maximal-Clique Graph Machine,
1990.
----------------------------------------------
- ---ftp address: cis.ohio-state.edu -> /pub/neuroprose ----------
There is a wide range of papers relating to neural networks which you may
wish to choose from this address.
----------------------------------------------
- ---ftp address: svr-ftp.eng.cam.ac.uk -> /reports --------------
S. V. Aiyer, Solving Combinatorial Optimization Problems Using Neural Networks
with Application in Speech Recognition, 1989. (Note: This is a PhD thesis,
over 100 pages.)
A. H. Gee, Problem Solving with Optimization Networks, 1993. (Note: This is a
PhD thesis, over 130 pages.)
In addition to the above theses, there are several other papers which you may
collect by ftp.
--------------------------------------------------
The following is a list of publications regarding graphs and combinatorial
optimization using neural nets whose availability by ftp I am not aware of;
you may have to order them through your library, etc.
M. Ohlsson, C. Peterson and B. Söderberg, Neural Networks for Optimization
with Inequality Constraints - the Knapsack Problem. Lund Preprint LU TP 92-11
(submitted to Neural Computation) (1991).
M. Ohlsson, C. Peterson and A. Yuille, Track Finding with Deformable
Templates - The Elastic Arms Approach, Lund Preprint LU TP 91-27 (to appear
in Computer Physics Communications) 1991.
L. Gislén, B. Söderberg, and C. Peterson, Scheduling High Schools with Neural
Networks, Lund Preprint LU TP 91-9 (submitted to Neural Computation), 1991.
L. Gislén, B. Söderberg, and C. Peterson, Teachers and Classes with Neural
Networks, International Journal of Neural Systems 1, 167, 1989.
C. Peterson, Parallel Distributed Approaches to Combinatorial Optimization
Problems - Benchmark Studies on TSP, Neural Computation 2, 261, 1990.
C. Peterson and B. Söderberg, A New Method for Mapping Optimization Problems
onto Neural Networks, NATURE 326, 689, 1987.
C. Peterson, J. R. Anderson, Neural Networks and NP-complete Optimization
Problems; A Performance Study on the Graph Bisection Problem, Complex Systems
Publications, Inc. 1988.
You may find other works from the authors from the reference sections of
the above papers.
---------------------------------------------------
T. Grossman, Applying the INN model to the MaxClique Problem, LA-UR-93-3082,
1993.
T. Grossman and A. Jagota, On the Equivalence of Two Hopfield-type Networks,
in the proc. of ICNN, pp. 1063-1068, (IEEE) 1993.
T. Grossman, The INN model as an associative memory, submitted, and PhD.
thesis, Weizmann Inst. of Science, 1992.
-------------------------------------------------------
D. E. Van Den Bout, T. K. Miller, III, Graph Partitioning Using Annealed
Neural Networks, IEEE Transactions on Neural Networks, Vol. 1, No. 2,
June 1990.
and you may find some more references from the above paper(s).
-----------------------------------------------------------
Although I am certain there are many other publications, I hope the
above references will be useful to everyone.
I have myself recently begun researching combinatorial optimization,
graph partitioning and matching using neural nets, in order to apply
them to structural engineering problems.
Thank You.
Ardeshir Bahreininejad
Civil & Offshore Engineering Dept.
Heriot-Watt University
Riccarton, Edinburgh
EH14 4AS, Scotland
U.K.
------------------------------
Subject: ANNs in Medicine and Signal processing?
From: reksio!p011bb@reksio.pb.bialystok.pl (bartlomiej bulkszas)
Date: Thu, 04 Nov 93 16:44:17 -0500
Bartlomiej Bulkszas
Institute of Informatics
Zwierzyniecka 4/1012
15-333 Bialystok
E-mail: p011bb@reksio.pb.bialystok.pl
I would like to find some information on the applications of neural
networks to Medicine and Signal Processing. If there are books, journals,
papers, etc. in this field, please let me know about them. Moreover, any
ftp address where documents and papers about this field are available is
welcome.
Thank you for your time,
Bart Bulkszas
------------------------------
Subject: new address and 2 questions
From: "Martin Vojacek" <VOJACEK@dolni-gw.ms.mff.cuni.cz>
Organization: MFF, Charles University, Prague
Date: Thu, 04 Nov 93 18:00:26 -0500
Can somebody help me?
I'm Martin Vojacek from Prague.
Is there another e-mail conference about neural nets or neural cells
(from a mathematical, biological, theoretical, practical,... point of view)?
Do you know of any references in the literature on threshold
adaptation, i.e. on computing or learning the threshold?
Thank you very much for your effort.
Cooperation welcome.
Martin Vojacek vojacek@dolni.ms.mff.cuni.cz
vojacek@CsPgUk11.Bitnet
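On the threshold question above: a unit's threshold is commonly adapted by treating it as a bias weight on a constant input, so any weight-learning rule learns the threshold as well. A minimal sketch (illustrative only, not drawn from any reference in this digest):

```python
# Sketch (illustrative, not from any reference in this digest): the threshold
# of a unit can itself be learned by treating it as a bias weight on a
# constant input, so the ordinary perceptron rule adapts it too.
def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (inputs, target) pairs with target in {0, 1}.
    Returns (weights, threshold) learned together."""
    n = len(samples[0][0])
    w = [0.0] * n
    theta = 0.0                       # the threshold, adapted like a weight
    for _ in range(epochs):
        for x, t in samples:
            s = sum(wi * xi for wi, xi in zip(w, x))
            y = 1 if s >= theta else 0
            err = t - y
            for i in range(n):
                w[i] += lr * err * x[i]
            theta -= lr * err         # equivalent to a weight on a -1 input
    return w, theta

def predict(x, w, theta):
    """Fire iff the weighted sum reaches the learned threshold."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

# AND function: the threshold adapts until only (1, 1) reaches it.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, theta = train_perceptron(data)
```

The same trick carries over to multilayer nets trained by backpropagation, where the bias of each unit is simply one more trainable parameter.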
------------------------------
Subject: Thanking for replies
From: RAUL HECTOR GALLARD <gallardr@unslfm.edu.ar>
Date: Mon, 08 Nov 93 09:32:09 -0200
This mail is to publicly thank the colleagues who replied to our
announcement seeking experienced lecturers and researchers for
postgraduate courses at the Universidad Nacional de San Luis, Argentina.
A significant number of diverse proposals arrived here, some of them
lamentably past the deadlines, others now submitted to the faculty board
for selection. Independently of the choice that will be made, I want to
remark that important contacts have been established by means of the NNs
digest service, and we are very grateful for that as well.
Sincerely yours,
Prof. Raul Gallard.
Raul Hector Gallard
gallardr@unslfm.edu.ar
------------------------------
Subject: NN Parameters input/output control
From: Documentation_CRIQ@Infopuq.UQuebec.CA
Date: Mon, 08 Nov 93 16:02:24 -0500
We are looking for information and technical papers describing
experimental and/or new development of NEURAL NETWORKS OR EXPERT SYSTEMS
OR ADAPTIVE CONTROL AND COMPUTERS particularly adapted to a FLOTATION
CIRCUIT in the MINERAL PROCESSING INDUSTRY. All other information
covering areas where they can be applied, such as in the CHEMICAL
INDUSTRY, will be appreciated.
Rejean Corriveau
CRIQ- Canada
Documentation_criq@infopuq.uquebec.ca
------------------------------
Subject: PostDoctoral Fellowship.
From: P.Refenes@cs.ucl.ac.uk
Date: Tue, 26 Oct 93 21:14:23 -0100
Postdoctoral Fellowship
CALL FOR APPLICATIONS for a post doctoral research
fellowship on
NONLINEAR MODELLING IN FINANCIAL ENGINEERING
at: London Business School, Department of Decision Science.
Position: for up to 2 years (beginning Fall 1994; stipend:
$50,000 p.a.).
London Business School has been selected as one of the European
Host Institutes for the CEC Human Capital and Mobility Programme
and has been awarded a number of postdoctoral fellowships. The
NeuroForecasting Unit at the faculty of Decision Sciences has a
strong involvement in the application of neural networks to
financial engineering including asset pricing, tactical asset
allocation, equity investment, forex, etc. and would like to put
forward a candidate with a research proposal in neural network
analysis including parameter significance estimation in
multi-variate datasets, sensitivity analysis, and/or non-linear
dimensionality reduction in the context of factor models for
equity investment.
Candidates must hold a PhD in non-linear modelling or related
areas and have a proven research record. Normal HCM rules apply
i.e. only CEC nationals (excluding UK residents) are eligible.
CEC nationals that have been working overseas for the past two
years also qualify.
Interested candidates should send their curriculum vitae and
a summary of their research interests to:
Dr A. N. Refenes
NeuroForecasting Unit
Department of Decision Science
London Business School
Sussex Place, Regents Park,
London NW1 4SA, UK
Tel: ++ 44 (71) 262 50 50
Fax: ++ 44 (71) 724 78 75
------------------------------
Subject: NEUROSCIENCE/BIOMEDICAL ENGINEERING FACULTY POSITION BU
From: "Lucia M. Vaina" <VAINA@buenga.bu.edu>
Date: Tue, 26 Oct 93 21:21:14 -0500
BOSTON UNIVERSITY, Department of Biomedical Engineering has openings
for SEVERAL tenure-track faculty positions at the junior level.
Computational Vision, Medical Image Processing, and Neuroengineering are
among the areas of interest.
For details see the ad in Science, October 22, 1993.
Applicants should submit a CV, a one page summary of research
interests, and names and addresses of at least three references to:
Herbert Voigt Ph.D. Chairman Department of Biomedical Engineering
College of Engineering Boston University 44 Cummington str Boston, Ma
02215-2407
Consideration will be given to applicants who already hold a Ph.D. in a
field of engineering or a related field (e.g. physics) and have had at
least one year of postdoctoral experience. The target starting date for
the positions is September 1, 1994. Consideration of applications will
begin on November 1, 1993 and will continue until the positions are filled.
------------------------------
Subject: Postdoc Position Available
From: "Guido.Bugmann 2 Kerby Place"
<gbugmann@school-of-computing.plymouth.ac.uk>
Date: Fri, 29 Oct 93 12:58:01 +0000
Dear Connectionists,
The following position has been advertised
recently:
- ---------------------------------------------
University of Plymouth, Faculty of Technology.
School of Computing
Neurodynamics Research Group
Postdoctoral Research Fellow
(Salary within the range 13,140 - 17,286 pounds/year)
In this SERC-funded, three-year project based at
the University of Plymouth, you will carry out an
investigation into a novel, biologically-inspired,
neural network based learning control system.
The research will be carried out under the joint
supervision of Professor Mike Denham (University
of Plymouth) and Professor John G. Taylor (Kings
College London).
You should have, or be in the process of completing,
a PhD, either in a closely related area of research,
eg neural systems, adaptive learning systems, control
systems, or in a relevant discipline, eg mathematics,
cognitive sciences, but in the latter case you must be
able to demonstrate a strong interest in neural/adaptive
systems and control systems.
Please send a CV and the names of two referees
before 15 November 1993 to:
Prof. Mike Denham
Neurodynamics Research Group
School of Computing
University of Plymouth
Plymouth PL4 8AA
United Kingdom
Phone (+44) 752 23 25 47/41
Fax (+44) 752 23 25 40
- -------------------------------
With my best regards
Dr. Guido Bugmann
(same address as above)
Phone (+44) 752 23 25 66/41
Fax (+44) 752 23 25 40
email: gbugmann@sc.plym.ac.uk
------------------------------
Subject: Position available
From: janetw@cs.uq.oz.au
Date: Mon, 01 Nov 93 14:57:39 -0500
The following advert is for a position in the Department of
Computer Science, at the University of Queensland, at the
highest level of the academic scale. It is open to any area
of computing, and hence may interest researchers in cognitive
science and/or neural networks.
UQ has a strong inter-disciplinary cognitive science program
between the departments of computer science, psychology, linguistics
and philosophy, and neural network research groups in computer
science, psychology and engineering.
The University is one of the best in Australia, and Brisbane has a
delightful climate, situated on the coastal plain between the
mountains and the sea.
Inquiries can be directed to Professor Andrew Lister, as mentioned
below, or I am happy to answer informal questions about the Department,
University or other aspects of academic life in Brisbane.
Janet Wiles
Departments of Computer Science and Psychology
University of Queensland QLD 4072 AUSTRALIA
email: janetw@cs.uq.oz.au
- -------------------------------
UNIVERSITY OF QUEENSLAND
PROFESSOR OF COMPUTER SCIENCE
The successful applicant will have an outstanding record of research
leadership and achievement in Computer Science. Teaching experience is
expected, as is demonstrable capacity for collaboration with industry
and attraction of external funds.
The appointee will be expected to contribute substantially to
Departmental research, preferably in a field which can exploit or
extend current strengths. He or she will also be expected to teach at
both undergraduate and postgraduate levels, and to contribute to
Departmental policy making. Capacity and willingness to assume the
Department headship at an appropriate time will be an important
selection criterion.
The Department is one of the strongest in Australia with 26 full-time
academic staff, including 5 other full Professors, over 40 research
staff, and 23 support staff. There are around 500 equivalent full-time
students, with a large postgraduate school including 55 PhD students.
The Department has been designated by the Federal Government as the Key
Centre for Teaching and Research in Software Technology. The
Department also contains the Software Verification Research Centre, a
Special Research Centre of the Australian Research Council, and is a
major partner in the Cooperative Research Centre for Distributed
Systems Technology.
Current research strengths include formal methods and tools for
software development, distributed systems, information systems,
programming languages, cognitive science, and algorithm design and
analysis.
Salary: $77,900 plus superannuation. A market loading may be payable
in some circumstances.
For further information please contact the Head of Department,
Professor Andrew Lister (lister@cs.uq.oz.au), 07-365 3168 or
international +61 7 365 3168.
Applications: (4 copies) should be made to the Director, Personnel
Services, The University of Queensland, St Lucia, Queensland 4072,
Australia.
Closing Date: 10 Jan 1994
------------------------------
Subject: C Programmer Wanted
From: "Eric B. Baum" <eric@research.nj.nec.com>
Date: Fri, 05 Nov 93 16:30:44 -0500
C Programmer Wanted.
Note- this job may be more interesting than most postdocs. May pay better
too, if successful applicant has substantial commercial experience.
Prerequisites: Experience in getting large programs to work.
Some mathematical sophistication, *at least* equivalent
of good undergraduate degree in math, physics, theoretical
computer science, or related field.
Salary: Depends on experience.
Job: Implementing various novel algorithms. For example, implementing an
entirely new approach to game tree search. Conceivably this could
lead into a major effort to produce a championship chess program
based on novel strategy, and on novel use of learning algorithms.
Another example, implementing novel approaches to Travelling Salesman
Problem. Another example, experiments with RTDP (TD learning.)
Algorithms are *not* exclusively neural.
These projects are at the leading edge of algorithm research,
so expect the work to be both interesting and challenging.
Term-contract position.
To apply please send cv, cover letter and list of references to:
Eric Baum, NEC Research Institute, 4 Independence Way, Princeton NJ 08540,
or PREFERABLY by internet to eric@research.nj.nec.com
Equal Opportunity Employer M/F/D/V
- -------------------------------------
Eric Baum
NEC Research Institute, 4 Independence Way, Princeton NJ 08540
PHONE:(609) 951-2712, FAX:(609) 951-2482, Inet:eric@research.nj.nec.com
------------------------------
Subject: Post-doc position at LBL Human Genome Center
From: ed@rtsg.ee.lbl.gov (Ed Theil)
Date: Thu, 11 Nov 93 12:05:03 -0800
Post-Doctoral Appointment - one year, with possibility of renewal
DUTIES - Essential
Working under general supervision in the Human Genome Computing Group, develop
new approaches and algorithms for improved automatic and interactive DNA
base calling using DSP or neural net techniques.
Responsibilities will include writing interface software or
enhancing existing programs for seamless data acquisition from commercial
sequencers; close interaction with biologists engaged in producing
the sequence data; adaptation and development of analytical and graphical
software, including artificial neural networks for base calling and analysis;
collaboration with other investigators working in this area; writing
documentation, research reports and papers.
QUALIFICATIONS: - Essential
Candidates must have strong analytical skills and a background in math,
statistics or the demonstrated equivalent; significant experience using Unix
workstations and the C programming language; excellent interpersonal
skills to engage in effective communication and collaboration.
Must be able to demonstrate a high level of
accomplishment on comparable scientific projects.
Candidates must have a Ph.D. in a scientific field relevant to this position.
- - Desirable:
A background in modern biology, especially genetics or molecular biology;
working knowledge of or experience in the Human Genome Project,
especially issues associated with automatic base calling;
previous experience programming and using neural networks;
prior work with C++, Smalltalk or other object-oriented programming languages.
SALARY RANGE: upper 30's to low 40's.
LOCATION: Lawrence Berkeley Laboratory is a DOE-funded National Laboratory
operated by the University of California in the hills above
the Berkeley campus.
PROJECT: LBL's Genome Computing Group works with biologists,
instrumentation engineers, and other computer scientists at LBL and around the
world to develop and support tools for genome research.
CONTACT: Dr. Edward Theil by e-mail via address above, or
M/S 46A-1123
Lawrence Berkeley Laboratory
Berkeley, CA 94720
------------------------------
End of Neuron Digest [Volume 12 Issue 15]
*****************************************
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #16 (conferences)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Sun, 14 Nov 93 18:18:13 EST
Neuron Digest Sunday, 14 Nov 1993
Volume 12 : Issue 16
Today's Topics:
CONGRESS: COMPUTATIONAL MEDICINE AND PUBLIC HEALTH (long)
CFP -- IEEE APCCAS'94
IEEE ANNs for Signal Processing CFP
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: CONGRESS: COMPUTATIONAL MEDICINE AND PUBLIC HEALTH (long)
From: mwitten@hermes.chpc.utexas.edu
Date: Tue, 14 Sep 93 13:04:47 -0600
** NOTE CHANGES IN SUBMISSION AND REGISTRATION DEADLINES **
FIRST WORLD CONGRESS
ON COMPUTATIONAL MEDICINE, PUBLIC HEALTH AND
BIOTECHNOLOGY
24-28 April 1994
Hyatt Regency Hotel
Austin, Texas
- ----- (Feel Free To Cross Post This Announcement) ----
1.0 CONFERENCE OVERVIEW: With increasing frequency, computational
sciences are being exploited as a means with which to investigate
biomedical processes at all levels of complexity; from molecular to
systemic to demographic. Computational instruments are now used, not only
as exploratory tools but also as diagnostic and prognostic tools. The
appearance of high performance computing environments has, to a great
extent, removed the problem of increasing the biological reality of the
mathematical models. For the first time in the history of the field,
practical biological reality is finally within the grasp of the
biomedical modeler. Mathematical complexity is no longer as serious an
issue, since computation speeds are now of the order necessary to allow
extremely large and complex computational models to be analyzed. Large
memory machines are now routinely available. Additionally, high speed,
efficient, highly optimized numerical algorithms are under constant
development. As these algorithms are understood and improved upon, many
of them are transferred from software implementation to an implementation
in the hardware itself; thereby further enhancing the available
computational speed of current hardware. The purpose of this congress is
to bring together a transdisciplinary group of researchers in medicine,
public health, computer science, mathematics, nursing, veterinary
medicine, ecology, allied health, as well as numerous other disciplines,
for the purposes of examining the grand challenge problems of the next
decades. This will be a definitive meeting in that it will be the first
World Congress of its type and will be held as a follow-up to the very
well received Workshop On High Performance Computing In The Life Sciences
and Medicine held by the University of Texas System Center For High
Performance Computing in 1990.
Young scientists (graduate students, postdocs, etc.)
are encouraged to attend and to
present their work in this increasingly interesting
discipline. Funding is being solicited from NSF, NIH,
DOE, Darpa, EPA, and private foundations, as well as
other sources to assist in travel support and in the
offsetting of expenses for those unable to attend
otherwise. Papers, poster presentations, tutorials,
focused topic workshops, birds of a feather groups,
demonstrations, and other suggestions are also
solicited.
2.0 CONFERENCE SCOPE AND TOPIC AREAS: The Congress
has a broad scope. If you are not sure
whether or not your subject fits the Congress
scope, contact the conference organizers at one
of the addresses below.
Subject areas include but are not limited to:
*Visualization/Sonification
--- medical imaging
--- molecular visualization as a clinical
research tool
--- simulation visualization
--- microscopy
--- visualization as applied to problems
arising in computational molecular
biology and genetics or other non-traditional
disciplines
--- telemedicine
*Computational Molecular Biology and Genetics
--- computational ramifications of clinical
needs in the Human Genome, Plant Genome,
and Animal Genome Projects
--- computational and grand challenge problems in
molecular biology and genetics
--- algorithms and methodologies
--- issues of multiple datatype databases
*Computational Pharmacology, Pharmacodynamics,
Drug Design
*Computational Chemistry as Applied to Clinical Issues
*Computational Cell Biology, Physiology,
and Metabolism
--- Single cell metabolic models (red blood cell)
--- Cancer models
--- Transport models
--- Single cell interaction with external factors
models (laser, ultrasound, electrical stimulus)
*Computational Physiology and Metabolism
--- Renal System
--- Cardiovascular dynamics
--- Liver function
--- Pulmonary dynamics
--- Auditory function, cochlear dynamics, hearing
--- Reproductive modeling: ovarian dynamics,
reproductive ecotoxicology, modeling the
hormonal cycle
--- Metabolic Databases and metabolic models
*Computational Demography, Epidemiology, and
Statistics/Biostatistics
--- Classical demographic, epidemiologic,
and biostatistical modeling
--- Modeling of the role of culture, poverty,
and other sociological issues as they
impact healthcare
--- Morphometrics
*Computational Disease Modeling
--- AIDS
--- TB
--- Influenza
--- Statistical Population Genetics Of Disease
Processes
--- Other
*Computational Biofluids
--- Blood flow
--- Sperm dynamics
--- Modeling of arteriosclerosis and related
processes
*Computational Dentistry, Orthodontics, and
Prosthetics
*Computational Veterinary Medicine
--- Computational issues in modeling non-human
dynamics such as equine, feline, canine dynamics
(physiological/biomechanical)
*Computational Allied Health Sciences
--- Physical Therapy
--- Neuromusic Therapy
--- Respiratory Therapy
*Computational Radiology
--- Dose modeling
--- Treatment planning
*Computational Surgery
--- Simulation of surgical procedures in VR worlds
--- Surgical simulation as a precursor to surgical
intervention
--- The Visible Human
*Computational Cardiology
*Computational Nursing
*Computational Models In Chiropractic
*Computational Neurobiology and Neurophysiology
--- Brain modeling
--- Single neuron models
--- Neural nets and clinical applications
--- Neurophysiological dynamics
--- Neurotransmitter modeling
--- Neurological disorder modeling (Alzheimer's
Disease, for example)
--- The Human Brain Project
*Computational Models of Psychiatric and Psychological
Processes
*Computational Biomechanics
--- Bone Modeling
--- Joint Modeling
*Computational Models of Non-traditional Medicine
--- Acupuncture
--- Other
*Computational Issues In Medical Instrumentation
Design and Simulation
--- Scanner Design
--- Optical Instrumentation
*Ethical issues arising in the use of computational
technology in medical diagnosis and simulation
*The role of alternate reality methodologies
and high performance environments in the medical and
public health disciplines
*Issues in the use of high performance computing
environments in the teaching of health science
curricula
*The role of high performance environments
for the handling of large medical datasets (high
performance storage environments, high performance
networking, high performance medical records
manipulation and management, metadata structures
and definitions)
*Federal and private support for transdisciplinary
research in computational medicine and public health
3.0 CONFERENCE COMMITTEE
*CONFERENCE CHAIR: Matthew Witten, UT System Center
For High Performance Computing, Austin, Texas
m.witten@chpc.utexas.edu
*CURRENT CONFERENCE DIRECTORATE:
Regina Monaco, Mt. Sinai Medical Center
Dan Davison, University of Houston
Chris Johnson, University of Utah
Lisa Fauci, Tulane University
Daniel Zelterman, University of Minnesota Minneapolis
James Hyman, Los Alamos National Laboratory
Richard Hart, Tulane University
Dennis Duke, SCRI-Florida State University
Sharon Meintz, University of Nevada Las Vegas
Dean Sittig, Vanderbilt University
Dick Tsur, UT System CHPC
Dan Deerfield, Pittsburgh Supercomputing Center
Istvan Gyori, University of Veszprem (Hungary)
Don Fussell, University of Texas at Austin
Ken Goodman, University Of Miami School of Medicine
Martin Hugh-Jones, Louisiana State University
Stuart Zimmerman, MD Anderson Cancer Research Center
John Wooley, DOE
Sylvia Spengler, University of California Berkeley
Robert Blystone, Trinity University
Gregory Kramer, Santa Fe Institute
Franco Celada, NYU Medical Center
David Robinson, NIH, NHLBI
Jane Preson, MCC
Peter Petropoulos, Brooks Air Force Base
Marcus Pandy, University of Texas at Austin
George Bekey, University of Southern California
Stephen Koslow, NIH, NIMH
Fred Bookstein, University of Michigan Ann Arbor
Dan Levine, University of Texas at Arlington
Richard Gordon, University of Manitoba (Canada)
Stan Zeitz, Drexel University
Marcia McClure, University of Nevada Las Vegas
Renato Sabbatini, UNICAMP/Brazil (Brazil)
Hiroshi Tanaka, Tokyo Medical and Dental University (Japan)
Shusaku Tsumoto, Tokyo Medical and Dental University (Japan)
Additional conference directorate members are
being added and will be updated on the anonymous
ftp list as they agree.
4.0 CONTACTING THE CONFERENCE COMMITTEE: To contact
the congress organizers for any reason use any of the
following pathways:
ELECTRONIC MAIL - compmed94@chpc.utexas.edu
FAX (USA) - (512) 471-2445
PHONE (USA) - (512) 471-2472
GOPHER: log into the University of Texas System-CHPC
select the Computational Medicine and Allied Health
menu choice
ANONYMOUS FTP: ftp.chpc.utexas.edu
cd /pub/compmed94
POSTAL:
Compmed 1994
University of Texas System CHPC
Balcones Research Center
10100 Burnet Road, 1.154CMS
Austin, Texas 78758-4497
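The anonymous-ftp directions above can also be scripted. A minimal sketch using Python's standard ftplib, assuming only the host and directory named in the announcement (the file names themselves are unknown until the directory is listed, so nothing here hardcodes them):

```python
# Sketch: fetching the Compmed 1994 materials by anonymous FTP, using only
# the host and directory given in the announcement above. File names are
# unknown until the directory is listed, so none are hardcoded here.
from ftplib import FTP

HOST = "ftp.chpc.utexas.edu"   # from the announcement
DIRECTORY = "/pub/compmed94"   # from the announcement

def fetch_listing(host=HOST, directory=DIRECTORY):
    """Anonymous login, cd to the congress directory, return the file names."""
    ftp = FTP(host)
    ftp.login()                       # anonymous login
    ftp.cwd(directory)
    names = ftp.nlst()                # names of the announcement files
    ftp.quit()
    return names

def fetch_file(name, host=HOST, directory=DIRECTORY):
    """Download one file from the congress directory to the current directory."""
    ftp = FTP(host)
    ftp.login()
    ftp.cwd(directory)
    with open(name, "wb") as out:
        ftp.retrbinary("RETR " + name, out.write)
    ftp.quit()
```

This mirrors the interactive session (anonymous login, `cd /pub/compmed94`, `get`), just wrapped in functions so the whole directory can be mirrored in a loop.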
5.0 SUBMISSION PROCEDURES: Authors must submit 5
copies of a single-page 50-100 word abstract clearly
discussing the topic of their presentation. In
addition, authors must clearly state their choice of
poster, contributed paper, tutorial, exhibit, focused
workshop or birds of a feather group along with a
discussion of their presentation. Abstracts will be
published as part of the preliminary conference
material. To notify the congress organizing committee
that you would like to participate and to be put on
the congress mailing list, please fill out and return
the form that follows this announcement. You may use
any of the contact methods above. If you wish to
organize a contributed paper session, tutorial
session, focused workshop, or birds of a feather
group, please contact the conference director at
mwitten@chpc.utexas.edu . The abstract may be submitted
electronically to compmed94@chpc.utexas.edu or
by mail or fax. There is no official format.
6.0 CONFERENCE DEADLINES AND FEES: The following deadlines
should be noted:
1 November 1993 - Notification of intent to organize
a special session
15 December 1993 - Abstracts for talks/posters/
workshops/birds of a feather
sessions/demonstrations
15 January 1994 - Notification of acceptance of
abstract
15 February 1994 - Application for financial aid
1 April 1994 - Registration deadline
(includes payment of all fees)
Fees include lunches for three days, all conference
registration materials, the reception, and the sit-down
banquet:
$400.00 Corporate
$250.00 Academic
$150.00 Student
Students are required to submit verification of student
status. The verification of academic status form appears
appended to the registration form in this announcement.
Because financial aid may be available for minority
students, faculty, and for individuals from declared
minority institutions, you may indicate that you are
requesting financial aid as a minority individual.
Additionally, we anticipate some support for women to
attend. Application for financial aid is also appended
to the attached form.
7.0 CONFERENCE PRELIMINARY DETAILS AND ENVIRONMENT
LOCATION: Hyatt Regency Hotel, Austin, Texas, USA
DATES: 24-28 April 1994
The 1st World Congress On Computational Medicine,
Public Health, and Biotechnology will be held at the
Hyatt Regency Hotel, Austin, Texas located in
downtown Austin on the shores of Town Lake, also
known as the Colorado River. The Hyatt Regency has
rooms available for the conference participants at
a special rate of $79.00/night for single or double
occupancy, with a hotel tax of 13%. The Hyatt accepts
American Express, Diner's Club, Visa, MasterCard,
Carte Blanche, and Discover credit cards. This room
rate will be in effect until 9 April 1994 or until
the block of rooms is full. We recommend that you make
your reservations as soon as possible. You may make
your reservations by calling (512) 477-1234 or by
returning the enclosed reservation form. Be certain
to mention that you are attending the First World
Congress On Computational Medicine, Public Health,
and Biotechnology if you make your reservations by
telephone.
The hotel is approximately five miles (15 minutes)
from Robert Mueller Airport. The Hyatt offers
courtesy limousine service to and from the airport
between the hours of 6:00am and 11:00pm. You may call
them from the airport when you arrive. If you choose
to use a taxi, expect to pay approximately $8.00.
Automobiles may be rented, at the airport, from most
of the major car rental agencies. However, because of
the downtown location of the Congress and access to
taxis and to bus service, we do not recommend that you
rent an auto unless you are planning to drive
outside of the city.
Should you not be able to find an available room
at the Hyatt Regency, we have scheduled an "overflow"
hotel, the Embassy Suites, which is located directly
across the street from the Hyatt Regency. If, due to
travel expense restrictions, you are unable to stay
at either of these two hotels, please contact the
conference board directly and we will be more than
happy to find a hotel near the conference site that
should accommodate your needs.
Austin, the state capital, is renowned for its natural
hill-country beauty and an active cultural scene.
Several hiking and jogging trails are within walking
distance of the hotel, as well as opportunities for a
variety of aquatic sports. From the Hyatt, you can
"Catch a Dillo" downtown, taking a ride on our
delightful inner-city, rubber-wheeled trolley system.
In Austin's historic downtown area, you can take a
free guided tour through the State Capitol Building,
constructed in 1888. Or, you can visit the Governor's
Mansion, recognized as one of the finest examples of
19th Century Greek Revival architecture and housing
every Texas governor since 1856. Downtown you will
find the Old Bakery and Emporium, built by Swedish
immigrant Charles Lundberg in 1876 and the Sixth
Street/Old Pecan Street Historical District - a
seven-block renovation of Victorian and native stone
buildings, now a National Registered Historic District
containing more than 60 restaurants, clubs, and
shops to enjoy. The Laguna Gloria Art Museum, the
Archer M. Huntington Art Gallery, the LBJ Library and
Museum, the Neill-Cochran Museum House, and the Texas
Memorial Museum are among Austin's finest museums.
The Umlauf Sculpture Garden has become a major
artistic attraction. Charles Umlauf's sculptured works
are placed in a variety of elegant settings under a
canopy of trees. The Zilker Gardens contains many
botanical highlights such as the Rose Garden, Oriental
Garden, Garden of the Blind, Water Garden and more.
Unique to Austin is a large population of Mexican
free-tailed bats which resides beneath the Congress
Avenue Bridge. During the month of April, the Highland
Lakes Bluebonnet Trail celebrates spring's wildflowers
(a major attraction) as this self-guided tour winds
through the surrounding region of Austin and nearby
towns (you will need to rent a car for this one).
Austin offers a number of indoor shopping malls in
every part of the city: The Arboretum, Barton Creek
Square, Dobie Mall, and Highland Mall, to name a few.
Capital Metro, Austin's mass transit system, offers
low cost transportation throughout Austin. Specialty
shops, offering a wide variety of handmade crafts and
merchandise crafted by native Texans, are scattered
throughout the city and surrounding areas.
Dining out in Austin, you will have choices of
American, Chinese, Authentic Mexican, Tex-Mex,
Italian, Japanese, or nearly any other type of cuisine
you might wish to experience, with price ranges that
will suit anyone's budget. Live bands perform in
various nightclubs around the city and at night spots
along Sixth Street, offering a range of jazz, blues,
country/Western, reggae, swing, and rock music.
Daytime temperatures will be in the 80-90 degrees F
range, with fairly high humidity. Evening temperatures have been known
to drop down into the 50's (degrees F). Cold weather
is not expected, so be sure to bring lightweight
clothing with you. Congress exhibitor and vendor
presentations are also being planned.
8.0 CONFERENCE ENDORSEMENTS AND SPONSORSHIPS:
Numerous potential academic sponsors have been
contacted. Currently negotiations are underway
for sponsorship with SIAM, AMS, MAA, IEEE, FASEB, and
IMACS. Additionally AMA and ANA continuing medical
education support is being sought. Information
will be updated regularly on the anonymous ftp
site for the conference (see above). Currently,
funding has been generously supplied by the following
agencies:
University of Texas System - CHPC
U.S. Department of Energy
================== REGISTRATION FORM ===============
(Please list your name below as it will appear on badge.)
First Name :
Middle Initial (if applicable):
Family Name:
Your Professional Title:
[ ]Dr.
[ ]Professor
[ ]Mr.
[ ]Mrs.
[ ]Ms.
[ ]Other:__________________
Office Phone (desk):
Home/Evening Phone (for emergency contact):
Fax:
Electronic Mail (Bitnet):
Electronic Mail (Internet):
Postal Address:
Institution or Center:
Building Code:
Mail Stop:
Street Address1:
Street Address2:
City:
State:
Zip or Country Code:
Country:
Please list your three major interest areas:
Interest1:
Interest2:
Interest3:
Registration fee: $____________
Late fee $50 (if after April 1, 1994) $____________
**IF UT AUSTIN, PLEASE PROVIDE YOUR:
UNIVERSITY ACCT. #: ______________________
UNIVERSITY ACCT. TITLE: ______________________
NAME OF ACCT. SIGNER: ______________________
=====================================================
VERIFICATION OF STUDENT STATUS
Name:
Mailing Address:
University at which you are a student:
What level student(year):
Your student id number:
Name of your graduate or postdoctoral advisor:
Telephone number for your advisor:
By filling in this section, I agree that I am electronically
signing my signature to the statement that I am currently
a student at the above university.
=======================================================
REQUEST FOR FINANCIAL AID
Name:
Mailing Address:
I request financial assistance under one or more
of the following categories:
[ ] Student (You must fill out the Verification of Student
Status Section in order to be considered for
financial aid under this category)
[ ] Academic
[ ] Minority
[ ] Female
[ ] Black
[ ] Hispanic
[ ] Native American Indian
[ ] Other
This form is not meant to invade your personal privacy in
any fashion. However, some of the grant funds are targeted
at specific ethnic/minority groups and need to be expended
appropriately. None of these forms will be in any way
released to the public. And, after the congress, all of
the financial aid forms will be destroyed. No records will
be kept of ethnic or racial backgrounds.
If you have any questions concerning financial aid support,
please contact Matthew Witten at the above addresses.
==============================================================
------------------------------
Subject: CFP -- IEEE APCCAS'94
From: cww@15k.ee.nthu.edu.tw (Cheng-Wen Wu)
Date: Tue, 02 Nov 93 10:06:13 +0800
Second Call For Papers
APCCAS'94
IEEE ASIA-PACIFIC CONFERENCE ON
CIRCUITS AND SYSTEMS
Dec. 5-8, 1994
Grand Hotel, Taipei
THEMES
Circuits and Systems in Communication and Signal Processing
SOLICITED TOPICS
Analog and Active Filters, Adaptive Filters, Digital Filters,
Speech Processing, Video Signal Processing, HDTV Coding,
Communication Circuits, High Speed Electronics,
Neural Network Applications, B-ISDN,
Broadband Switching, Video Communication Networks,
Fiber to the Home,
Mobile and Personal Communications,
VLSI for Communications, CAD for VLSI,
Communication Problems in Power Systems
CO-SPONSORED BY
IEEE Circuits and Systems Society,
IEEE Taipei Section,
IEEE Communications Society,
IEEE Signal Processing Society
INCORPORATING
National Chiao Tung University,
National Taiwan University,
National Tsing Hua University,
and Local Institutes
TIME TABLE
Deadline for submission: May 1, 1994
Notification of acceptance: July 31, 1994
Camera-ready copy: September 15, 1994
Prospective authors are invited to submit 6 copies of a completed
manuscript in English, not longer than 20 A4 or 8.5" X 11" pages
to one of the Technical Program Co-chairs:
Dr. Tomonori Aoyama
Executive Manager
Intellectual Property Department
NTT, 1-6, Uchisaiwai-cho
1-Chome Chiyoda-ku, Tokyo 100, Japan
Fax:81-3-3509-3509
E-mail:tomonori@ntttsd.ntt.jp
Prof. Che-Ho Wei
National Chiao Tung University
Center for Telecommunications Research
Engineering Building 4, Hsin-Chu, Taiwan
Tel/Fax:+886-35-723283
E-mail:chwei@cc.nctu.edu.tw
Tutorial sessions will be organized on the day before the
conference.
Proposals are invited and can be submitted to (before May 1, 1994):
Prof. C. Bernard Shung
Department of Electronics Engineering
National Chiao Tung University
Hsin-Chu, Taiwan
Fax:+886-35-724361
E-mail:shung@johann.ee.nctu.edu.tw
Proposals for special sessions may be submitted to (before May
1, 1994):
Prof. Soo-Chang Pei
Department of Electrical Engineering
National Taiwan University
Taipei, Taiwan
Tel:+886-2-363-5251
Fax:+886-2-363-8247
ORGANIZATION COMMITTEE
General Conference Chairperson
Dr. Chung-Yu Wu, National Chiao Tung Univ., Taiwan
Technical Program Co-chairs
Dr. Tomonori Aoyama, NTT, Japan
Dr. Che-Ho Wei, National Chiao Tung Univ., Taiwan
Special Sessions Chairperson
Dr. Soo-Chang Pei, National Taiwan Univ., Taiwan
Tutorial Chairperson
Dr. C. Bernard Shung, National Chiao Tung Univ., Taiwan
Local Arrangement Chairperson
Dr. Chein-Wei Jen, National Chiao Tung Univ., Taiwan
Finance Chairperson
Dr. Wen-Zen Shen, National Chiao Tung Univ., Taiwan
Publications Chairperson
Dr. Youn-Long Lin, National Tsing Hua Univ., Taiwan
Publicity Chairpersons
Drs. Hsueh-Ming Hang \& Kuei-Ann Wen, National Chiao Tung Univ.,
Taiwan
International Liaison Chairperson
Dr. Cheng-Wen Wu, National Tsing Hua Univ., Taiwan
International Steering Committee Chairperson
Dr. T. Ohtsuki, VP, IEEE Region 10, Waseda University, Japan
------------------------------
Subject: IEEE ANNs for Signal Processing CFP
From: hwang@pierce.ee.washington.edu (Jenq-Neng Hwang)
Date: Mon, 01 Nov 93 19:34:35 -0800
1994 IEEE WORKSHOP ON
NEURAL NETWORKS FOR SIGNAL PROCESSING
September 6-8, 1994 Ermioni, Greece
Sponsored by the IEEE Signal Processing Society
(In cooperation with the IEEE Neural Networks Council)
CALL FOR PAPERS
Thanks to the sponsorship of IEEE Signal Processing Society, the
co-sponsorship of IEEE Neural Network Council, and the partial support
from Intracom S.A. Greece, the fourth of a series of IEEE workshops on
Neural Networks for Signal Processing will be held at the Porto Hydra
Resort Hotel, Ermioni, Greece, in September of 1994. Papers are
solicited for, but not limited to, the following topics:
APPLICATIONS:
Image, speech, communications, sensors, medical, adaptive
filtering, OCR, and other general signal processing and pattern
recognition topics.
THEORIES:
Generalization and regularization, system identification, parameter
estimation, new network architectures, new learning algorithms, and
wavelet in NNs.
IMPLEMENTATIONS:
Software, digital, analog, and hybrid technologies.
Prospective authors are invited to submit 4 copies of extended summaries
of no more than 6 pages. The top of the first page of the summary should
include a title, authors' names, affiliations, address, telephone and
fax numbers and email address if any. Camera-ready full papers
of accepted proposals will be published in a hard-bound volume by IEEE
and distributed at the workshop. Due to workshop facility constraints,
attendance will be limited, with priority given to those who submit
written technical contributions. For further information, please
contact Mrs. Myra Sourlou at the NNSP'94 Athens office,
(Tel.) +30 1 6644961, (Fax) +30 1 6644379, (e-mail) msou@intranet.gr.
Please send paper submissions to:
Prof. Jenq-Neng Hwang
IEEE NNSP'94
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195, USA
Phone: (206) 685-1603, Fax: (206) 543-3842
SCHEDULE
Submission of extended summary: February 15
Notification of acceptance: April 19
Submission of photo-ready paper: June 1
Advanced registration, before: June 1
GENERAL CHAIR
John Vlontzos
INTRACOM S.A.
Peania, Attica, Greece
jvlo@intranet.gr
PROGRAM CHAIR
Jenq-Neng Hwang
University of Washington
Seattle, Washington, USA
hwang@ee.washington.edu
PROCEEDINGS CHAIR
Elizabeth J. Wilson
Raytheon Co.
Marlborough, MA, USA
bwilson@sud2.ed.ray.com
FINANCE CHAIR
Demetris Kalivas
INTRACOM S.A.
Peania, Attica, Greece
dkal@intranet.gr
PROGRAM COMMITTEE
Joshua Alspector (Bellcore, USA)
Les Atlas (U. of Washington, USA)
Charles Bachmann (Naval Research Lab. USA)
David Burr (Bellcore, USA)
Rama Chellappa (U. of Maryland, USA)
Lee Giles (NEC Research, USA)
Steve J. Hanson (Siemens Corp. Research, USA)
Yu-Hen Hu (U. of Wisconsin, USA)
Jenq-Neng Hwang (U. of Washington, USA)
Bing-Huang Juang (AT&T Bell Lab., USA)
Shigeru Katagiri (ATR Japan)
Sun-Yuan Kung (Princeton U., USA)
Gary M. Kuhn (Siemens Corp. Research, USA)
Stephanos Kollias (National Tech. U. of Athens, Greece)
Richard Lippmann (MIT Lincoln Lab., USA)
Fleming Lure (Kaelum Research Co., USA)
John Makhoul (BBN Lab., USA)
Richard Mammone (Rutgers U., USA)
Elias Manolakos (Northeastern U., USA)
Nahesan Niranjan (Cambridge U., UK)
Tomaso Poggio (MIT, USA)
Jose Principe (U. of Florida, USA)
Wojtek Przytula (Hughes Research Lab., USA)
Ulrich Ramacher (Siemens Corp., Germany)
Bhaskar D. Rao (UC San Diego, USA)
Andreas Stafylopatis (National Tech. U. of Athens, Greece)
Noboru Sonehara (NTT Co., Japan)
John Sorensen (Tech. U. of Denmark, Denmark)
Yoh'ichi Tohkura (ATR, Japan)
John Vlontzos (Intracom S.A., Greece)
Raymond Watrous (Siemens Corp. Research, USA)
Christian Wellekens (Eurecom, France)
Yiu-Fai Issac Wong (Lawrence Livermore Lab., USA)
Barbara Yoon (ARPA, USA)
------------------------------
End of Neuron Digest [Volume 12 Issue 16]
*****************************************
Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
6219; Thu, 18 Nov 93 23:25:48 EST
Received: from noc4.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
TCP; Thu, 18 Nov 93 23:25:43 EST
Received: from CATTELL.PSYCH.UPENN.EDU by noc4.dccs.upenn.edu
id AA28838; Thu, 18 Nov 93 23:25:17 -0500
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
id AA13603; Thu, 18 Nov 93 21:50:33 EST
Posted-Date: Thu, 18 Nov 93 21:49:51 EST
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #17 (misc, jobs, queries, cybernetics, etc.)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Thu, 18 Nov 93 21:49:51 EST
Message-Id: <13597.753677391@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Thursday, 18 Nov 1993
Volume 12 : Issue 17
Today's Topics:
IWANNT-EPROCS - note that the correct public login is iwan_pub
Need refs on Spherical Harmonics
Oxford Connectionist Summer School
AI CD-ROM Revision 2 - press release
Adaptive Simulated Annealing (ASA) version 1.53
Neuron Submission
request for information
Academic position at USC
Re: Whence cybernetics
Vclamp and Cclamp programs?
Looking for a paper from some conference
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: IWANNT-EPROCS - note that the correct public login is iwan_pub
From: Bob Allen <rba@bellcore.com>
Date: Tue, 09 Nov 93 12:45:40 -0500
Subject: IWANNT'93 Electronic Proceedings
Electronic Proceedings for
1993 International Workshop on Applications of Neural Networks
to Telecommunications
1. Electronic Proceedings (EPROCS)
The Proceedings for the 1993 International Workshop on
Applications of Neural Networks to Telecommunications
(IWANNT'93) have been converted to electronic form and are
available in the SuperBook(TM) document browsing system. In
addition to the IWANNT'93 proceedings, you will be able to
access abstracts from the 1992 Bellcore Workshop on
Applications of Neural Networks to Telecommunications and
pictures of several of the conference attendees.
We would appreciate your feedback about the use of this
system. In addition, if you have questions, or would like a
personal account, please contact Robert B. Allen
(iwannt_allen@bellcore.com or rba@bellcore.com).
2. Accounts and Passwords
Public access is available with the account name: iwan_pub
Individual accounts and passwords were given to conference
participants. Annotations made by iwan_pub may be edited by the
electronic proceedings editor.
3. Remote Access Via Xwindows
From an Xwindow on a machine connected to the Internet, do the
following. Note that some locations may have a "firewall"
that prevents Xwindows applications from running. If this
procedure fails, you may have to find a machine outside your
firewall or use the character-based interface (csb).
+ xhost +128.96.58.4 (Xwindows display permission for
superbook.bellcore.com)
+ telnet 128.96.58.4
+ (login)
+ TERM=xterms (it is important to use "xterms")
+ enter your email address
+ Figure out and enter your machine's IP address
(in /etc/hosts or ask an administrator)
+ gxsb (Xwindows version of SuperBook)
3.1 Overview of Xwindows SuperBook Commands
When you login to SuperBook, you will obtain a Library
Window. For the IWANNT proceedings, you should select the
IWANNT shelf, highlight "Applications of Neural Networks to
Telecommunications" and click "Open". The Text Window
should be placed on the right side of the screen and the
Table-of-Contents Window should be placed on the left.
These windows can be resized.
Table-of-Contents (TOC): Books, articles, and sections
within books can be selected by clicking in the TOC. If the
entry contains subsections, it will be marked with a "+".
Double-clicking on those entries expands them. Clicking on
an expanded entry closes it.
Text Window: The text can be scrolled one-line-at-a-time
with the Scroll Bar Arrows or a page-at-a-time by clicking
on the spaces immediately above or below the Slider.
Graphics: Figures, tables, and some equations are presented
as bitmaps. The graphics can be viewed by clicking on the
blue icons at the right side of the text which pops up a
bitmap-viewer. Graphics can be closed by clicking on their
"close" button. Some multimedia applications have been
included, but these may not work correctly across the
Internet.
Searching: Terms can be searched in the text by typing them
into the Search-Window. Wild-card searches are possible, as in
term*. You can also search by clicking on a term in the text
(to clear that search and select another, do ctrl-click).
Annotations: Annotations are indicated with a pencil icon
and can be read by clicking on the icon. Annotations can be
created (with conference-attendee logins) by clicking in the
text with the left button and then typing in the annotation
window.
Exiting: Pull down the FILE menu on the Library Window to
"QUIT", and release.
4. Remote Access via character-based interface
From any machine connected to the Internet, do the following:
+ telnet 128.96.58.4 (for superbook.bellcore.com)
+ (login)
+ TERM=(termtype) (use "xterms" for an Xwindow inside
a firewall)
+ enter your email address
+ csb
4.1 Overview of csb SuperBook Commands
The character-based interface resembles emacs. You first
enter Library mode. After selecting a shelf (make sure you
are on the IWANNT shelf) and a book on that shelf (e.g.,
Applications of Neural Networks to Telecommunications), the
screen is split horizontally into two parts. The upper window
is the TOC and the lower window has the text.
Table-of-Contents (TOC): Books, articles, and sections
within books can be selected by typing the number beside
them in the TOC. If the entry contains subsections, it will
be marked with a "+".
Text Window: The text can be scrolled one-line-at-a-time
with the u/d keys or a page-at-a-time with the U/D keys.
Graphics: Most bitmapped graphics will not be available.
Searching: Terms can be searched in the text by typing them
into the Search-Window. Wild-card searches are possible, as in
term*. Searches are also possible by positioning the cursor over
a word and hitting RET.
Annotations: Annotations are indicated with an A on the
right edge of the screen. These can be read by entering an
A on the line on which they are presented. Annotations can
be created (given correct permissions) by entering A on any
line.
Exiting: Enter "Q"
------------------------------
Subject: Need refs on Spherical Harmonics
From: slehar@copley.bu.edu (Steve Lehar)
Organization: Boston University Center for Adaptive Systems
Date: 10 Nov 93 18:10:59 +0000
RADICAL NEW BRAIN THEORY
I have come up with a very interesting new theory on the way the brain
represents spatial patterns by way of harmonic resonant interactions
between electrically coupled neurons. The big advantage of using such
resonances is that a simple physical system, for example a bell, is
capable of responding to and reproducing fantastically complex spatial
patterns, i.e. the harmonics of the bell, in a remarkably robust
manner using simple dynamical interactions, and can encode those
complex spatial patterns in a simple and highly compressed rotation
invariant code, i.e. the oscillation frequency corresponding to the
harmonic pattern. This is a radically new theory, and suggests an
entirely new mode of communication between neurons in the brain. I
have performed a number of computer simulations which have confirmed
the validity of the theory by accurately reproducing a large number of
visual phenomena by way of a single simple mechanism.
NEED REFERENCES ON SPHERICAL HARMONICS
I would now like to extend the orientational harmonic theory to
explain three-dimensional harmonics, in order to explore the kinds of
spatial structures that can be represented by this kind of system. In
order to do this, I need to read up on spherical harmonics. I have
found a number of books in the library full of fantastically elaborate
differential equations, etc, but that is NOT what I am looking for.
What I want is either a catalogue of the kinds of patterns represented
by the first few harmonics, or a SIMPLE description of how I could
write a computer program that would reproduce these patterns.
For example, some astronomy books explain how the sun is constantly
oscillating, or ringing like a bell, and the fundamental harmonics of
these oscillations include:
bulging all over, then shrinking all over, alternately
bulging in the northern hemisphere while shrinking in the
southern, and the inverse, alternately
bulging at the equator and shrinking at the poles...
bulging at the equator and at a northern and southern "temperate"
latitude, while shrinking elsewhere...
etc, etc...
in other words, this set of harmonics defines a series of
alternating subdivisions by latitude. Another set of harmonics
defines a similar alternation by longitude, creating alternating
"orange slices". Combinations of these two modes of oscillation
produce checkerboard patterns, and so forth. Each of these patterns
corresponds to a particular temporal waveform or oscillation
frequency which therefore represents that pattern in a rotation
invariant manner. The bizarre patterns seen in electron orbitals are
yet another example of spherical harmonics. Does anybody know of a
reference that would list these different pattern types, or give
equations that are simple enough that I could type them into my
computer and generate families of these patterns myself? I have
neither the inclination nor the ability to plumb the depths of the
differential equations defining such systems; I only want to look at
the resulting patterns.
Can anybody out there help me?
- --
(O)((O))(((O)))((((O))))(((((O)))))(((((O)))))((((O))))(((O)))((O))(O)
(O)((O))((( slehar@park.bu.edu )))((O))(O)
(O)((O))((( Steve Lehar Boston University Boston MA )))((O))(O)
(O)((O))((( (617) 424-7035 (H) (617) 353-6741 (W) )))((O))(O)
(O)((O))(((O)))((((O))))(((((O)))))(((((O)))))((((O))))(((O)))((O))(O)
------------------------------
Subject: Oxford Connectionist Summer School
From: plunkett (Kim Plunkett) <@prg.ox.ac.uk:plunkett@dragon.psych>
Date: Thu, 11 Nov 93 18:43:43 +0000
UNIVERSITY OF OXFORD
MRC BRAIN AND BEHAVIOUR CENTRE
McDONNELL-PEW CENTRE FOR COGNITIVE NEUROSCIENCE
SUMMER SCHOOL ON CONNECTIONIST MODELLING
Department of Experimental Psychology
University of Oxford
11-23 September 1994
Applications are invited for participation in a 2-week
residential Summer School on techniques in connectionist
modelling of cognitive and biological phenomena. The course
is aimed primarily at researchers who wish to exploit neural
network models in their teaching and/or research. It will
provide a general introduction to connectionist modelling
through lectures and exercises on PCs. The instructors with
primary responsibility for teaching the course are Kim
Plunkett and Edmund Rolls.
No prior knowledge of computational modelling will be
required though simple word processing skills will be
assumed. Participants will be encouraged to start work on
their own modelling projects during the Summer School.
The Summer School is sponsored (jointly) by the University
of Oxford McDonnell-Pew Centre for Cognitive Neuroscience
and the MRC Brain and Behaviour Centre. The cost of
participation in the summer school is 500 pounds, which includes
accommodation (bed and breakfast at St. John's College) and
summer school registration. Participants will be expected to
cover their own travel and meal costs. A small number of
graduate student scholarships may be available. Applicants
should indicate whether they wish to be considered for a
graduate student scholarship but are advised to seek their
own funding as well, since in previous years the number of
graduate student applications has far exceeded the number of
scholarships available.
If you are interested in participating in the Summer School,
please contact:
Mrs. Sue King
Department of Experimental Psychology
University of Oxford
South Parks Road
Oxford OX1 3UD
Tel: (0865) 271353
Email: sking@uk.ac.oxford.psy
Please send a brief description of your background with an
explanation why you would like to attend the Summer School
(one page maximum) no later than 1 April 1994.
------------------------------
Subject: AI CD-ROM Revision 2 - press release
From: ncc@ncc.jvnc.net (R. Steven Rainwater)
Date: Sat, 13 Nov 93 17:48:32 -0600
[[ Editor's Note: I have edited this message heavily -- in fact, deleted
the press release. It is close to my threshold for a "commercial
announcement" (even if it *does* contain the archives of Neuron Digest), so I
will let interested readers get the table of contents files and/or
contact the poster. The official retail cost for this CD-ROM is $129,
though it seems that most of the contents are generally distributable
without license. -PM ]]
We actually started shipping the Rev.2 discs a couple of months ago, but due
to the number of orders already waiting to be filled, we didn't want to send
out press releases until we'd caught up. Anyway, things seem to be going
along fine at this point, so here's the press release! The hard copies are
going out to the trade mags at about the same time, so they'll probably
start showing up in a month or so.
For people wanting to see a complete listing of the CD's contents, look for
the file AICDROM2.ZIP at an ftp site near you. The file is also available
from the Compuserve AI forum, and the NCC dial-up BBS at 214-258-1832. It
contains the file listing, this press release, a couple of magazine reviews
of the disc, and other assorted information. Oh, and one last note regarding
pricing - customers have told us that Programmer's Paradise seems to have the
best pricing in the US on the disc...
- -Steve Rainwater
============================================================================
Network Cybernetics Corporation -- Press release for electronic distribution
============================================================================
ANNOUNCING THE AI CD-ROM REVISION 2
Network Cybernetics Corporation is now shipping the second annual revision
of their popular AI CD-ROM, an ISO-9660 format CD-ROM containing a wide
assortment of information on AI, Robotics, and other advanced machine
technologies. The AI CD-ROM contains thousands of programs, source code
collections, tutorials, research papers, Internet journals, and other
resources. The topics covered include artificial intelligence, artificial
life, robotics, virtual reality, and many related fields. Programs for OS/2,
DOS, Macintosh, UNIX, Amiga, and other platforms can be found on the disc.
The files have been collected from civilian and government research centers,
universities, Internet archive sites, BBS systems and other sources. The
CD-ROM is updated annually to keep it current with the latest trends and
developments in advanced machine technologies such as AI.
[[ the rest of this text deleted for brevity. -PM ]]
------------------------------
Subject: Adaptive Simulated Annealing (ASA) version 1.53
From: Lester Ingber <ingber@alumni.cco.caltech.edu>
Date: Sun, 14 Nov 93 03:15:46 -0800
========================================================================
Adaptive Simulated Annealing (ASA) version 1.53
To get on or off the ASA email list, just send an email to
asa-request@alumni.caltech.edu with your request.
________________________________________________________________________
Significant CHANGES since 1.43 (17 Sep 93)
In General Information below, note the changes in the names of the
files containing the ASA code in the archive. The original filename
was decided by Netlib; with the code now in the Caltech archive, more
standard conventions can be used.
In asa.c, INT_ALLOC and INT_LONG defines had to be corrected around a
few fprintf statements. This may affect some PCs.
On a Sun, gcc-2.5.0 update runs were performed. Since the change in
definitions of MIN_DOUBLE and MAX_DOUBLE, setting SMALL_FLOAT=E-12 does
not agree with SMALL_FLOAT=E-18 and SMALL_FLOAT=E-20 (the latter two
agree), unless MIN_DOUBLE=E-18 also is set. The results diverge when
the parameter temperatures get down to the range of E-12, limiting the
accuracy of the SMALL_FLOAT=1.0E-12 run.
Added TIME_STD Pre-Compile Option to use unix-standard macros for time
routines, as required by some systems, e.g., hpux.
The ASA_TEMPLATE and ASA_TEST OPTIONS were added to permit easier use
of templates and test examples. Setting ASA_TEST to TRUE will permit
running the ASA test problem. Searching user.c for ASA_TEST also
provides a guide to the user for additional code to add for his/her own
system. Keeping the default of ASA_TEST set to FALSE permits such
changes without overwriting the test example. There are several
templates that come with the ASA code, used to test several OPTIONS.
To permit use of these OPTIONS without having to delete these extra
tests, these templates are wrapped with ASA_TEMPLATE.
________________________________________________________________________
ASA-Related Papers
The following two papers have used ASA to solve some very difficult
imaging problems that did not yield to other global optimization
techniques.
%A G. Blais
%A M.D. Levine
%T Registering multiview range data to create 3D computer objects
%R TR-CIM-93-16
%I Center for Intelligent Machines, McGill University
%C Montreal, Canada
%D 1993
This paper is a large file, 1.7 MBytes gzip'd, and there may soon be an
ftp site available for its retrieval. You can contact Gerard Blais
<gblais@mcrcim.mcgill.edu> for further information.
%A K. Wu
%A M.D. Levine
%T 3-D object representation using parametric geons
%R TR-CIM-93-13
%I Center for Intelligent Machines, McGill University
%C Montreal, Canada
%D 1993
This paper can be retrieved via anonymous ftp as
object_representation.ps.gz from ftp.caltech.edu in directory
pub/ingber/.limbo. (Read file INDEX in pub/ingber to retrieve files
from .limbo.) Contact Wu Kenong <wu@mcrcim.mcgill.edu> for further
information.
I believe that files should be stored in archives related more to a
relevant discipline than to the computational tools used to derive
results. So, while I can document ASA-related papers and actively
search for ftp sites to hold electronic (p)reprints, I cannot guarantee
storage for them in the Caltech ASA archive. The problem is that often
it is difficult to find archives for large files, e.g., containing
scanned figures. If anyone has authority to house large files to be
made available via anonymous ftp, please let me know what disciplines
you are willing to support.
________________________________________________________________________
Wall Street Journal
The reference to the article in the WSJ that mentioned the wide-spread
use of the ASA code is M. Wofsey, "Technology: Shortcut Tests Validity
of Complicated Formulas," The Wall Street Journal, vol. CCXXII, no. 60,
p. B1, 24 September 1993.
As I stated in the last general update: I gave the WSJ examples of
some projects using ASA, but I had to insist that the relevant people
would have to be contacted prior to citing them. Of course, the
press has the last word on what they will publish/interpret.
________________________________________________________________________
General Information
The latest Adaptive Simulated Annealing (ASA) code and some related
(p)reprints in compressed PostScript format can be retrieved via
anonymous ftp from ftp.caltech.edu [131.215.48.151] in the pub/ingber
directory.
Interactively: ftp ftp.caltech.edu, [Name:] anonymous, [Password:]
your_email_address, cd pub/ingber, binary, ls or dir, get
file_of_interest, quit. The INDEX file contains an index of the other
files and information on getting gzip and unshar for UNIX, DOS and MAC
systems.
The latest version of ASA is ASA-x.y-shar.Z (x and y are version
numbers), linked to ASA-shar.Z. For the convenience of users who do
not have any uncompress/gunzip utility, there is a file ASA-shar which
is an uncompressed copy of ASA-x.y-shar.Z/ASA-shar.Z; if you do not
have sh or shar, you still can delete the first-column X's and separate
the files at the END_OF_FILE locations. For the convenience of some
users, there also is a current gzip'd tar'd version, ASA-x.y.tar.gz,
linked to ASA.tar.gz. There are patches ASA-diff-x1.y1-x2.y2.Z up to
the present version; these may be concatenated as required before
applying. Only current patches may be available. If you require a
specific patch that is not contained in the archive, contact
ingber@alumni.caltech.edu.
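The note that patches "may be concatenated as required before applying"
relies on patch(1) reading a stream of unified diffs and applying each in
turn (the real archive's .Z files would first be uncompressed with zcat or
gunzip). A self-contained sketch; the file names below mimic the
ASA-diff-x1.y1-x2.y2 convention but are invented for this demo:

```shell
# Demonstrate applying two incremental patches as one concatenated stream.
set -e
cd "$(mktemp -d)"
printf 'version 1.0\n' > asa.txt

# Build two incremental unified diffs, each headed as a patch to asa.txt.
printf 'version 1.1\n' > step1.txt
diff -u asa.txt step1.txt | sed 's|step1.txt|asa.txt|' > ASA-diff-1.0-1.1
printf 'version 1.2\n' > step2.txt
diff -u step1.txt step2.txt \
  | sed -e 's|step1.txt|asa.txt|' -e 's|step2.txt|asa.txt|' > ASA-diff-1.1-1.2

# Concatenate in version order and apply; patch handles each diff in turn.
cat ASA-diff-1.0-1.1 ASA-diff-1.1-1.2 | patch
cat asa.txt    # now at version 1.2
```

The only requirement is that the patches be concatenated in version order,
since each diff expects the file state the previous one produced.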
If you do not have ftp access, get information on the FTPmail service
by: mail ftpmail@decwrl.dec.com, and send only the word "help" in the
body of the message.
If any of the above are not possible, and if your mailer can handle
large files (please test this first), the code or papers you require
can be sent as uuencoded compressed files via electronic mail. If you
have gzip, resulting in smaller files, please state this.
Sorry, I cannot assume the task of mailing out hardcopies of code or
papers. My volunteer time assisting people with their queries on my
codes and papers must be limited to electronic mail correspondence.
Lester
========================================================================
|| Prof. Lester Ingber 1-800-L-INGBER ||
|| Lester Ingber Research Fax: [10ATT]0-700-L-INGBER ||
|| P.O. Box 857 EMail: ingber@alumni.caltech.edu ||
|| McLean, VA 22101 Archive: ftp.caltech.edu:/pub/ingber ||
------------------------------
Subject: Neuron Submission
From: "K. Mansell" <csp35@teach.cs.keele.ac.uk>
Date: Tue, 16 Nov 93 18:32:48 +0000
[[ Editor's Note: As readers know, I publish "jobs offered" but not "jobs
wanted". I've asked this fellow to rewrite his request, since it seemed
like an academic "internship." In the U.S., such positions are usually
volunteer, unpaid, and for a limited duration. Well, here it is... -PM ]]
I am a physics graduate in the UK, currently studying for an MSc in Machine
Perception and Neurocomputing. The second half of my course involves a 6
month project and I am in the process of contacting companies in the UK who
are working in the area of neural computing to see if they would be
agreeable to me working on a project for them.
The level of commitment/support provided by the company is at the discretion
of the company itself. The placement I am looking for would preferably be
at the company's site although it need not be paid work. I am interested
in the application of neural nets to real scientific/technical problems
and would rather not work on defence projects.
I would be grateful for any information (contacts, areas of work etc)
relating to companies working with neural networks in the UK who
might consider offering me a project placement. In particular, I am trying
to locate a company called Neural Computer Systems (Sciences?) who have so
far eluded my searching.
Kevin Mansell
<csp35@teach.cs.keele.ac.uk>
------------------------------
Subject: request for information
From: moody_k@orgella.com
Date: Tue, 16 Nov 93 16:50:24 -0500
Greetings!
I have just joined the neural digest mailing list, and I have
been excited to see so much activity. I am a graduate student
at the University of New Hampshire in electrical engineering,
and a relative newcomer to neural network technology.
I have been researching neural networks as related to control
systems, and have found many references to work done by
Narendra and Parthasarathy which was published in IEEE
transactions in March 1990, where they proposed combining
recurrent and multilayer networks, and training with
dynamic backpropagation.
I am wondering if anyone could recommend more recent references
which might indicate what these proposals have led to, and
what the present state of these techniques may be.
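For readers unfamiliar with the reference: the 1990 work combined recurrent
and multilayer networks and trained them with "dynamic backpropagation,"
which propagates sensitivity models forward through the recurrent loop. A
toy sketch of that idea for a one-unit recurrent model; the plant, model
form, and constants below are all illustrative, not taken from the paper:

```python
import numpy as np

def plant(y, u):
    # toy nonlinear plant: y(k+1) = 0.5*y(k)/(1 + y(k)^2) + u(k)
    return 0.5 * y / (1.0 + y * y) + u

def mse(w, n=200):
    # mean squared one-step prediction error of the model on a test input
    u = np.sin(0.1 * np.arange(n))
    y = yh = 0.0
    err = 0.0
    for k in range(n):
        y_next = plant(y, u[k])
        yh_next = w[0] * np.tanh(w[1] * yh + w[2] * u[k])
        err += (yh_next - y_next) ** 2
        y, yh = y_next, yh_next
    return err / n

def train(w, epochs=300, lr=0.05, n=200):
    u = np.sin(0.1 * np.arange(n))
    for _ in range(epochs):
        y = yh = 0.0
        s = np.zeros(3)            # sensitivities d yhat / d w
        g = np.zeros(3)            # accumulated gradient of squared error
        for k in range(n):
            z = w[1] * yh + w[2] * u[k]
            t = np.tanh(z)
            yh_next = w[0] * t
            # chain rule through the recurrence: z depends on w directly
            # and through the previous model state yh
            dz = w[1] * s + np.array([0.0, yh, u[k]])
            s = np.array([t, 0.0, 0.0]) + w[0] * (1 - t * t) * dz
            y_next = plant(y, u[k])
            g += 2.0 * (yh_next - y_next) * s
            y, yh = y_next, y_next * 0 + yh_next
        w = w - lr * g / n
    return w

w_init = np.array([0.5, 0.5, 0.5])
w_trained = train(w_init)
print(mse(w_init), mse(w_trained))  # the trained error should be smaller
```

The key line is the sensitivity update for s: the gradient of the model
state is carried forward in time rather than unrolled backward.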
Thank you in advance.
Kris Moody
moody_k@orgella.uucp
------------------------------
Subject: Academic position at USC
From: Jean-Marc Fellous <fellous@rana.usc.edu>
Date: Tue, 16 Nov 93 14:24:57 -0800
Could you please post this announcement ....
ASSISTANT/ASSOCIATE PROFESSOR
BIOMEDICAL ENGINEERING/NEUROSCIENCE
UNIVERSITY OF SOUTHERN CALIFORNIA
A tenure-track faculty position is available in the Department
of Biomedical Engineering at the University of Southern California.
This is a new position, created to strengthen the concentration of
neuroscience research within the Department. Applicants should be
capable of establishing an externally funded research program that
includes a rigorous, quantitative approach to functional aspects of
the nervous system. A combined theoretical and experimental approach
is preferred, though applicants with purely theoretical research
programs will be considered. Multiple opportunities for
interdisciplinary research are fostered by USC academic and research
programs such as the Biomedical Simulations Resource, the Program in
Neuroscience, and the Center for Neural Computing. Send curriculum
vitae, three letters of recommendation, and a description of current
and future research by January 1, 1994 to
Search Committee,
Department of Biomedical Engineering,
530 Olin Hall,
University of Southern California,
Los Angeles, CA 90089-1451.
------------------------------
Subject: Re: Whence cybernetics
From: Jim Brakefield <braker@ennex1.eng.utsa.edu>
Date: Wed, 17 Nov 93 09:46:49 -0800
The Sept. 93 issue of Connections mentions the Neuron Digest discussion
on AI/cybernetics history. I would like to comment.
I did a poster paper for IJCNN 1988 in which one topic was the
development of quantitative "scales" from qualitative concepts (i.e.,
like the emergence of a temperature scale from the three states of
matter). One example in the poster paper dealt with the emergence of two
scales:
The subject area is AI/connectionism/expert systems and the like. The
two scales are computation requirements and memory requirements. The
historical perspective is that early AI attempted to minimize both
computation and memory. The result was theorem proving (efficiency of
the symbolic and logic approach in both knowledge and calculation via
derivation of formulas). The next epoch was Expert Systems, which
allowed a larger knowledge data base (greater consumption of memory).
The current epoch of connectionism allows both large knowledge data and
extensive computation.
Thus the progression of "AI" paradigms has followed the historical
improvement in computer economy. The perspective is that of placing each
AI algorithm at a position in a two-dimensional space of computational
requirements and memory requirements. The early failure of the
"cybernetic" approach is thus a matter of its implementation
requirements being ahead of its time. One can wonder whether Minsky and
Papert's book was an emergent phenomenon resulting from the computer
economics of its day.
James C. Brakefield braker@ennex1.eng.utsa.edu
------------------------------
Subject: Vclamp and Cclamp programs?
From: ilya@cheme.seas.upenn.edu (Ilya Rybak)
Date: Wed, 17 Nov 93 20:55:10 -0500
Dear friends,
I would like to get the Vclamp and Cclamp programs for simulation of a
single neuron that were developed and are being distributed by John
Huguenard and David McCormick. I would be very thankful for information
(an e-mail address) about how to order these programs.
Ilya Rybak
ilya@cheme.seas.upenn.edu
------------------------------
Subject: Looking for a paper from some conference
From: "r.kelly" <DFCA4601G@UNIVERSITY-CENTRAL-ENGLAND.AC.UK>
Date: Thu, 18 Nov 93 17:06:38
Hello
I have been looking (unsuccessfully) for the following paper for some
time now :
D Nguyen, B Widrow, "Improving the learning speed of 2-layer neural
networks by choosing initial values of the adaptive weights",
International Conference of Neural Networks, July 1990
I don't have access to any Neural Network conference proceedings and
the ILL counter at my library is unable to track it down. Can anyone
tell me which International Conference is being referred to here?
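For readers hunting the same reference: the paper being sought introduced
what is now commonly called Nguyen-Widrow initialisation. A sketch of the
idea from memory (verify the details against the proceedings themselves):
each hidden unit's random weight vector is rescaled to a magnitude of
beta = 0.7 * H^(1/n), for H hidden units and n inputs.

```python
import numpy as np

def nguyen_widrow(n_in, n_hidden, seed=0):
    """Nguyen-Widrow-style initial weights for a layer (sketch from memory)."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(-1.0, 1.0, size=(n_hidden, n_in))
    beta = 0.7 * n_hidden ** (1.0 / n_in)
    # rescale each hidden unit's weight vector to length beta, spreading
    # the units' active regions evenly over the input range
    w *= beta / np.linalg.norm(w, axis=1, keepdims=True)
    return w

w = nguyen_widrow(4, 10)
print(np.linalg.norm(w, axis=1))  # every row has norm beta = 0.7 * 10**0.25
```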
Thanks,
Heather
Internet : dfca4601g%uk.ac.uce@nsf.ac.uk
Janet : dfca4601g@uk.ac.uce
------------------------------
End of Neuron Digest [Volume 12 Issue 17]
*****************************************
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #18 (conferences and CFP)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Fri, 19 Nov 93 19:02:53 EST
Message-Id: <10552.753753773@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Friday, 19 Nov 1993
Volume 12 : Issue 18
Today's Topics:
URGENT: DEADLINE CHANGE FOR WORLD CONGRESS
Colloquium on Advances in Neurocontrol
Wavelet Transform Short Course at UCLA
Int'l Conf. on Evolutionary Computation / PPSN-94
NIPS Workshop
10th Israeli IAICVNN Symposium, 27-28 Dec.
FINAL call for papers for CBMS 94
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: URGENT: DEADLINE CHANGE FOR WORLD CONGRESS
From: mwitten@HERMES.CHPC.UTEXAS.EDU
Date: Wed, 03 Nov 93 11:31:42 -0600
UPDATE ON DEADLINES
FIRST WORLD CONGRESS ON COMPUTATIONAL MEDICINE, PUBLIC
HEALTH, AND BIOTECHNOLOGY
24-28 April 1994
Hyatt Regency Hotel
Austin, Texas
- ----- (Feel Free To Cross Post This Announcement) ----
Due to a confusion in the electronic distribution of the congress
announcement and deadlines, as well as incorrect deadlines appearing in a
number of society newsletters and journals, we are extending the abstract
submission deadline for this congress to 31 December 1993. We apologize
to those who were confused over the differing deadline announcements and
hope that this change will allow everyone to participate. For congress
details:
To contact the congress organizers for any reason use any of the
following pathways:
ELECTRONIC MAIL - compmed94@chpc.utexas.edu
FAX (USA) - (512) 471-2445
PHONE (USA) - (512) 471-2472
GOPHER: log into the University of Texas System-CHPC and select the
Computational Medicine and Allied Health menu choice
ANONYMOUS FTP: ftp.chpc.utexas.edu
cd /pub/compmed94
(all documents and forms are stored here)
POSTAL:
Compmed 1994
University of Texas System CHPC
Balcones Research Center
10100 Burnet Road, 1.154CMS
Austin, Texas 78758-4497
SUBMISSION PROCEDURES: Authors must submit 5 copies of a single-page
50-100 word abstract clearly discussing the topic of their presentation.
In addition, authors must clearly state their choice of poster,
contributed paper, tutorial, exhibit, focused workshop or birds of a
feather group along with a discussion of their presentation. Abstracts
will be published as part of the preliminary conference material. To
notify the congress organizing committee that you would like to
participate and to be put on the congress mailing list, please fill out
and return the form that follows this announcement. You may use any of
the contact methods above. If you wish to organize a contributed paper
session, tutorial session, focused workshop, or birds of a feather group,
please contact the conference director at mwitten@chpc.utexas.edu . The
abstract may be submitted electronically to compmed94@chpc.utexas.edu or
by mail or fax. There is no official format.
If you need further details, please contact me.
Matthew Witten
Congress Chair
mwitten@chpc.utexas.edu
------------------------------
Subject: Colloquium on Advances in Neurocontrol
From: Rafal W Zbikowski <rafal@udcf.gla.ac.uk>
Date: Fri, 12 Nov 93 11:41:16 +0000
Contributed by: Ken Hunt <hunt@DBresearch-berlin.de>
CALL FOR PAPERS
---------------
IEE Colloquium on Advances in Neural Networks
for Control and Systems
26-27 May 1994
Daimler-Benz Systems Technology Research
Berlin, Germany
A colloquium on `Advances in neural networks for control and systems' is
being organised by the control committees of the Institution of
Electrical Engineers. This two-day meeting will be held on 26-27 May 1994
at Daimler-Benz Systems Technology Research in Berlin. The programme will
comprise a mix of invited papers and papers received in response to this
call. Invited speakers include leading international academic workers in
the field and major industrial companies who will present recent
applications of neural methods, and outline the latest theoretical
advances.
Neural networks have been seen for some years now as providing
considerable promise for application in nonlinear control and systems
problems. This promise stems from the theoretical ability of networks of
various types to approximate arbitrarily well continuous nonlinear
mappings.
The aim of this colloquium is to evaluate the state-of-the-art in this
very popular field from the engineering perspective. The colloquium will
cover both theoretical and applied aspects. A major goal of the workshop
will be to examine ways of improving the engineering involved in neural
network modelling and control, so that the theoretical power of learning
systems can be harnessed for practical applications. This includes
questions such as: which network architecture for which application? Can
constructive learning algorithms capture the underlying dynamics while
avoiding overfitting? How can we introduce a priori knowledge or models
into neural networks? Can experiment design and active learning be used
to automatically create 'optimal' training sets? How can we validate a
neural network model?
In line with this goal of better engineering methods, the colloquium will
also place emphasis on real industrial applications of the technology;
applied papers are most welcome.
Prospective authors are invited to submit three copies of a 500-word
abstract by Friday 25 February 1994 to Dr K J Hunt, Daimler-Benz AG,
Alt-Moabit 91 B, D-10559 Berlin, Germany (tel: + 49 30 399 82 275, FAX: +
49 30 399 82 107, E-mail: hunt@DBresearch-berlin.de).
------------------------------
Subject: Wavelet Transform Short Course at UCLA
From: "Watanabe, Nonie" <NWatanab@UNEX.UCLA.EDU>
Date: Fri, 12 Nov 93 17:42:00 -0800
Announcing a UCLA Extension Short Course...
Wavelet Transform: Techniques and Applications
March 7-11, 1994 at UCLA
OVERVIEW
For many years, the Fourier Transform (FT) has been used in a wide variety of
application areas, including multimedia compression of wideband ISDN for
telecommunications; lossless transforms for fingerprint storage,
identification, and retrieval; increased S/N ratios for target
discrimination in oil-prospecting seismic imaging; scale- and
rotation-invariant pattern recognition in automatic target recognition; and
heart, tumor, and biomedical research.
This course describes a new technique, the Wavelet Transform (WT), that is
replacing the windowed FT in the applications mentioned above. The WT uses
appropriately matched bandpass kernels, called mother wavelets, thereby
enabling improved representation and analysis of wideband, transient, and
noisy signals. The principal advantages of the WT are 1) its localized nature
which accepts less noise and enhances the SNR, and 2) the new problem-solving
paradigm it offers in the treatment of nonlinear problems. The course covers
WT principles as well as adaptive techniques, describing how WTs mimic human
ears and eyes by tuning up "best mothers" to spawn "daughter" wavelets that
catch multi-resolution components to be fed as expansion coefficients
through an artificial neural network, called a wavenet. This in turn
provides the automation required in multiple application areas, a powerful
tool when the inputs are constrained by real-time sparse data (for example,
the "cocktail party" effect, where you perceive a desired message amid the
cacophony of a noisy party).
Another advancement discussed in the course is the theory and experiment for
solving nonlinear dynamics for information processing; e.g., the
environmental simulation as a non-real-time virtual reality. In other words,
real-time virtual reality can be achieved by the wavelet compression
technique, followed by an optical flow technique to acquire those wavelet
transform coefficients, then applying the inverse WT to retrieve the virtual
reality dynamical evolution. (For example, an ocean wave is analyzed by
soliton envelope wavelets.)
Finally, implementation techniques in optics and digital electronics are
presented, including optical wavelet transforms and wavelet chips.
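The Haar transform on the course outline (Haar 1910, the simplest mother
wavelet) is small enough to write out. A sketch of one analysis/synthesis
level; this code is our own illustration, not course material:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: coarse averages plus detail differences."""
    x = np.asarray(x, dtype=float)
    s = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-pass)
    return s, d

def haar_inverse(s, d):
    """Exact inverse of haar_step: interleave reconstructed pairs."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2)
    x[1::2] = (s - d) / np.sqrt(2)
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
s, d = haar_step(x)
print(np.allclose(haar_inverse(s, d), x))   # perfect reconstruction: True
```

Recursing haar_step on the approximation s gives the full multi-level
transform in O(N) operations, the "order (N) complexity" the outline cites.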
COURSE MATERIALS
Course notes and relevant software are distributed on the first day of the
course. These notes are for participants only, and are not for sale.
COORDINATOR AND LECTURER
Harold Szu, PhD
Research Physicist, Washington, D.C. Dr. Szu's current research involves
wavelet transforms, character recognition, and constrained optimization
implementable on a superconducting optical neural network computer. He is
also involved with the design of a sixth-generation computer based on the
confluence of neural networks and new optical data base machines. Dr. Szu is
also a technical representative to DARPA and consultant to ONR on neural
networks and related research, and has been engaged in plasma physics and
optical engineering research for the past 16 years. He holds five patents,
has published about 100 technical papers, plus two textbooks. Dr. Szu is an
editor for the journal Neural Networks and currently serves as the President
of the International Neural Network Society.
LECTURER AND UCLA FACULTY REPRESENTATIVE
John D. Villasenor, PhD
Assistant Professor, Department of Electrical Engineering, School of
Engineering and Applied Science, UCLA. Dr. Villasenor has been instrumental
in the development of a number of efficient algorithms for a wide range of
signal and image processing tasks. His contributions include application-
specific optimal compression techniques for tomographic medical images,
temporal change measures using synthetic aperture radar, and motion
estimation and image modeling for angiogram video compression. Prior to
joining UCLA, Dr. Villasenor was with the Radar Science and Engineering
section of the Jet Propulsion Laboratory where he applied synthetic aperture
radar to interferometric mapping, classification, and temporal change
measurement. He has also studied parallelization of spectral analysis
algorithms and multidimensional data visualization strategies. Dr.
Villasenor's research activities at UCLA include still-frame and video
medical image compression, processing and interpretation of satellite remote
sensing images, development of fast algorithms for one- and two-dimensional
spectral analysis, and studies of JPEG-based hybrid video coding techniques.
DAILY SCHEDULE
Monday (Szu)
Introduction to Wavelet Transform (WT)
- - Formulation of small group projects using WT
--Theory, signal, image, sound, etc.
Review of WT
- - Historical: Haar 1910, Gabor 1942, Morlet 1985
- - Definition of WT
Applications: Principles by Dimensionality, Functionality
- - Signal processing: oil exploration, heart diagnosis
- - Image processing: lossless compression, fingerprint
- - Telecommunication: multimedia wideband ISDN
Discrete and Continuous Mathematics of WT
- - Example: Haar WT and Daubechies WT
- - Complexity Pyramid Theorem:
--Holy Grail: order (N) complexity for discrete WT
- - Connection with continuous WT
--Inverse CWT, Completeness Theorem
- - WT normalizations, causality conditions
Tuesday Morning (Villasenor)
Discrete Wavelet Transforms
- - Background: motivation, multiresolution analysis, Laplacian pyramid coding
- - Brief review of relevant digital signal processing concepts/notation
- - Discrete wavelet transforms in one dimension: conceptual background, QMF
filter banks, regularity, examples
Tuesday Afternoon (Villasenor and Szu)
Computer Laboratory Demonstration
- - Sound compression
- - Adaptive speech wavelet code
- - Image transforms using wavelets
Wednesday (Szu)
Adaptive Wavelet Transform
- - Practical examples: ears, eyes
- - Mathematics of optimization
- - Applications: cocktail party effect, hyperacuity paradox
Examples: Superposition Mother Wavelets
- - For phonemes
- - For speaker ID
- - For mine field
Nonlinear WT Applications: Soliton WT Kernel
- - Practical examples: ocean waves, Cauchy sea states
- - Paradigms for solving nonlinear dynamics
--FT paradigm: FT first & mode-mode coupling
--WT paradigm: NL solution as mother wavelet that "enjoys" linear
superposition
Thursday (Villasenor)
Discrete Wavelet Transforms II
- - Wavelet filter design: ensuring regularity, tradeoffs in filter length,
filter evaluation criteria
- - 2D wavelet transforms and applications: extension of wavelets to two
dimensions, computational and practical considerations
- - Image compression: techniques for coding of wavelet transforms, comparison
with JPEG, extension to video coding
- - Future trends in image processing using wavelets
Friday (Szu)
Comparison
- - Quadrature mirror filter vs. perfect inverse image filter
--Regularity
--Decimation
--Sampling theorem
WT Implementation Issues
- - Optical WT
--Real-time image compression and transmission
- - WT chips
--WT butterfly
Advanced Applications in WT
- - Virtual reality
- - Environmental representation: surveillance planning
- - Real-time techniques
--Wavelet compression
--Optical flow of WT coefficients
--Inverse WT
Problem-Solving Methodology
- - Four principles for creative research
Research Project Presentations
- - Signal processing groups
- - Image processing groups
- - Implementation groups
Date: March 7-11 (Monday through Friday)
Time: 8 am-5 pm (subject to adjustment after the first class meeting),
plus optional evening sessions, times to be determined.
Location: Room G-33 West, UCLA Extension Building, 10995 Le Conte Avenue
(adjacent to the UCLA campus), Los Angeles, California
Course No. Engineering 867.121
Fee: $1495, includes course materials
To reserve a place in this course and/or request an application form, call
the UCLA Extension Short Course Program Office at (310) 825-3344; FAX (310)
206-2815.
------------------------------
Subject: Int'l Conf. on Evolutionary Computation / PPSN-94
From: maenner@mp-sun1.informatik.uni-mannheim.de (Reinhard Maenner)
Date: Mon, 15 Nov 93 20:49:56 +0100
INTERNATIONAL CONFERENCE ON EVOLUTIONARY COMPUTATION
THE THIRD PARALLEL PROBLEM SOLVING FROM NATURE (PPSN III)
JERUSALEM, ISRAEL, OCTOBER 9-14, 1994
FIRST ANNOUNCEMENT AND CALL FOR PAPERS
CONFERENCE COMMITTEE:
Y. Davidor H.-P. Schwefel R. Maenner
Conference Chair Programme Co-Chair Programme Co-Chair
C/O Ortra Ltd. Universitaet Dortmund Universitaet Mannheim
P.O. Box 50432 Lehrstuhl Informatik XI Lehrstuhl fuer Informatik V
Tel Aviv 61500, D-44221 Dortmund, D-68131 Mannheim,
Israel Germany Germany
Tel.: +972-3-664 825 Tel.: +49-231-755 4590 Tel.: +49-621-292 5758
Fax: +972-3-660 952 Fax: +49-231-755 2450 Fax: +49-621-292 5756
schwefel@LS11.informatik. maenner@mp-sun1.informatik.
uni-dortmund.de uni-mannheim.de
PPSN STEERING COMMITTEE:
Y. Davidor (Israel) B. Manderick (The Netherlands)
K. De Jong (USA) H. Muhlenbein (Germany)
H. Kitano (Japan) H.-P. Schwefel (Germany)
R. Maenner (Germany)
The International Conference on Evolutionary Computation - The Third
Parallel Problem Solving from Nature (PPSN III) will be held in Jerusalem,
Israel between 9-14 October, 1994. This meeting will bring together an
international community from academia, government and industry interested
in algorithms suggested by the unifying theme of natural computation.
Natural computation is a common name for the design, theoretical and
empirical understanding of algorithms gleaned from nature. Characteristic
for natural computation is the metaphorical use of concepts, principles and
mechanisms underlying natural systems. Examples are genetic algorithms,
evolutionary programming and evolution strategies inspired by the evolutionary
processes of mutation, recombination, and natural selection in biology,
simulated annealing inspired by many-particle systems in physics, and
algorithms inspired by multi-cellular systems like neural and immune networks.
Topics of particular interest include, but are not limited to: evolution
strategies, evolutionary programming, genetic algorithms and classifier
systems, other forms of evolutionary computation, simulated annealing, neural
and immune networks, machine learning and optimization using these methods,
their relations to other learning paradigms, and mathematical description of
their behaviour.
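As a concrete instance of the mutation/recombination/selection loop these
topics share, here is a minimal genetic algorithm on the standard "one-max"
toy problem (maximize the number of 1-bits); all parameters are
illustrative:

```python
import random

random.seed(0)

def fitness(bits):
    return sum(bits)                       # one-max: count the 1s

def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                        # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            cut = random.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (random.random() < p_mut)     # bit-flip mutation
                     for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))   # close to the optimum of 20
```

Evolution strategies and evolutionary programming follow the same loop but
emphasise real-valued mutation over recombination.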
The conference programme committee will particularly welcome application
papers that use these techniques to solve real problems in manufacturing,
design, planning and engineering, provided these are of the highest level.
Application papers should either exhibit outstanding performance on
particular problems in contrast to other techniques, or address real
problems of significant and unique importance to science.
Five hard copies of original work on the related topics, typed in 12pt,
single column, with a maximum length of 10 pages including all figures and
references, should be sent to H.-P. Schwefel, programme co-chair, by March
1, 1994. One copy should contain the names of the authors, their
affiliations, and full addresses. The remaining four copies should be
anonymous, containing only the title and body of the paper including
figures and references. This procedure is adopted to enhance anonymous
peer review.
The conference will be held in a kibbutz 10 minutes from the Old City of
Jerusalem on top of the Judean mountain range overlooking Bethlehem and
Jerusalem. The conference programme will include visits to historical,
religious and contemporary monuments in Israel.
IMPORTANT DATES:
March 1, 1994 - Submission of full paper
May 30, 1994 - Notification to authors
July 1, 1994 - Submission of revised, final camera ready papers
GENERAL INFORMATION
Venue
The conference will be held at the Mitzpeh Rachel Kibbutz Congress Center on
the southern outskirts of Jerusalem, overlooking Bethlehem. A swimming pool
and tennis courts are on the premises and there is easy access by public
transportation to the center of Jerusalem.
Jerusalem is an excellent location for an international convention. Just 40
minutes from the Ben Gurion International Airport, Jerusalem offers a variety
of cultural and religious experiences that link its historic past to its
dynamic present.
History will come alive as you discover the shrines of the world's great
religions, stroll around the walls of the Old City, visit the reconstructed
city's main streets, and enjoy the extensive collections in Jerusalem's
numerous museums, which house amongst their treasures the Dead Sea Scrolls, the
Billy Rose sculpture garden, archaeological finds, calligraphy and other
works of art.
As the birthplace of Judaism and Christianity, and as one of Islam's holy
cities, Jerusalem is a captivating, uniquely significant city to millions of
people throughout the world. The conference will offer participants an
opportunity to combine a scientific gathering with the natural beauty of a
country that enjoys a pleasant Mediterranean climate and a unique city,
Jerusalem.
Language
The official language of the conference is English. All lectures, posters
and printed material will be in English.
Climate
The weather in October in Jerusalem is sunny and mild during the day. The
temperature is cooler in the evenings. Some rain is possible, but it is not
very likely.
Clothing
Informal for all occasions. Do not forget to pack a swimsuit, head covering,
sunglasses and comfortable walking shoes. A jacket or sweater is recommended
for evenings.
Visas
Participants from most countries do not require entry visas. If needed, visas
will be granted to all bona fide participants provided that application to the
local representative of Israel is made at least three months before arrival in
Israel.
Social Program
A special program and excursions are planned for the participants of the
conference and their accompanying persons.
Second Announcement
Further information and the second announcement will be mailed upon request.
Please advise your colleagues who may be interested in participating in the
conference.
Travel, Tours and Accommodation
The conference committee has appointed Ortra Ltd. as the official organizer
and travel agent of the conference. Rooms have been reserved at the Mitzpeh
Rachel hotel (conference venue). Ortra Ltd. will offer pre/post conference
tourist services. Further information will be published in the second
announcement.
PLEASE SUBMIT YOUR INTENTION FORM NO LATER THAN JANUARY 10, 1994
==========================================================================
CUT HERE
==========================================================================
INTERNATIONAL CONFERENCE ON EVOLUTIONARY COMPUTATION
The Third Parallel Problem Solving From Nature (PPSN III)
Jerusalem, Israel, October 9-14, 1994
(Please return to Ortra Ltd., P.O.Box 50432, Tel-Aviv 61500, Israel
or by Fax to 972-3-660952)
INTENTION FORM
Surname: First Name:
Institution:
Address: []Institution []Home (please indicate)
e-mail: Fax. No.
[] I intend to participate in the conference.
[] Please send me the second announcement.
[] I wish to present a paper on:
[] Please find attached names and addresses of colleagues who may be
interested in attending the conference.
Signature: Date:
------------------------------
Subject: NIPS Workshop
From: "Christopher G. Atkeson" <cga@ai.mit.edu>
Date: Tue, 16 Nov 93 15:08:16 -0500
NIPS*93 Workshop: Memory-based Methods for Regression and Classification
=================
Intended Audience: Researchers interested in memory-based methods, locality
in learning
==================
Organizers:
===========
Chris Atkeson Tom Dietterich Andrew Moore Dietrich Wettschereck
cga@ai.mit.edu tgd@cs.orst.edu awm@cs.cmu.edu wettscd@cs.orst.edu
Program:
========
Local, memory-based learning methods store all or most of the training
data and predict new points by analyzing nearby training points (e.g.,
nearest neighbor, radial-basis functions, local linear methods). The
purpose of this workshop is to determine the state of the art in
memory-based learning methods and to assess current progress on
important open problems. Specifically, we will consider such issues
as how to determine distance metrics and smoothing parameters, how to
regularize memory-based methods, how to obtain error bars on
predictions, and how to scale to large data sets. We will also
compare memory-based methods with methods (such as multi-layer
perceptrons) that construct global decision boundaries or regression
surfaces, and we will explore current theoretical models of local
learning methods. By the close of the workshop, we will have
assembled an agenda of open problems requiring further research.
This workshop meets both days. Friday will be devoted primarily to
classification tasks, and Saturday will be devoted primarily to
regression tasks.
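The memory-based idea the workshop describes can be shown in a few lines:
store the training set verbatim and answer a query from its nearest stored
neighbours. The function names and toy data below are our own illustration,
not the workshop's:

```python
import numpy as np

def knn_predict(X, y, q, k=3):
    """Predict at query q as the mean target of the k nearest stored points."""
    d = np.linalg.norm(X - q, axis=1)   # distance from q to every stored point
    nearest = np.argsort(d)[:k]
    return y[nearest].mean()

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))   # the "memory": all training inputs
y = np.sin(X[:, 0])                     # targets from a smooth function
print(knn_predict(X, y, np.array([1.0])))   # close to sin(1), about 0.84
```

Nearest-neighbour averaging is the simplest instance; local linear methods
and radial-basis functions instead fit a weighted model to the stored
points near the query, which is where the distance-metric and smoothing
questions above arise.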
Please send us email if you would like to present something.
Current schedule:
Friday:
7:30-7:45 Introduction (Dietterich)
7:45-8:15 Leon Bottou
8:15-8:30 Discussion
8:30-9:00 David Lowe
9:00-9:30 Discussion
4:30-5:00 Patrice Simard
5:00-5:15 Discussion
5:15-5:45 Dietrich Wettschereck
5:45-6:15 John Platt
6:15-6:30 Discussion
Saturday:
7:30-8:00 Trevor Hastie
8:00-8:15 Discussion
8:15-8:45 Doyne Farmer
8:45-9:00 Discussion
9:00-9:30 5-minute descriptions by other participants
4:30-5:00 Chris Atkeson
5:00-5:30 Frederico Girosi
5:30-6:00 Andrew Moore
6:00-6:30 Discussion
------------------------------
Subject: 10th Israeli IAICVNN Symposium, 27-28 Dec.
From: Yaakov Stein <stein@mars.fiz.huji.ac.il>
Date: Thu, 18 Nov 93 12:49:12 +0200
The 10th Israeli Symposium on Artificial Intelligence,
Computer Vision and Neural Networks will be held
December 27'th and 28'th, at the convention center in
Kfar Maccabiah, Ramat Gan, Israel.
The conference, sponsored by the Information Processing
Association of Israel (IPA), will offer over fifty talks
and 4 invited lectures. All talks will be written up in
published proceedings, which will be supplied to all
participants, and available for purchase after the conference.
Both symposium days will commence with a plenary session, attended by
all participants, and thereafter separate into three parallel tracks.
The neural network track includes sessions on
Speech and Signal Processing,
Optical Character Recognition,
Hardware Implementations,
Biological and Medical Applications,
Architectures and Learning,
Expert Systems and Pattern Recognition,
Image Processing
and a panel discussion on industry and academia interaction.
Special emphasis will be placed on bridging the gap between
university and industrial research groups.
The symposium registration fee includes the proceedings, lunch
and coffee breaks. A block of rooms has been reserved at Kfar
Maccabiah Hotel, and are available on a first-come first-served
basis. Registration forms are available from
10th IAICVNN Secretariat
IPA
Kfar Maccabiah
Ramat Gan 52109
ISRAEL
(Tel 972-3-6771292
Fax 972-3-5744374)
Yaakov Stein
------------------------------
Subject: FINAL call for papers for CBMS 94
From: wes@sailboard.medeng.wfu.edu
Date: Fri, 19 Nov 93 10:11:09 -0500
CBMS-94
Advance Notice and Call for Papers
Computers in Medicine--Two Conferences, one location
The Seventh IEEE Symposium on Computer-Based Medical Systems
Friday-Saturday, June 10 - 11, 1994 with tutorials Saturday evening
and Sunday morning
Stouffer Hotel, Winston-Salem, NC
and the
12th Conference for Computer Applications in Radiology
Monday-Wednesday, June 13-15, 1994 with tutorials on Sunday
CBMS Sponsors
*IEEE Computer Society *IEEE Engineering in Medicine and Biology
Society
*The Winston-Salem Section of the IEEE
* with local support by the Bowman Gray School of Medicine
The Symposium is intended for engineers and computer scientists in academia
and industry who are designing and developing Computer-Based Medical Systems
(CBMS). Biomedical engineers, computer scientists, medical residents,
physicians, and students who are working on medical projects that involve
computers are encouraged to submit papers describing their work.
The conference is run this year in coordination with the annual SCAR
(Society for Computer Application in Radiology) meeting, starting on Sunday,
June 13, at the Winston-Salem Civic Center, next door to the Stouffer.
CBMS attendees will therefore have the opportunity to combine two excellent
conferences in one trip.
The Program
CBMS combines technical papers, poster presentations, panel discussions,
tutorials and research laboratory tours. Papers covering the following
related areas are requested:
*Device Reliability and Safety: fault-tolerance, device testing,
validation and software safety
*Neural Networks and Expert Systems: theory, implementations,
pattern recognition, applications
*Image Processing and Analysis: registration, compression, enhancement,
restoration, reconstruction, hardware
*Prosthetic Devices: environmental control, word processing devices for
the hearing and vision impaired, standards
*Signal Processing: algorithms, hardware, real-time processing,
monitoring, EEG
*Cardiovascular Technologies: monitoring, imaging, bioimpedance
measurements, micro-computing, computer applications,
cardiopulmonary resuscitation
*Information Systems: RIS, HIS, PACS, networks, databases
*Clinical Assessment and Risk Evaluation: real-time signal processing,
database systems
Submission of Papers
Contributions in the forms of papers, poster sessions, software
demonstrations, and tutorials in the areas listed above are invited.
Paper summaries should be limited to two pages (typed, double-spaced) and
should include the title, names of authors, and the address and telephone
number of the corresponding author. Send four copies of your contributions to:
(Authors west of the Mississippi and in Asia) Nassrin Tavakoli, Info Enterprises,
3260 N. Colorado Street, Chandler, AZ 85225-1123; or (Authors east of the
Mississippi and in Europe) Paul Kizakevich, Research Triangle Institute,
PO Box 12194, Research Triangle Park, NC 27709.
Student Paper Contest
Student papers are invited and considered for the contest. Winners of the
contest will be selected by the Student Paper Contest Committee and awards
will be announced and presented at the symposium. Awards will consist of a
certificate and monetary prize as follows:
First Prize: $500; Second Prize: $300; Third Prize: $150.
To be eligible, the student must be the first author of an accepted paper,
and must present the paper at CBMS `94.
Deadlines and Key Dates
Paper summaries due: December 1, 1993 Notice of acceptance:
February 1, 1994
Camera ready papers due: March 15, 1994
_________________________________________
Wesley E. Snyder |
Professor of Electrical Engineering |
North Carolina State University |
Professor of Radiology |
Bowman Gray School of Medicine |
Email wes@relito.medeng.wfu.edu |
(919)-716-3908 |
_________________________________________|
------------------------------
End of Neuron Digest [Volume 12 Issue 18]
*****************************************
Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
0172; Fri, 19 Nov 93 20:29:36 EST
Received: from noc4.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
TCP; Fri, 19 Nov 93 20:29:32 EST
Received: from CATTELL.PSYCH.UPENN.EDU by noc4.dccs.upenn.edu
id AA29008; Fri, 19 Nov 93 20:28:58 -0500
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
id AA11245; Fri, 19 Nov 93 19:48:30 EST
Posted-Date: Fri, 19 Nov 93 19:47:46 EST
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #19 (CFP for WCNN '94 conference)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Fri, 19 Nov 93 19:47:46 EST
Message-Id: <11237.753756466@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Friday, 19 Nov 1993
Volume 12 : Issue 19
Today's Topics:
WCNN 1994 CALL FOR PAPERS - Submission
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: WCNN 1994 CALL FOR PAPERS - Submission
From: Morgan Downey <70712.3265@CompuServe.COM>
Date: 08 Nov 93 15:46:48 -0500
WORLD CONGRESS ON NEURAL NETWORKS 1994 (WCNN '94)
Annual Meeting of the International Neural Network Society
Town and Country Hotel - San Diego, California, USA - June 4-9, 1994
CALL FOR PAPERS
deadline December 10, 1993
WHY WCNN '94?
INNS has incorporated many of your suggestions from earlier IJCNNs and WCNNs to
create the best conference on neural networks to date. These include:
* A more comprehensive learning experience. The new 2-day INNS University Short
Course format provides courses twice as long and in-depth as any other tutorial
program.
* The latest advances in the field. New sessions in chaos, mathematics,
neuroprocessing and virtual reality, prediction, and AI plus special session on
some of the latest applications of neural networks.
* A better opportunity to attend the talks you want to hear. The invited talks
at WCNN '94 sessions will be staggered and will not run in competition with each
other. You will not have to choose between invited talks.
* More opportunities to present your work. WCNN '94 has opened up new categories
for paper submission to encourage your participation. Read the Call for Papers
below.
* The latest software and hardware applications. The new Neural Network
Industrial Exposition will be a dedicated event demonstrating the latest
products in neural networks and hosted by industry experts.
* Many other improvements based on your responses to past meetings.
________________________________________________________________________
CALL FOR PAPERS:
Papers must be received by December 10, 1993. The deadline will NOT be extended.
Authors must submit registration payment with papers to be eligible for the
early registration fee. Six copies (1 original, 5 copies) are required for
submission. Do not fold or staple originals. Six-page limit, in English; $20 per
page for papers exceeding six pages (do not number pages). Checks for overlength
charges should be made out to INNS and must be included with the submitted paper.
Papers must be: camera-ready on 8 1/2 inch x 11 inch white paper with 1 inch
margins on all sides, one column format, single spaced, in Times or similar type
style of 10 points or larger, one side of paper only. FAX's not acceptable. By
submitting a previously unpublished paper, author agrees to the transfer of the
copyright to INNS for the conference proceedings. Centered at top of first page
should be complete title, author name(s), affiliation(s), and mailing
address(es), followed by blank space, abstract (up to 15 lines) and text. The
following information MUST be included in an accompanying cover letter in order
for the paper to be reviewed: Full title of paper, corresponding author(s) and
presenting author name, address, telephone and fax numbers, Technical Session
(see session topics) 1st and 2nd choices, oral and poster presentation
preferred, audio visual requirements (for oral presentations only) and INNS
Members number. Papers to be sent to the address at the end of this
announcement, Attention: Program Chairs. INNS members will be allowed a chance
for revision. In order to be eligible for revision, your cover letter MUST
include your INNS member number.
All members of INNS up to the paper deadline may designate up to one paper with
themselves as an author for automatic acceptance. This guarantees only a poster
presentation and abstract publication for that paper.
INNS members and others may submit papers already accepted by journals, or
publicly accessible Tech reports, in lieu of a paper, for consideration as the
basis of an oral presentation. In such a case, the members must submit at least
three copies of the relevant paper, along with an abstract of the talk which
clearly cites the paper well enough to permit easy access to it. When such talks
are accepted, then only the abstracts will be published.
When to choose a poster presentation: more than 15 minutes are needed; author does
not wish to run concurrently with an invited talk; work is a continuation of a
previous publication; author seeks a lively discussion opportunity or to
solicit future collaboration. NOTE: An award for Best Poster will be made for
each session.
________________________________________________________________________
SCHEDULE:
Saturday, June 4, 1994 and Sunday, June 5, 1994
INNS UNIVERSITY SHORT COURSES
Monday, June 6, 1994
NEURAL NETWORK INDUSTRIAL EXPOSITION
RECEPTION
OPENING CEREMONY
Tuesday, June 7, 1994
PARALLEL SESSIONS
EXHIBITS
SPECIAL SESSION ON "BIOMEDICAL APPLICATIONS"
SPECIAL SESSION ON "COMMERCIAL AND INDUSTRIAL APPLICATIONS"
PLENARY 1: LOTFI ZADEH
PLENARY 2: PER BAK
Wednesday, June 8, 1994
PARALLEL SESSIONS
EXHIBITS
SPECIAL SESSION ON "FINANCIAL AND ECONOMIC APPLICATIONS"
PLENARY 1: BERNARD WIDROW
PLENARY 2: MELANIE MITCHELL
SPECIAL INTEREST GROUP (SIGINNS) SESSIONS
Thursday, June 9, 1994
PARALLEL SESSIONS
EXHIBITS
SPECIAL SESSION ON "NEURAL NETWORKS IN CHEMICAL ENGINEERING"
SPECIAL SESSION ON "MIND, BRAIN, AND CONSCIOUSNESS"
PLENARY 1: PAUL WERBOS
PLENARY 2: JOHN TAYLOR
Friday, June 10, 1994 and Saturday, June 11, 1994
SATELLITE MEETING: WNN/FNN 94 SAN DIEGO - NEURAL NETWORKS AND FUZZY LOGIC
Sponsoring Society: NASA (National Aeronautics and Space Administration)
Cooperating: INNS, Society for Computer Simulation, SPIE and all other
interested societies.
For more information contact: Mary Lou Padgett, Auburn University, 1165 Owens
Road, Auburn AL 36830 ph: 205-821-2472 or 3488; fax: 205-844-1809; e-mail:
mpadgett@eng.auburn.edu -- NASA Rep: Robert Savely, NASA/JSC
______________________________________________________________
WCNN 1994 ORGANIZING COMMITTEE:
Paul Werbos, Chair Harold Szu Bernard Widrow
Liaison to the European Neural Network Society: John G. Taylor
Liaison to the Japanese Neural Network Society: Kunihiko Fukushima
PROGRAM COMMITTEE:
Daniel Alkon
Shun-ichi Amari
James A. Anderson
Richard Andersen
Kaveh Ashenayi
Andrew Barto
David Brown
Horacio Bouzas
Gail Carpenter
David Casasent
Ralph Castain
Joel Davis
Judith Dayhoff
Guido DeBoeck
David Fong
Walter Freeman
Kunihiko Fukushima
Michael Georgiopoulos
Lee Giles
Stephen Grossberg
Dan Hammerstrom
Harold Hawkins
Robert Hecht-Nielsen
Takayuki Ito*
Akira Iwata*
Robert Jannarone
Jari Kangas*
Christof Koch
Teuvo Kohonen
Bart Kosko
Clifford Lau
Hwang Soo Lee*
Soo-Young Lee
George Lendaris
Daniel Levine
Alianna Maren
Kenneth Marko
Thomas McAvoy
Thomas McKenna
Larry Medsker
Erkki Oja
Robert Pap
Rich Peterson
V.S. Ramachandran*
Gerhardt Roth
David Rumelhart*
Mohammed Sayeh
Michael Shadlen
Dejan Sobajic
William Softky
Harold Szu
John Taylor
Brian Telfer
Shiro Usui
John Weinstein
Bernard Widrow
Takeshi Yamakawa
Lotfi Zadeh
Mona Zaghloul
*Invited. More individuals may be asked to participate.
COOPERATING SOCIETIES/INSTITUTIONS:
American Association for Artificial Intelligence
American Institute for Chemical Engineers
American Physical Society
Center for Devices and Radiological Health, US Food and Drug Administration
Cognitive Science Society
European Neural Network Society
International Fuzzy Systems Association
Japanese Neural Network Society
Korean Neural Network Society
US National Institute of Allergy and Infectious Diseases
US Office of Naval Research
Society for Manufacturing Engineers
SPIE - The International Society for Optical Engineering
Division of Cancer Treatment, US National Cancer Institute
________________________________________________________________________
SESSIONS AND CHAIRS:
1 Biological Vision ... S. Grossberg, V.S. Ramachandran*
Invited Talk: Stephen Grossberg - Recent Results in Biological Vision
2 Machine Vision ... K. Fukushima, R. Hecht-Nielsen
Invited Talk: Kunihiko Fukushima - Visual Pattern Recognition with Selective
Attention
Invited Talk: Robert Hecht-Nielsen - Foveal Active Vision: Methods, Results, and
Prospects
3 Speech and Language ... D. Rumelhart*, T. Peterson
4 Biological Neural Networks ... T. McKenna, J. Davis
Session One: From Biological Networks to Silicon
Invited Speakers: Frank Werblin, UC Berkeley
Richard Granger, UC Irvine
Theodore Berger, USC
Session Two: Real Neurons in Networks
Invited Speakers: Jim Schwaber, DuPont
Misha Mahowald, Oxford University
David Stenger, NRL
Session Three: Networks for Motor Control and Audition
Invited Speakers: Randy Beer, Case Western Reserve
University
Daniel Bullock, Boston University
Shihab Shamma, University of Maryland
Session Four: Learning and Cognition and Biological Networks
Invited Speakers: Mark Gluck, Rutgers University
Nestor Schmajuk, Northwestern University
Michael Hasselmo, Harvard University
5 Neurocontrol and Robotics ... A. Barto, K. Ashenayi
6 Supervised Learning ... G. Lendaris, S-Y. Lee
Invited Talk: George Lendaris - Apriori Knowledge and NN Architectures
Invited Talk: Soo-Young Lee - Error Minimization, Generalization, and Hardware
Implementability of Supervised Learning
7 Unsupervised Learning ... G. Carpenter, R. Jannarone
Invited Talk: Gail Carpenter - Distributed Recognition Codes and Catastrophic
Forgetting
Invited Talk: Robert Jannarone - Current Trends of Learning Algorithms
8 Pattern Recognition ... T. Kohonen, B. Telfer
Invited Talk: Teuvo Kohonen - Physiological Model for the Self-Organizing
Map
Invited Talk: Brian Telfer - Challenges in Automatic Object Recognition:
Adaptivity, Wavelets, Confidence
9 Prediction and System Identification ... P. Werbos, G. Deboeck
Invited Talk: Guido Deboeck - Neural, Genetic, and Fuzzy Systems for Trading
Chaotic Financial Markets
10 Cognitive Neuroscience ... D. Alkon, D. Fong
11 Links to Cognitive Science and Artificial Intelligence ... J. Anderson, L.
Medsker
Invited Talk: Larry Medsker - Hybrid Intelligent Systems: Research and
Development Issues
12 Neural Fuzzy Systems ... L. Zadeh, B. Kosko
13 Signal Processing ... B. Widrow, H. Bouzas
Invited Talk: Bernard Widrow - Nonlinear Adaptive Signal Processing
14 Neurodynamics and Chaos ... H. Szu, M. Zaghloul
Invited Talk: Walter Freeman - Biological Neural Network Chaos
Invited Talk: Harold Szu - Artificial Neural Network Chaos
15 Hardware Implementations ... C. Lau, R. Castain, M. Sayeh
Invited Talk: Clifford Lau - Challenges in Neurocomputers
Invited Talk: Mark Holler - High Performance Classifier Chip
16 Associative Memory ... J. Taylor, S. Usui*
Invited Talk: John G. Taylor - Where is Associative Memory Going?
Invited Talk: Shiro Usui - Review of Associative Memory
17 Applications ... D. Casasent, B. Pap, D. Sobajic
Invited Talk: David Casasent - Optical Neural Networks and Applications
Invited Talk: Yoh-Han Pao - Mathematical Basis for the Power of the
Functional-Link Net Approach: Applications to Semiconductor Processing
Invited Talk: Mohammed Sayeh - Advances in Optical Neurocomputers
18 Neuroprocessing and Virtual Reality ... L. Giles, H. Hawkins
Invited Talk: Harold Hawkins - Opportunities for Virtual Environment and
Neuroprocessing
19 Circuits and System Neuroscience ... J. Dayhoff, C. Koch
Invited Talk: Judith Dayhoff - Temporal Processing for Neurobiological Signal
Processing
Invited Talk: Christof Koch - Temporal Analysis of Spike Patterns in Monkeys and
Artificial Neural Networks
20 Mathematical Foundations ... S-I. Amari, D. Levine
Invited Talk: Shun-ichi Amari - Manifolds of Neural Networks and EM Algorithms
Additional session invited talks to be determined. Session invited talks will
not be scheduled to run concurrently at WCNN 1994. *Invited
INNS wishes to acknowledge the US Office of Naval Research for its generous
support of the Biological Neural Networks Session at WCNN 1994
________________________________________________________________________
NEW IN '94! INNS UNIVERSITY SHORT COURSES
INNS is proud to announce the establishment of the INNS University Short Course
format to replace the former tutorial program. The new 2-day, 4-hour per course
format provides twice the instruction with much greater depth and detail. There
will be six parallel tracks offered in three segments (morning, afternoon, and
evening each day). INNS reserves the right to cancel Short Courses and refund
payment should registration not meet the minimum number of persons required per
Short Course. [Dates and times are listed after each instructor; course
descriptions are available by contacting INNS.]
A. Teuvo Kohonen, Helsinki University of Technology - SATURDAY, JUNE 4, 1994,
6-10 PM
Advances in the Theory and Applications of Self-Organizing Maps
B. James A. Anderson, Brown University - SATURDAY, JUNE 4, 1994, 1-5 PM
Neural Network Computation as Viewed by Cognitive Science and Neuroscience
C. Christof Koch, California Institute of Technology - SUNDAY, JUNE 5, 1994,
6-10 PM
Vision Chips: Implementing Vision Algorithms with Analog VLSI Circuits
D. Kunihiko Fukushima, Osaka University - SATURDAY, JUNE 4, 1994, 1-5 PM
Visual Pattern Recognition with Neural Networks
E. John G. Taylor, King's College London - SUNDAY, JUNE 5, 1994, 1-5 PM
Stochastic Neural Computing: From Living Neurons to Hardware
F. Harold Szu, Naval Surface Warfare Center - SATURDAY, JUNE 4, 1994, 6-10 PM
Spatiotemporal Information Processing by Means of McCulloch-Pitts and Chaotic
Neurons
G. Shun-ichi Amari, University of Tokyo - SUNDAY, JUNE 5, 1994, 8 AM-12 PM
Learning Curves, Generalization Errors and Model Selection
H. Walter J. Freeman, University of California, Berkeley - SUNDAY, JUNE 5,
1994, 8 AM-12 PM
Review of Neurobiology: From Single Neurons to Chaotic Dynamics of Cerebral
Cortex
I. Judith Dayhoff, University of Maryland - SATURDAY, JUNE 4, 1994, 8 AM-12 PM
Neurodynamics of Temporal Processing
J. Richard A. Andersen, Massachusetts Institute of Technology - SATURDAY, JUNE
4, 1994, 6-10 PM
Neurobiologically Plausible Neural Networks
K. Paul Werbos, National Science Foundation - SUNDAY, JUNE 5, 1994, 1-5 PM
From Backpropagation to Real-Time Control
L. Bernard Widrow, Stanford University - SUNDAY, JUNE 5, 1994, 8 AM-12 PM
Adaptive Filters, Adaptive Controls,
Adaptive Neural Networks, and Applications
M. Gail Carpenter, Boston University - SUNDAY, JUNE 5, 1994, 8 AM-12 PM
Adaptive Resonance Theory
N. Takeshi Yamakawa, Kyushu Institute of Technology - SATURDAY, JUNE 4, 1994,
6-10 PM
What are the Differences and the Similarities Among Fuzzy, Neural, and Chaotic
Systems?
O. Stephen Grossberg, Boston University - SUNDAY, JUNE 5, 1994, 1-5 PM
Autonomous Neurodynamics: From Perception to Action
P. Lee Giles, NEC Research Institute - SATURDAY, JUNE 4, 1994, 8 AM-12 PM
Dynamically-Driven Recurrent Neural Networks: Models, Training Algorithms, and
Applications
Q. Alianna Maren, Accurate Automation Corporation - SATURDAY, JUNE 4, 1994, 1-5
PM
Introduction to Neural Network Applications
R. David Casasent, Carnegie Mellon University - SATURDAY, JUNE 4, 1994, 8 AM-12
PM
Pattern Recognition and Neural Networks
S. Per Bak, Brookhaven National Laboratory - SATURDAY, JUNE 4, 1994, 1-5 PM
Introduction to Self-Organized Criticality
T. Melanie Mitchell, Santa Fe Institute - SATURDAY, JUNE 4, 1994, 8 AM-12 PM
Genetic Algorithms, Theory and Applications
U. Lotfi A. Zadeh, University of California, Berkeley - SUNDAY, JUNE 5, 1994,
1-5 PM
Fuzzy Logic and Calculi of Fuzzy Rules and Fuzzy Graphs
V. Nikolay G. Rambidi, International Research Institute for Management Sciences
- SUNDAY, JUNE 5, 1994, 6-10 PM
Image Processing and Pattern Recognition Based on Molecular Neural Networks
________________________________________________________________________
PLENARIES:
1. Tuesday, June 7, 1994, 6-7 PM
Lotfi A. Zadeh, University of California, Berkeley
"Fuzzy Logic, Neural Networks, and Soft Computing"
2. Tuesday, June 7, 1994, 7-8 PM
Per Bak, Brookhaven National Laboratory
"Introduction to Self-Organized Criticality"
3. Wednesday, June 8, 1994, 6-7 PM
Bernard Widrow, Stanford University
"Adaptive Inverse Control"
4. Wednesday, June 8, 1994, 7-8 PM
Melanie Mitchell, Santa Fe Institute
"Genetic Algorithms: Why They Work and What They Can Do For You"
5. Thursday, June 9, 1994, 6-7 PM
Paul Werbos, US National Science Foundation
"Brain-Like Intelligence in Artificial Models: How Do We Really Get There?"
6. Thursday, June 9, 1994, 7-8 PM
John Taylor, King's College London
"Capturing What It Is Like to Be: Modelling the Mind with Neural Networks"
________________________________________________________________________
SPECIAL SESSIONS:
Special Session 1:
"Biomedical Applications of Neural Networks"
Tuesday, June 7, 1994
Co-sponsored by the National Institute of Allergy and Infectious Diseases, U.S.
NIH; the Division of Cancer Treatment, National Cancer Institute, U.S. NIH; and
the Center for Devices and Radiological Health, U.S. Food and Drug
Administration
Chairs:
David G. Brown, PhD
Center for Devices and Radiological Health, FDA
John Weinstein, MD, PhD
National Cancer Institute, NIH
This special session will focus on recent progress in applying neural networks
to biomedical problems, both in the research laboratory and in the clinical
environment. Applications moving toward early commercial implementation will be
highlighted, and working demonstrations will be given. The track will commence
with an overview session including invited presentations by Dr. John Weinstein,
NCI, NIH on the present status of biomedical applications research and by his
co-session chair Professor Shiro Usui, Toyohashi University, on biomedical
applications in Japan. A second morning session, chaired by Dr. Harry B. Burke,
MD, PhD, University of Nevada, and by Dr. Judith Dayhoff, PhD, University of
Maryland, will address neural networks for prediction
and other nonimaging applications. The afternoon session, chaired by Dr.
Maryellen L. Giger, PhD, University of Chicago, and Dr. Laurie J. Mango,
Neuromedical Systems, Inc., will cover biomedical image analysis/image
understanding applications. The final session, chaired by Dr. David G. Brown,
PhD, CDRH, FDA is an interactive panel/audience discussion of the promise and
pitfalls of neural network biomedical applications. Other prominent invited
speakers include Dr. Nozomu Hoshimiya of Tohoku University and Dr. Michael
O'Neill of the University of Maryland. Submissions of oral and/or poster
presentations are welcome to complement the invited presentations.
Special Session 2:
"Commercial and Industrial Applications of Neural Networks"
Tuesday, June 7, 1994
Co-sponsored by the Society for Manufacturing Engineers
Overall Chair: Bernard Widrow, Stanford University
This special session will be divided into four sessions of invited talks and
will place its emphasis on commercial and industrial applications working 24
hours a day and making money for their users. The sessions will be organized as
follows:
Morning Session 1
"Practical Applications of Neural Hardware"
Chair: Dan Hammerstrom, Adaptive Solutions, Portland, Oregon, USA
Morning Session 2
"Applications of Neural Networks in Pattern Recognition and Prediction"
Chair: Kenneth Marko, Ford Motor Company
Afternoon Session 1
"Applications of Neural Networks in the Financial Industry"
Chair: Ken Otwell, BehavHeuristics, College Park, Maryland, USA
Afternoon Session 2
"Applications of Neural Networks in Process Control and Manufacturing"
Chair: Tariq Samad, Honeywell
Special Session 3:
"Financial and Economic Applications of Neural Networks"
Wednesday, June 8, 1994
Chair: Guido J. Deboeck, World Bank
This special session will focus on the state of the art in financial and
economic applications. The track will be split into four sessions:
Morning Session 1
"Overview of Major Financial Applications of Neural Networks and Related
Advanced Technologies"
Morning Session 2
Presentation of Papers: Time-Series, Forecasting, Genetic Algorithms, Fuzzy
Logic, Non-Linear Dynamics
Afternoon Session 1
Product Presentations
Afternoon Session 2
Panel discussion on "Cost and Benefits of Advanced Technologies in Finance"
Invited speakers to be announced. Papers submitted to regular sessions may
receive consideration for this special session.
Special Session 4:
"Neural Networks in Chemical Engineering"
Thursday, June 9, 1994
Co-sponsored by the American Institute of Chemical Engineers
Chair: Thomas McAvoy, University of Maryland
This special session on neural networks in the chemical process industries will
explore applications to all areas of the process industries including process
modelling, both steady state and dynamic, process control, fault detection, soft
sensing, sensor validation, and business examples. Contributions from both
industry and academia are being solicited.
Special Session 5:
"Mind, Brain and Consciousness"
Thursday, June 9, 1994
Session Chair: John Taylor, King's College London
Session Co-chair: Walter Freeman, University of California, Berkeley
Session Committee: Stephen Grossberg, Boston University and Gerhardt Roth,
Brain Research Institute
Invited Speakers include S. Grossberg, P. Werbos, G. Roth, B. Libet, J. Taylor
Consciousness and inner experience have suddenly emerged as the centre of
activity in psychology, philosophy, and neurobiology. Neural modelling is
proceeding apace in this subject. Contributors from all areas are now coming
together to move rapidly towards a solution of what might be regarded as one of
the deepest problems of human existence. Neural models, and their constraints,
will be presented in the session, with an assessment of how far we are from
building a machine that can see the world the way we do.
________________________________________________________________________
SPECIAL INTEREST GROUP (SIGINNS) SESSIONS
INNS Special Interest Groups have been established for interaction between
individuals with interests in various subfields of neural networks as well as
within geographic areas. Several SIGs have tentatively planned sessions for
Wednesday, June 8, 1994 from 8 - 9:30 PM:
Automatic Target Recognition - Brian Telfer, Chair
A U.S. Civilian Neurocomputing Initiative - Andras Pellionisz, Chair
Control, Robotics, and Automation - Kaveh Ashenayi, Chair
Electronics/VLSI - Ralph Castain, Chair
Higher Level Cognitive Processes - John Barnden, Chair
Hybrid Intelligence - Larry Medsker, Chair
Mental Function and Dysfunction - Daniel Levine, Chair
Midwest US Area - Cihan Dagli, Chair
Power Engineering - Dejan Sobajic, Chair
________________________________________________________________________
NEW IN '94!
NEURAL NETWORK INDUSTRIAL EXPOSITION
The State-of-the-Art in Advanced Technological Applications
MONDAY JUNE 6, 1994 - 8 AM to 9 PM
SPECIAL EXPOSITION-ONLY REGISTRATION AVAILABLE: $55
* Dedicated to Currently Available Commercial Applications of Neural Nets &
Related Technologies
* Commercial Hardware and Software Product Demos
* Poster Presentations
* Panel Conclusions Led by Industry Experts
* Funding Panel
EXPOSITION CHAIR -- Takeshi Yamakawa, Kyushu Institute of Technology
CHAIRS --
Hardware:
Dan Hammerstrom, PhD, Adaptive Solutions, Inc.
Takeshi Yamakawa, Kyushu Institute of Technology
Robert Pap, Accurate Automation Corporation
Software:
Dr. Robert Hecht-Nielsen, HNC, Inc.
Casimir C. Klimasauskas, Co-founder, NeuralWare, Inc.
John Sutherland, Chairman and VP Research, AND America, Ltd.
Asian Liaison:
Soo-Young Lee, KAIST
European Liaison:
Pierre Martineau, Martineau and Associates
Plus:
NEURAL NETWORK CONTEST
with $1500 GRAND PRIZE
chaired by Bernard Widrow, Harold Szu, and Lotfi Zadeh
with a panel of distinguished judges!
EXPOSITION SCHEDULE
Morning Session: Hardware
8-11 AM Product Demonstration Area and Poster Presentations
11 AM-12 PM Panel Conclusion
Afternoon Session: Software
1-4 PM Product Demonstration Area and Poster Presentations
4-5 PM Panel Conclusion
Evening Session
6-8 PM Neural Network Contest
8-9 PM Funding Panel
HOW TO PARTICIPATE: To demonstrate your hardware or software product contact
James J. Wesolowski, 202-466-4667; (fax) 202-466-2888. For more information on
the neural network contest, indicate your interest on the registration form.
Further information and contest rules will be sent to all interested parties!
Deadline for contest registration is March 1, 1994.
_______________________________________________________________________
FEES at WCNN 1994
REGISTRATION FEE (includes all sessions, plenaries, proceedings, reception, and
Industrial Exposition. Separate registration for Short Courses.)
- INNS Members: US$195 - US$395
- Non Members: US$295 - US$495
- Full Time Students: US$85 - US$135
- Spouse/Guest: US$35 - US$55
SHORT COURSE FEE (Pay for 2 short courses, get the third FREE)
- INNS Members: US$225 - US$275
- Non Members: US$275 - US$325
- Full Time Students: US$125 - US$150
Neural Network Industrial Exposition ONLY: US$55
CONFERENCE HOTEL: Town and Country Hotel (same site as conference)
- Single: US$70 - US$95
- Double: US$80 - US$105
TRAVEL RESERVATIONS: Executive Travel Associates (ETA) has been selected as the
official travel company for the World Congress on Neural Networks. ETA offers
the lowest available fares on any airline at time of booking when you contact
them at US phone number 202-828-3501 or toll free (in the US) at 800-562-0189
and identify yourself as a participant in the Congress. Flights booked on
American Airlines, the official airline for this meeting, will result in an
additional discount. Please provide the booking agent you use with the code:
Star #S0464FS
________________________________________________________________________
TO RECEIVE CONFERENCE BROCHURES AND REGISTRATION FORMS, HOTEL ACCOMMODATION
FORMS, AND FURTHER CONGRESS INFORMATION, CONTACT THE INTERNATIONAL NEURAL
NETWORK SOCIETY AT:
International Neural Network Society
1250 24th Street, NW
Suite 300
Washington, DC 20037 USA
phone: 202-466-4667
fax: 202-466-2888
e-mail: 70712.3265@compuserve.com
________________________________________________________________________
HELP INNS SPREAD THE WORD ABOUT THE 1994 WORLD CONGRESS ON NEURAL NETWORKS.
PLEASE COPY AND POST THIS ANNOUNCEMENT ON OTHER BULLETIN BOARDS AND LISTS WHICH
HAVE AN AUDIENCE THAT YOU FEEL SHOULD KNOW ABOUT WCNN 1994! YOUR EFFORTS ARE
GREATLY APPRECIATED!
------------------------------
End of Neuron Digest [Volume 12 Issue 19]
*****************************************