Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
6577; Fri, 19 Feb 93 01:31:27 EST
Received: from noc2.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
TCP; Fri, 19 Feb 93 01:31:24 EST
Received: from CATTELL.PSYCH.UPENN.EDU by noc2.dccs.upenn.edu
id AA16068; Fri, 19 Feb 93 01:28:48 -0500
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
id AA28888; Fri, 19 Feb 93 00:23:09 EST
Posted-Date: Fri, 19 Feb 93 00:22:32 EST
From: "Neuron-Digest Moderator" <neuron-request@cattell.psych.upenn.edu>
To: Neuron-Distribution:;
Subject: Neuron Digest V11 #12 (mailing lists & lots of discussion)
Reply-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
X-Errors-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
Organization: University of Pennsylvania
Date: Fri, 19 Feb 93 00:22:32 EST
Message-Id: <28852.730099352@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Friday, 19 Feb 1993
Volume 11 : Issue 12
Today's Topics:
BIOSCI/bionet Frequently Asked Questions
mailing list for cognitive neuroscientists
NATO ASI: March 5 Deadline Approaching
VLSI NN
Re: "neural-net based software for digitization of maps"
Applications for particle track segment detection??
Pattern Recognition
Re: chip design
neural net applications to fixed-income security markets
Room sharing for ICNN
pattern recognition (practical database considerations)?
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: BIOSCI/bionet Frequently Asked Questions
From: Dave Kristofferson <kristoff@net.bio.net>
[[ Editor's Note: This contains the information about the Neuroscience
list which many of you readers have asked about. The entire file is
about 48K long, so I've heavily edited the following with enough
information about the Neuroscience list for you. If you (or colleagues)
are biologically oriented (e.g., genetic sequencing, molecular biology,
tropical ecology, etc), I highly recommend the BIOSCI/bionet resources.
Consider browsing through the newsgroups or archives. -PM ]]
BIOSCI/bionet Frequently Asked Questions (FAQ)
----------------------------------------------
(last revised - 1/15/93)
This document describes the general purpose and uses of the
BIOSCI/bionet newsgroups and provides details on how to participate in
these forums. It is available for anonymous FTP from net.bio.net
[134.172.2.69] in pub/BIOSCI/biosci.FAQ. This document may also be
requested by e-mail to biosci@net.bio.net (use plain English - this is
not a server address). It is posted the first of each month to the
BIONEWS/bionet.announce newsgroup along with the BIOSCI information
sheet and the list of changes to the newsgroups during the preceding
month. The FAQ is also posted monthly to the USENET newsgroup
news.answers and is archived along with other USENET newsgroup FAQs at
pit-manager.mit.edu [18.172.1.27].
[[...]]
Dissemination is by normal electronic mail and also over USENET in the
form of the "bionet" newsgroups (see below for USENET details). The
contents of the electronic mail distribution is identical to the USENET
news distribution, but we encourage BIOSCI users to access the system
through USENET news software whenever possible. E-mail distributions may
eventually be phased out. As of October 1992, 59% of our readers used
USENET news software instead of e-mail.
[[...]]
Two versions of the BIOSCI info sheet are available, one for the Americas
and the Pacific Rim countries, and the second for Europe, Africa, and
Central Asia. The former may be requested by e-mail to
biosci@net.bio.net, while the latter may be requested from
biosci@daresbury.ac.uk.
[[...]]
How do I request or cancel e-mail subscriptions to BIOSCI newsgroups?
---------------------------------------------------------------------
If you have access to USENET news software, then YOU DO NOT NEED AN
E-MAIL SUBSCRIPTION! Only those people who need to receive postings
by e-mail must request to be added to the mailing lists. USENET users
can simply read the various bionet newsgroups using their news
software. If your site has USENET news but does not get the bionet
newsgroups, please request help by sending a message to
biosci@net.bio.net.
For those who need e-mail subscriptions or who want to cancel current
e-mail subscriptions, please send a request to one of the following
addresses. Please choose the site that serves your location. Simply
pick the newsgroup(s) from the list above that you wish to subscribe
to and request that your address be added to the chosen mailing lists.
Please use plain English; no special message syntax is required in
your subscription or cancellation request.
Address                      Serving
-------                      -------
biosci@net.bio.net           The Americas and Pacific Rim
biosci@daresbury.ac.uk       Europe, Africa, and Central Asia
****If you are changing e-mail addresses****, please be sure to send a
message to your appropriate biosci address above and request that your
subscriptions be changed or canceled!!
How can I get a list of newsgroups or my subscriptions?
-------------------------------------------------------
As with any other subscription correspondence, simply send a request
to your appropriate BIOSCI distribution site:
Address                      Serving
-------                      -------
biosci@net.bio.net           The Americas and Pacific Rim
biosci@daresbury.ac.uk       Europe, Africa, and Central Asia
The most recent list of BIOSCI newsgroups/mailing addresses and the
latest revision of the BIOSCI/bionet FAQ are posted the first of each
month on the BIONEWS/bionet.announce newsgroup. You should save these
postings for future reference.
[[...]]
MAILING LIST NAME            USENET Newsgroup Name
-----------------            ---------------------
AGEING                       bionet.molbio.ageing
AGROFORESTRY                 bionet.agroforestry
ARABIDOPSIS                  bionet.genome.arabidopsis
BIOFORUM                     bionet.general
BIO-INFORMATION-THEORY +     bionet.info-theory
BIONAUTS                     bionet.users.addresses
BIONEWS **                   bionet.announce
BIO-JOURNALS                 bionet.journals.contents
BIO-MATRIX                   bionet.molbio.bio-matrix
BIO-SOFTWARE                 bionet.software
CHROMOSOME-22                bionet.genome.chrom22
COMPUTATIONAL-BIOLOGY **     bionet.biology.computational
EMBL-DATABANK                bionet.molbio.embldatabank
EMPLOYMENT                   bionet.jobs
GDB                          bionet.molbio.gdb
GENBANK-BB                   bionet.molbio.genbank
GENETIC-LINKAGE              bionet.molbio.gene-linkage
HIV-MOLECULAR-BIOLOGY        bionet.molbio.hiv
HUMAN-GENOME-PROGRAM         bionet.molbio.genome-program
IMMUNOLOGY                   bionet.immunology
JOURNAL-NOTES                bionet.journals.note
METHODS-AND-REAGENTS         bionet.molbio.methds-reagnts
MOLECULAR-EVOLUTION          bionet.molbio.evolution
NEUROSCIENCE                 bionet.neuroscience
PLANT-BIOLOGY                bionet.plants
POPULATION-BIOLOGY           bionet.population-bio
PROTEIN-ANALYSIS             bionet.molbio.proteins
PROTEIN-CRYSTALLOGRAPHY      bionet.xtallography
SCIENCE-RESOURCES            bionet.sci-resources
TROPICAL-BIOLOGY             bionet.biology.tropical
VIROLOGY                     bionet.virology
WOMEN-IN-BIOLOGY             bionet.women-in-bio
+ full name is BIOLOGICAL-INFORMATION-THEORY-AND-CHOWDER-SOCIETY
** Note that newsgroups flagged with ** are moderated, i.e., postings
are directed to a moderator (editor) who later forwards messages
(possibly edited or condensed) to the newsgroup.
NEWSGROUP NAME               TOPIC
--------------               -----
AGEING                       Discussions about ageing research
AGROFORESTRY                 Discussions about agroforestry research
ARABIDOPSIS                  Newsgroup for the Arabidopsis Genome Project
BIOFORUM                     Discussions about biological topics for which
                             there is not yet a dedicated newsgroup
BIOLOGICAL-INFORMATION-
THEORY-AND-CHOWDER-SOCIETY   Applications of information theory to biology
BIONAUTS                     Question/answer forum for help using electronic
                             networks, locating e-mail addresses, etc.
BIONEWS **                   General announcements of widespread interest
                             to biologists
BIO-JOURNALS                 Tables of Contents of biological journals
BIO-MATRIX                   Applications of computers to biological databases
BIO-SOFTWARE                 Information on software for the biological
                             sciences
CHROMOSOME-22                Mapping and Sequencing of Human Chromosome 22
COMPUTATIONAL-BIOLOGY **     Mathematical and computer applications in biology
EMBL-DATABANK                Messages to and from the EMBL database staff
EMPLOYMENT                   Job opportunities
GDB                          Messages to and from the Genome Data Bank staff
GENBANK-BB                   Messages to and from the GenBank database staff
GENETIC-LINKAGE              Newsgroup for genetic linkage analysis
HIV-MOLECULAR-BIOLOGY        Discussions about the molecular biology of HIV
HUMAN-GENOME-PROGRAM         NIH-sponsored newsgroup on human genome issues
IMMUNOLOGY                   Discussions about research in immunology
JOURNAL-NOTES                Practical advice on dealing with professional
                             journals
METHODS-AND-REAGENTS         Requests for information and lab reagents
MOLECULAR-EVOLUTION          Discussions about research in molecular evolution
NEUROSCIENCE                 Discussions about research in the neurosciences
PLANT-BIOLOGY                Discussions about research in plant biology
POPULATION-BIOLOGY           Discussions about research in population biology
PROTEIN-ANALYSIS             Discussions about research on proteins and
                             messages for the PIR and SWISS-PROT databank
                             staffs.
PROTEIN-CRYSTALLOGRAPHY      Discussion about crystallography of macromolecules
                             and messages for the PDB staff
SCIENCE-RESOURCES            Information from/about scientific funding
                             agencies
TROPICAL-BIOLOGY             Discussions about research in tropical biology
VIROLOGY                     Discussions about research in virology
WOMEN-IN-BIOLOGY             Discussions about issues concerning women
                             biologists
** Note that newsgroups flagged with ** are moderated, i.e., postings
are directed to a moderator (editor) who later forwards messages
(possibly edited or condensed) to the newsgroup.
------------------------------
Subject: mailing list for cognitive neuroscientists
From: kpc@pluto.arc.nasa.gov (k p c)
Organization: NASA Ames Research Center AI Research Branch; Sterling.
Date: 27 Oct 92 01:26:08 +0000
[[ Editor's Note: Here is another mailing list of interest to some. I
attempted to join, but have received nothing. I sent a new message a
couple of days ago, still without reply. I therefore cannot personally
vouch for this list's existence. -PM ]]
ANNOUNCEMENT OF THE COGNEURO (COGNITIVE NEUROSCIENCE) MAILING LIST
SUBJECT
this list is an informal, intentionally low-volume way to discuss
matters at the interface of cognitive science and neuroscience.
the discussion will be scientific and academic, covering biological aspects
of behavior and cognitive issues in neuroscience. also discussable are
curricula, graduate programs, and jobs in the field.
HOW TO USE THE LIST
please follow these examples exactly, so that my software works.
to SUBSCRIBE, send mail like this.
To: cogneuro-request@ptolemy.arc.nasa.gov
Subject: cogneuro: subscribe
to UNSUBSCRIBE, send mail like this.
To: cogneuro-request@ptolemy.arc.nasa.gov
Subject: cogneuro: unsubscribe
you don't need to put anything in the body of the message. there will be
no automatic confirmation, but you might get a note from me.
to CHANGE YOUR EMAIL ADDRESS (also very polite to do if you know that your
MACHINE WILL GO DOWN for a while, or in case you LEAVE THE NET) simply
unsubscribe from your old address and resubscribe from your new address.
this prevents error messages and prevents me from having to verify your
address manually.
to POST (send a message to everybody on the list), send mail to
cogneuro@ptolemy.arc.nasa.gov, or followup to an existing message.
e.g.
To: cogneuro@ptolemy.arc.nasa.gov
Subject: corpus callosum
to ask a METAQUESTION, send it to cogneuro-request@ptolemy.arc.nasa.gov.
suggestions for improving this announcement or the list are welcome.
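for those who script their mail, here is a minimal sketch of a subscribe
request in the exact format above (an editorial illustration only; python,
the sender address, and the smtp host are assumptions, not part of this
announcement):
# editorial sketch: compose the exact subscribe message the list
# software expects. the sender address and smtp host are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "you@example.edu"                       # placeholder
msg["To"] = "cogneuro-request@ptolemy.arc.nasa.gov"
msg["Subject"] = "cogneuro: subscribe"                # parsed by the software
# the body may stay empty; the subject line alone drives the request.
with smtplib.SMTP("localhost") as smtp:               # placeholder host
    smtp.send_message(msg)
the same pattern with subject "cogneuro: unsubscribe" covers cancellation.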
GUIDELINES
the language of the list is english.
the list is meant to be low in volume and high in s/n ratio. since
cogneuro is a huge field, submissions shouldn't be too off-topic or
otherwise not essentially scientific or academic.
controversy and speculation are welcome, as are lack of controversy and
rigor. since the emphasis is scientific and academic, participants are
expected to be extremely tolerant of other participants' opinions and
choice of words.
the list is initially open to anybody who is interested. although i don't
expect ever to need to exercise it, i reserve the right to remove anybody
from the list if there are problems. i want to keep a spirit of free
exchange of cognitive neuroscience.
other than this, the list is unmoderated and informal.
------------------------------
Subject: NATO ASI: March 5 Deadline Approaching
From: John Moody <moody@chianti.cse.ogi.edu>
Date: Thu, 04 Feb 93 17:38:08 -0800
As the March 5th application deadline is now four weeks away, I am
posting this notice again.
NATO Advanced Studies Institute (ASI) on
Statistics and Neural Networks
June 21 - July 2, 1993, Les Arcs, France
Directors:
Professor Vladimir Cherkassky, Department of Electrical Eng., University of
Minnesota, Minneapolis, MN 55455, tel (612)625-9597, fax (612)625-4583,
email cherkass@ee.umn.edu
Professor Jerome H. Friedman, Statistics Department, Stanford University,
Stanford, CA 94309, tel (415)723-9329, fax (415)926-3329, email
jhf@playfair.stanford.edu
Professor Harry Wechsler, Computer Science Department, George Mason
University, Fairfax, VA 22030, tel (703)993-1533, fax (703)993-1521, email
wechsler@gmuvax2.gmu.edu
List of invited lecturers: I. Alexander, L. Almeida, A. Barron, A. Buja,
E. Bienenstock, G. Carpenter, V. Cherkassky, T. Hastie, F. Fogelman, J.
Friedman, H. Freeman, F. Girosi, S. Grossberg, J. Kittler, R. Lippmann,
J. Moody, G. Palm, R. Tibshirani, H. Wechsler, C. Wellekens
Objective, Agenda and Participants: Nonparametric estimation is a problem
of fundamental importance for many applications involving pattern
classification and discrimination. This problem has been addressed in
Statistics, Pattern Recognition, Chaotic Systems Theory, and more
recently in Artificial Neural Network (ANN) research. This ASI will bring
together leading researchers from these fields to present an up-to-date
review of the current state of the art, to identify fundamental concepts
and trends for future development, to assess the relative advantages and
limitations of statistical vs neural network techniques for various
pattern recognition applications, and to develop a coherent framework for
the joint study of Statistics and ANNs. Topics range from theoretical
modeling and adaptive computational methods to empirical comparisons
between statistical and neural network techniques. Lectures will be
presented in a tutorial manner to benefit the participants of ASI. A
two-week programme is planned, complete with lectures,
industrial/government sessions, poster sessions and social events. It is
expected that over seventy students (who may be researchers or
practitioners at the post-graduate or graduate level) will attend, drawn
from each NATO country and from Central and Eastern Europe. The
proceedings of ASI will be published by Springer-Verlag.
Applications: Applications for participation at the ASI are sought.
Prospective students, industrial or government participants should send a
brief statement of what they intend to accomplish and what form their
participation would take. Each application should include a curriculum
vitae, with a brief summary of relevant scientific or professional
accomplishments, and a documented statement of financial need (if funds
are applied for). Optionally, applications may include a one page
summary for making a short presentation at the poster session. Poster
presentations focusing on comparative evaluation of statistical and
neural network methods and application studies are especially sought. For
junior applicants, support letters from senior members of the
professional community familiar with the applicant's work would
strengthen the application. Prospective participants from Greece,
Portugal and Turkey are especially encouraged to apply.
Costs and Funding: The estimated cost of hotel accommodations and meals
for the two-week duration of the ASI is US$1,600. In addition,
participants from industry will be charged an industrial registration
fee, not to exceed US$1,000. Participants representing industrial
sponsors will be exempt from the fee. We intend to subsidize costs of
participants to the maximum extent possible by available funding.
Prospective participants should also seek support from their national
scientific funding agencies. The agencies, such as the American NSF or
the German DFG, may provide some ASI travel funds upon the recommendation
of an ASI director. Additional funds exist for students from Greece,
Portugal and Turkey. We are also seeking additional sponsorship of ASI.
Every sponsor will be fully acknowledged at the ASI site as well as in
the printed proceedings.
Correspondence and Registration: Applications should be forwarded to
Dr. Cherkassky at the above address. Applications arriving after March 5,
1993 may not be considered. All approved applicants will be informed of the
exact registration arrangements. Informal email inquiries can be addressed to
Dr. Cherkassky at nato_asi@ee.umn.edu
------------------------------
Subject: VLSI NN
From: Gasser Auda <gasser@cs.uregina.ca>
Date: Sat, 06 Feb 93 13:30:14 -0600
[[ Editor's Note: I'm starting to reject these vague general calls for
help, due to the increased volume of submissions to the Digest. However,
I don't want to shut out the rank beginners from this exciting field
either. So, I hope faithful readers will take pity on those who are just
starting out and send them some pointers. -PM ]]
Dear neural networkers,
I'm performing a survey on NN solutions to handwriting recognition
problems, especially VLSI-implemented solutions.
If you have any advice, information, papers, or commercial
products, just email me at:
gasser@cs.uregina.ca
THANKS IN ADVANCE.
gasser
------------------------------
Subject: Re: "neural-net based software for digitization of maps"
From: eytan@dpt-info.u-strasbg.fr (Michel Eytan, LILoL)
Date: Sun, 07 Feb 93 11:24:26 +0100
>A colleague from India is asking me if there is any effort on "neural
>network based software for the digitization of maps and photos (maps and
>photos presumably obtained from remotely-sensed data)" Is there any one
>who can help answer this question? Thanks
>
>Rao Vemuri
>Dept. of Applied Science
>UC Davis, Livermore, CA
>vemuri@icdc.llnl.gov
My colleague J. Korczak <korczak@dpt-info.u-strasbg.fr> has a student, Ms.
Hamadi, doing a Ph. D. on a similar subject. Hope this helps.
Michel Eytan, U. Strasbourg II eytan@dpt-info.u-strasbg.fr
Lab Info, Log & Lang, Dpt. Info V: +33 88 41 74 29
22 rue Descartes, 67084 Strasbourg F: +33 88 41 74 40
------------------------------
Subject: Applications for particle track segment detection??
From: "Martin J. Dudziak" <DUDZIAK@vms.cis.pitt.edu>
Date: Sun, 07 Feb 93 14:09:00 -0500
I am involved in a collaboration between U-Pitt and the Institute
of Nuclear Physics, Novosibirsk (Russia), wherein we are studying the
possible useful applications of neural networks for particle track
segment detection and track reconstruction. The process is now handled
by some effective and well-established algorithms developed at CERN,
Novosibirsk and elsewhere, but we believe that neural nets may have
applicability for both improved detection (picking up on track segments
registered in the drift chamber portion of the detector) and performance,
particularly as the data rates in the detection process will be quite
high (@40-100MBytes/sec.) in the next generation accelerator and we are
working with a parallel processing architecture already to meet that
computing requirement.
We are aware of some work in the high energy physics community involving
neural networks, most of which is more concerned with classification of
particle interactions - the step that follows after one has identified
tracks of interest. Much of that work has also not involved the
low-level end of the detection process such as we are concerned about,
nor has it involved the density of tracks.
I thought it would be useful to inquire through the neural network
community in grapevines such as Neuron-Digest to see if people are aware
of projects applying neural nets to problems such as the above. Any
information on NNs in HEP and for track detection/reconstruction in
particular will be greatly appreciated.
Whatever is sent will be compiled into a bibliography and redistributed
to those who are interested.
Martin Dudziak
c/o J. Thompson, School of Physics, Univ. of Pittsburgh
dudziak@vms.cis.pitt.edu
Thank you!
------------------------------
Subject: Pattern Recognition
From: VOGLINOG@AGR04.ST.IT
Date: Mon, 08 Feb 93 11:47:55 +0700
[[ Editor's Note: Again, a rather general and vague request. Perhaps
someone from Italy on this mailing list could help out a fellow
countryperson? -PM ]]
Hello.
Where can I get some software for pattern recognition based on neural
networks?
Thank you.
Giuseppe Voglino
SGS-Thomson Microelectronics Neural Network Group R&D
Agrate B. - MILAN - ITALY
------------------------------
Subject: Re: chip design
From: blam@Engn.Uwindsor.Ca (B Lam)
Date: Mon, 08 Feb 93 13:06:16 -0500
[[ Editor's Note: This plucky person rephrased his question from "Tell me
about neural nets" to the following (somewhat more specific) question...
after I rejected his first attempt. My public reply to the specifics of
his question would be to look at Carver Mead's book (MIT Press?) "Analog
VLSI" to start, but I don't know whether it covers fuzzy stuff. -PM ]]
I'm looking at analog VLSI design for neural networks, and I'm thinking
of combining fuzzy logic with NNs. Are there any papers that talk about
this topic? I've looked at the IEEE Transactions on Neural Networks, but
it seems to cover theory or software only. So please help me with this
one. Thanks!
------------------------------
Subject: neural net applications to fixed-income security markets
From: danlap@internet.sbi.com (Dan LaPushin)
Date: Mon, 08 Feb 93 18:26:43 -0500
To The Editor,
I am new to the field of neural networks but have a strong background
in mathematics, economics, and some computer programming. I work at
a large Wall St. firm and am interested in applying neural network
technology to the field of fixed-income research. Such instruments
include bonds, mortgage-backed securities and the like. There seems
to be, as far as I can tell, little research into neural net
application to such markets. I suspect this is because the data is
hard to come by for those not in the field, but I'm not sure. Could
you direct me to any research in this area so that I don't
inadvertently reinvent the wheel? Thanks for your help!
Dan LaPushin
I'm on your mailing list as danlap@sp_server.sbi.com
------------------------------
Subject: Room sharing for ICNN
From: nanda@ogicse.cse.ogi.edu (Nandakishore Kambhatla)
Organization: Oregon Graduate Institute (formerly OGC), Beaverton, OR
Date: 10 Feb 93 18:02:30 +0000
Hi,
I am a male graduate student attending ICNN'93 in San Francisco. I am
looking for a roommate to share a hotel room with in San Francisco for
ICNN'93. The conference runs from March 29th to April 1st. Please respond
through email if you are interested.
-Nanda
Nandakishore Kambhatla
email: nanda@cse.ogi.edu
phone: daytime-(503)-690-1121 (Extn-7051)
evenings-(503)-646-5777
------------------------------
Subject: pattern recognition (practical database considerations)?
From: "Duane A. White" <KFRAMXX%TAIVM2.BITNET@TAIVM1.taiu.edu>
Date: Sat, 13 Feb 93 03:45:41 -0600
I am interested in pattern recognition. In my particular application I
would like to compare a 2D monochrome bitmap image with those in a
database. The program should determine if there is a match, and if not
then add it to the database. Most of the literature I've read on
pattern-matching networks uses a relatively small set of classification
patterns (such as letters of the alphabet, or numbers). In my case it
wouldn't seem practical to train a single network to identify every entry
in the database (on the order of hundreds or thousands of entries). Is
there something fundamental in the approach that I'm missing?
Also, the program should be rotation and translation invariant, at least
to a small degree.
------------------------------
End of Neuron Digest [Volume 11 Issue 12]
*****************************************
Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
6666; Tue, 23 Feb 93 12:54:11 EST
Received: from noc2.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
TCP; Tue, 23 Feb 93 12:54:08 EST
Received: from CATTELL.PSYCH.UPENN.EDU by noc2.dccs.upenn.edu
id AA03416; Tue, 23 Feb 93 12:44:05 -0500
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
id AA04331; Tue, 23 Feb 93 10:57:09 EST
Posted-Date: Tue, 23 Feb 93 10:56:33 EST
From: "Neuron-Digest Moderator" <neuron-request@cattell.psych.upenn.edu>
To: Neuron-Distribution:;
Subject: Neuron Digest V11 #13 (discussion)
Reply-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
X-Errors-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
Organization: University of Pennsylvania
Date: Tue, 23 Feb 93 10:56:33 EST
Message-Id: <4321.730482993@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Tuesday, 23 Feb 1993
Volume 11 : Issue 13
Today's Topics:
Biologically Plausible Dynamic Artificial Neural Networks Reviewed
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Biologically Plausible Dynamic Artificial Neural Networks Reviewed
From: Paul Fawcett <paulf@manor.demon.co.uk>
Date: Sat, 06 Feb 93 18:57:41 +0000
Thank you for publishing in Neuron Digest Vol. 11, Issue 6 my original
posting on the subject of 'Biologically Plausible Dynamic Artificial
Neural Networks'.
As a follow-up to this post I circulated a summary of replies to those who
contacted me directly by email. Each contributor agreed to their comments
being quoted in this way. Please feel free to use this material, in full
or in part, for any future edition of the Neuron Digest.
The summary follows:
Last update: 20 JAN 93
DISTRIBUTION:
Ulf Andrick <andrick@rhrk.uni-kl.de>
Mark J. Crosbie <mcrosbie@unix1.tcd.ie>
S. E. Fahlman <sef@sef-pmax.slisp.cs.cmu.edu>
Bernie French <btf64@cas.org>
John R. Mcdonnell <mcdonn%bach.nosc.mil@nosc.mil>
Larry D. Pyeatt <pyeatt@texaco.com>
Bill Saidel <saidel@clam.rutgers.edu>
Mark W. Tilden <mwtilden@math.uwaterloo.ca>
Jari Vaario <jari@ai.rcast.u-tokyo.ac.jp>
Paul Verschure <verschur@ifi.unizh.ch>
Stanley Zietz <szietz@king.mcs.drexel.edu>
THIS BULLETIN SUMMARIZES THE MAJORITY OF E-MAIL REPLIES TO THE FOLLOWING
USENET ARTICLE:
All contributors have agreed to publication of their comments.
From: paulf@manor.demon.co.uk (Paul Fawcett)
Newsgroups: comp.ai,
comp.ai.neural-nets,
sci.cognitive,
comp.theory.cell-automata,
bionet.neuroscience,
bionet.molbio.evolution,
bionet.software
Subject: Biologically Plausible Dynamic Artificial Neural Networks
Date: Tue, 05 Jan 93 05:53:57 GMT
Biologically Plausible Dynamic Artificial Neural Networks.
------------------------------------------------------------
A *Dynamic Artificial Neural Network* (DANN) [1]
[my acronym] possesses processing elements that are
created and/or annihilated, either in real time or as
some part of a development phase [2].
Of particular interest is the possibility of
constructing *biologically plausible* DANN's that
model developmental neurobiological strategies for
establishing and modifying processing elements and their
connections.
Work with cellular automata in modeling cell genesis and
cell pattern formation could be applicable to the design
of DANN topologies. Likewise, biological features that are
determined by genetic or evolutionary factors [3] would
also have a role to play.
Putting all this together with a view to constructing a
working DANN, possessing cognitive/behavioral attributes of
a biological system is a tall order; the modeling of nervous
systems in simple organisms may be the best approach when
dealing with a problem of such complexity [4].
Any comments, opinions or references in respect of the
above assertions would be most welcome.
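To fix ideas, here is a minimal sketch of the definition above: a network
whose processing elements can be created and annihilated at run time.
This is an editorial illustration in Python, not code from the original
post; the update rule (a tanh squashing of the weighted input sum) is an
assumption made for the sketch.
# Editorial sketch of a DANN in the sense defined above: units and
# their connections can be created and annihilated at any time.
import math

class DynamicNet:
    def __init__(self):
        self.units = {}       # unit id -> activation
        self.weights = {}     # (src, dst) -> connection weight
        self._next_id = 0

    def create_unit(self):            # a processing element is created
        uid = self._next_id
        self._next_id += 1
        self.units[uid] = 0.0
        return uid

    def annihilate_unit(self, uid):   # ... and can be annihilated
        del self.units[uid]
        self.weights = {k: w for k, w in self.weights.items()
                        if uid not in k}

    def connect(self, src, dst, w):
        self.weights[(src, dst)] = w

    def step(self):
        # synchronous update: each unit squashes its weighted input sum
        nxt = {}
        for uid in self.units:
            s = sum(self.units[src] * w
                    for (src, dst), w in self.weights.items() if dst == uid)
            nxt[uid] = math.tanh(s)
        self.units = nxt

net = DynamicNet()
a, b = net.create_unit(), net.create_unit()
net.connect(a, b, 0.8)
net.units[a] = 1.0
net.step()                   # real-time operation
net.annihilate_unit(a)       # development-phase pruning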
References.
1. Ross, M. D., et al. (1990); Toward Modeling a Dynamic
Biological Neural Network, Mathl Comput. Modeling,
Vol. 13, No. 7, pp. 97-105.
2. Lee, Tsu-Chang (1991); Structure Level Adaptation for
Artificial Neural Networks, Kluwer Academic Publishers.
3. Edelman, Gerald (1987); Neural Darwinism: The Theory of
Neural Group Selection, Basic Books.
4. Beer, Randall D. (1990); Intelligence as Adaptive Behavior:
An Experiment in Computational Neuroethology,
Academic Press.
OVERVIEW
The cross-posting strategy was successful in bringing about replies from
contributors working in the computer and life sciences.
Of those with a computer science background, Jari Vaario is constructing
(and publishing) DANN's inspired by biological systems and an evolutionary
metaphor; John McDonnell also reports that he has been able to construct
evolving networks, and Larry Pyeatt has had some success at modeling these
processes. Mark Crosbie is interested in combining cellular automata and
genetic algorithms to build simple machines. Mark Tilden offers a
robotics viewpoint and suggests there may be some 'simple' and 'elegant'
solutions. Paul Verschure offers some references to his own work in this
field.
Life science replies came from Bernie French, who suggested the nematode
as a suitable model organism for a DANN. Stanley Zietz has suggested
modeling simple structures rather than organisms, and draws attention to
the work at NASA-Ames, where researchers are attempting to make 'real'
neural networks. To counter this, Bill Saidel points out the distinction
between the type of neuron being studied in the NASA-Ames research and
those in the cortex.
Thanks to Ulf Andrick and Scott Fahlman for their replies on the Usenet.
I hope this discussion will initiate further debate and those who
participated will take the opportunity to make some informal contacts with
the other contributors.
Many thanks
Paul.
c/o AI Division
School of Computer Science
University of Westminster
115 New Cavendish Street
London W1M 8JS
UK.
-------------------------------------------------------------
READING
The Wolpert and Brown references are suitable for those who do not have a
strong background in biology. I would describe Langton's two Artificial
Life volumes as essential reading. Artificial Life II contains several
papers exploring evolving artificial neural networks. Beer's book is so
new I have not been able to look at a copy yet.
Title: Biological neural networks in invertebrate neuroethology and
robotics / edited by Randall D. Beer, Roy E. Ritzmann, Thomas McKenna.
Publication Info: Boston : Academic Press, c1993. Phys. Description: xi,
417 p. : ill. ; 24 cm. Series Name: Neural networks, foundations to
applications
Subjects: Neural circuitry. Subjects: Invertebrates--Physiology. Subjects:
Neural networks (Computer science)
ISBN: 0-12-084728-0
Author: Wolpert, L. (Lewis)
Title: The triumph of the embryo / Lewis Wolpert ; with illustrations
drawn by Debra Skinner.
Publication Info: Oxford [England] ; New York : Oxford University Press,
1991.
Phys. Description: vii, 211 p. : ill. ; 25 cm.
Subjects: Embryology, Human--Popular works.
ISBN: 0-19-854243-7 : $22.95
Author: Brown, M. C. (Michael Charles)
Title: Essentials of neural development / M.C. Brown, W.G. Hopkins, and
R.J. Keynes.
Publication Info: Cambridge ; New York : Cambridge University Press,
c1991. Phys. Description: x, 176 p. : ill. ; 24 cm.
Notes: Rev. ed. of: Development of nerve cells and their connections / by
W.G. Hopkins and M.C. Brown. 1984.
Subjects: Developmental neurology. Subjects: Nerves. Subjects: Neurons.
Subjects: Nerve endings. Subjects: Neurons.
ISBN: 0-521-37556-8
ISBN: 0-521-37698-X (pbk.)
Title: Artificial life II : the proceedings of an interdisciplinary
workshop on the synthesis and simulation of living systems held 1990 in
Los Alamos, New Mexico / edited by Christopher G. Langton ... [et al.].
Publication Info: Redwood City, Calif. : Addison-Wesley, 1991. Series
Name: Santa Fe Institute studies in the sciences of complexity proceedings
; v. 10
Notes: Proceedings of the Second Artificial Life Workshop.
Subjects: Biological systems--Computer simulation--Congresses. Subjects:
Biological systems--Simulation methods--Congresses.
ISBN: 0-201-52570-4
ISBN: 0-201-52571-2 (pbk.)
Author: Interdisciplinary Workshop on the Synthesis and Simulation of
Living Systems (1987 : Los Alamos, N.M.)
Title: Artificial life : the proceedings of an Interdisciplinary Workshop
on the Synthesis and Simulation of Living Systems, held September, 1987,
in Los Alamos, New Mexico / Christopher G. Langton, editor.
Publication Info: Redwood City, Calif. : Addison-Wesley Pub. Co., Advanced
Book Program, c1989. Phys. Description: xxix, 655 p., [10] p. of plates :
ill. (some col.) ; 25 cm.
Series Name: Santa Fe Institute studies in the sciences of complexity ; v.
6
ISBN: 0-201-09346-4
ISBN: 0-201-09356-1 (pbk.)
----------------------------------------------------------------------
From: Jari Vaario <jari@ai.rcast.u-tokyo.ac.jp>
I have been working for a while already on a method to describe
dynamical neural networks such as you describe above. My latest journal
papers are:
Jari Vaario and Setsuo Ohsuga: An Emergent Construction of Adaptive Neural
Architectures, Heuristics - The Journal of Knowledge Engineering, vol. 5,
No 2, 1992.
Abstract:
In this paper a modeling method for an emergent construction
of neural network architectures is proposed. The structure
and behavior of neural networks are an emergent of small
construction and behavior rules, that in cooperation with
extrinsic signals, define the gradual growth of the neural
network structure and its adaptation capability (learning
behavior). The inspiration for the work is taken directly
from biological systems, even though the simulation itself
is not an exact biological simulation. The example used to
demonstrate the modeling method is also from biological
context: the sea hare Aplysia (a mollusk).
Jari Vaario, Koichi Hori and Setsuo Ohsuga: Toward Evolutionary Design of
Autonomous Systems, The International Journal in Computer Simulation, A
Special Issue on High Autonomous Systems, to appear 1993.
Abstract:
An evolutionary method for designing autonomous systems is
proposed. The research is a computer exploration of how the
global behavior of autonomous systems can emerge from neural
circuits. The evolutionary approach is used to increase the
repertoire of behaviors.
Autonomous systems are viewed as organisms in an
environment. Each organism has its own set of production
rules, a genetic code, that gives birth to the neural
structure. Another set of production rules describes the
environmental factors. These production rules together give
rise to a neural network embedded in the organism model. The
neural network is the only means to direct reproduction.
This gives rise to intelligence: organisms which have
``more'' intelligent methods to reproduce will have a
relative advantage for survival.
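To give a rough feel for the production-rule idea in this abstract (an
editorial sketch, not Vaario's method), a genetic code can be read as a
set of rewrite rules whose repeated application grows a description of
neural structure; the alphabet below is invented for the illustration.
# Editorial sketch of growth by production rules (not Vaario's code):
# a "genetic code" of rewrite rules expands a seed symbol into a
# structure description. Alphabet: N = neuron, [ ] = a branch.
rules = {"S": "N[S]S"}      # a seed grows a neuron plus a branching subtree

def grow(axiom, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

print(grow("S", 3))         # each pass adds one generation of structure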
Jari Vaario
Research Center for Advanced Science and Technology
University of Tokyo, Japan
JARI HAS OFFERED TO PROVIDE COPIES OF HIS PAPERS TO READERS OF
THIS BULLETIN. PLEASE CONTACT HIM BY EMAIL FOR FURTHER DETAILS.
---------------------------------------------------------------------
From: "John R. Mcdonnell" <mcdonn%bach.nosc.mil@nosc.mil>
I have been developing networks which "evolve" structure as well as the
connection strengths. This is done using an evolutionary programming
paradigm as a mechanism for stochastic search. One thing I have noticed
is that particular structures tend to dominate the population (EP is a
multi-agent stochastic search technique). This has caused me to backtrack
a little and investigate the search space for simultaneously determining
model structure and parameters.
Nevertheless, I have been able to "evolve" networks for simple binary
mappings such as XOR, 3-bit parity, and the T-C problem. In the future I
am aiming at more general (feedforward) networks which do not have layers
per se, but neuron classes {input, hidden, output}. Self-organization of
sub-groups of neurons would be an interesting phenomenon to observe for,
say, image recognition problems. I think that to accomplish this a
distance metric between neurons might be necessary.
Fahlman's cascade-correlation architecture is very interesting. However,
it has the constraint that the neurons be fully connected to subsequent
neurons in the network. This might not be detrimental in that unimportant
connections could have very small weights. From an information
standpoint, these free parameters should be included in a cost function.
I do like his approach in adding additional hidden nodes.
As one last comment, when I "evolve" (I use the term loosely) networks for
the XOR mapping with an additional input of U(0,1) noise, this third
(noisy) input node has all of its outputs disconnected. This was a nice
result since inputs which contain no information can be disconnected.
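A minimal sketch of the kind of procedure described above, evolutionary
programming over both weights and structure on the XOR task, might look
as follows. This is an editorial illustration, not McDonnell's code; the
population size, mutation rates, and unit nonlinearities are all
assumptions made for the sketch.
# Editorial sketch of EP over structure and weights: each agent is a
# small feedforward net for XOR; mutation perturbs weights and
# occasionally adds or removes a hidden unit; the best half survives.
import math, random

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def make_net(nh):
    # nh hidden units; each hidden row holds [w_x1, w_x2, bias]
    return {"w1": [[random.gauss(0, 1) for _ in range(3)] for _ in range(nh)],
            "w2": [random.gauss(0, 1) for _ in range(nh + 1)]}  # + out bias

def forward(net, x):
    h = [math.tanh(r[0] * x[0] + r[1] * x[1] + r[2]) for r in net["w1"]]
    s = sum(w * v for w, v in zip(net["w2"], h)) + net["w2"][-1]
    s = max(-60.0, min(60.0, s))          # guard against overflow
    return 1.0 / (1.0 + math.exp(-s))

def error(net):
    return sum((forward(net, x) - y) ** 2 for x, y in XOR)

def mutate(net):
    child = {"w1": [row[:] for row in net["w1"]], "w2": net["w2"][:]}
    for row in child["w1"]:
        for i in range(3):
            row[i] += random.gauss(0, 0.3)
    child["w2"] = [w + random.gauss(0, 0.3) for w in child["w2"]]
    r = random.random()
    if r < 0.1:                           # structural: add a hidden unit
        child["w1"].append([random.gauss(0, 1) for _ in range(3)])
        child["w2"].insert(-1, random.gauss(0, 1))
    elif r < 0.2 and len(child["w1"]) > 1:  # structural: remove one
        i = random.randrange(len(child["w1"]))
        del child["w1"][i]
        del child["w2"][i]
    return child

pop = [make_net(2) for _ in range(20)]
for gen in range(200):                # EP: every parent spawns one child,
    pop += [mutate(p) for p in pop]   # then the best half survives
    pop.sort(key=error)
    pop = pop[:20]
print(error(pop[0]), len(pop[0]["w1"]))  # best error and its hidden count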
John McDonnell
NCCOSC, RDT&E Division
Code 731
Information & Signal Processing Dept.
San Diego, CA 92152-5000
<mcdonn@bach.nosc.mil>
-----------------------------------------------------------------------
From: "Larry D. Pyeatt" <pyeatt@texaco.com>
I have just completed some code to allow modelling of DANN's. It allows
PE's to be created, destroyed, and reconnected at any time........
I have been using genetic algorithms to construct networks with desired
properties. Encoding the genes is a major problem.......
I have been thinking that it would be interesting to try to evolve or
create a simple "creature" which lives in a computer simulated world. The
"creature" would have a small set of inputs and responses with which to
interact with the simulated world. Once the "creature" has evolved
sufficiently, you could make its world richer and give it more neurons.
Eventually, you might have a respectably complex organism.
Larry D. Pyeatt The views expressed here are not
Internet : pyeatt@texaco.com those of my employer or of anyone
Voice : (713) 975-4056 that I know of with the possible
exception of myself.
-----------------------------------------------------------------------
From: Bernie French <btf64@cas.org>
I noticed your Usenet post on the use of simple organisms as a model to
produce a biologically plausible DANN. One organism that would seem to
fit your need for producing a DANN is the nematode, Caenorhabditis
elegans. The positions of neuronal processes as well as the neuronal
connectivity have been extensively mapped in this organism. The
development in terms of cellular fates is also well studied for the
nervous system. Integration of neuronal subsystems into the neuronal
processes during development has been studied. This would fit your
description of a DANN where processing elements are created as part of a
development phase. Further, the two sexes of C. elegans (male and
hermaphrodite) have different numbers of total neurons as adults, 302 for
the hermaphrodite. However, during development of the neuronal processes
there is no difference between the two sexes. During a certain stage in
development there is the production of sex-specific neurons. This
sex-specificity occurs by a process of programmed cell death, in which
certain neurons are "programmed" to die. This fits your description of a
DANN where processing elements are annihilated as part of a development
phase.
This organism also provides some spatial information, since some neuronal
cells undergo migration within the organism. Disruption of this migration
results in synaptic differences in the neuronal connectivity, with
corresponding differences in the organism's response to external stimuli.
A good starting reference, if you're interested in looking at this organism
as a model, is "The Nematode Caenorhabditis Elegans". The book is edited
by William B. Wood and published by Cold Spring Harbor Laboratory.
-- Bernie (btf64@cas.org)
-------------------------------------------------------------------------
From: "Mark J. Crosbie" <mcrosbie@unix1.tcd.ie>
As part of a project which I am working on for my degree, I am studying
how to evolve machines for solving simple problems.
I saw that Cellular Automata were able to evolve colonies of cells and
control how these cells lived or died, but I felt that this was not a
powerful enough representation of cells to be of use. I have combined the
CA approach with a Genetic Algorithm approach within each CA to give me
colonies of evolving cells which can modify their behaviour over time.
Each cell can be pre-programmed to perform a certain task (an adder or
multiplexor say) and the first hurdle in the project is getting these
cells to grow together and interconnect properly. I think that study of
how cells grow and interconnect will lead to not only a better
understanding of the nervous systems of living organisms, but also of how
to solve problems using these "Genetic Programming" techniques.
I feel that this idea overlaps somewhat with your DANN which you described
in comp.theory.cell-automata. Do you agree? Would you agree that building
simple machines by genetic means would be a better starting point for
experimentation than trying to simulate a complex nervous system? Given
enough of these cells and a complex enough interconnection system, do you
feel that a system will evolve which will equal a nervous system in
functionality and intelligence?
Mark Crosbie
mcrosbie@vax1.tcd.ie
Dept. of Computer Science
Trinity College, Dublin
Dublin 2
Eire.
--------------------------------------------------------------------
From: "Mark W. Tilden" <mwtilden@math.uwaterloo.ca>
Forcing a DANN, as you call it, through simple topological structures with
recursive sub-elements does tend towards complexity befitting a functional
organism with surprisingly few components. In my lab I have a variety of
robotic devices with adaptive nervous systems not exceeding the equivalent
of 10 neurons. These devices not only learn to walk from first principles
but can also adapt to many different circumstances including severe
personal damage. There are no processors involved; my most complex device
uses only 50 transistors for its entire spectrum of behavior, response and
control.
My point is that there are simple, elegant solutions to "constructing a
working DANN". More so than you might expect.
I'm sorry I have no papers to quote as I am awaiting patents, but I will
be at the Brussels Alife show in May with some of my devices, or you can
check out an article on me in the September 92 issue of Scientific
American.
Mark W. Tilden
M.F.C.F Hardware Design Lab.
U of Waterloo. Ont. Can, N2L-3G1
(519)885-1211 ext. 2454
-----------------------------------------------------------------------
From: Stanley Zietz <szietz@king.mcs.drexel.edu>
05 Jan 93
You may not need to model simple organisms, but simple structures. Since
you quote Muriel Ross's paper, you probably know that the Biocomputation
Center at NASA-Ames is exhaustively studying the linear accelerometer in
the inner ear as a prototypical system to make 'real' biological neural
networks.
07 Jan 93
To paraphrase many of the papers, it has been shown that the geometry of
the connection of the hair cells (before the spike initiation zone) is
critical to the summation of events, and indeed there is a lot of
variability (perhaps throwing a stochastic element into the network).
Also, as Muriel reported at headquarters, the results of analyzing the
space flight animals have shown that the number of synapses is very
plastic in microgravity - the number of synapses increases in space (and
decreases if you put the animals in hypergravity on a centrifuge).
Therefore the synapses (and presumably the electrical conduction in the
network) are responding to the input. Such self-adaptation is probably
very important in biological communications systems (Steve Grossberg
expounded such a concept in 1979). We already know that such environment-
driven events are important in development.
Dr. Stanley Zietz email szietz@mcs.drexel.edu
Assoc. Director - Biomed. Eng. and Sci. Inst. tel (215) - 895-2681
Assoc. Prof. - Mathematics and Computer Sciences Fax (215) - 895-4983
Drexel University - 32nd and Chestnut Sts. Phila., PA 19104 USA
also
Biocomputation Center
NASA-Ames Research Center
--------------------------------------------------------------------------
From: Bill Saidel <saidel@clam.rutgers.edu>
The question that remains [see ref. 1 in my post, also the Zietz reply]
is whether a hard wired (where hard is a relative term,
relative to say cortex in the cns) set of connections from a
peripheral sensory receptor to the peripheral afferent fiber and
then to the brain comprises a "net" in the same sense as
neural net is used. An analogous example would be from cones to
bipolar (and NOT to ganglion cell). My omission of ganglion cell
is simply that ganglion cell in the serial ordering of processing
would be equivalent to the first order vestibular neuron in the
vestibular nuclei of the hindbrain. This series of connections seems
to qualify as a genetically-determined (and I am probably
overconstraining the use of determined) set of connections
(receptor to next layer). I know of no learning treatment that
changes this layer of connections. However, manipulating the
sensory input in the retina by strobing does produce strange
deficits in frog and cat velocity detection (and other features).
The Ross argument has depended on looking at the EM level of connections.
These connections are all at the level of the sensory periphery and
so they are, under normal circumstances, not manipulable. Nets in
the cortex are manipulable, as are neural nets in computer simulations.
However, Ross has also been involved in an intriguing study
of the structure of synaptic connections in rats or mice that were born
in space (I think) and found that the synaptic connections are
fewer when gravity is missing or diminished.
I prefer to think of that manipulation of the nervous periphery more
as an example of experimental epistemology, because the change occurred
due to changes in the biophysical structure of experience. Again, let me
point out that at the periphery the neuronal processing is
driven by biophysics. In the cns, neuronal processing is driven by
preceding nerve cells that know nothing about the outside world.
Perhaps this distinction is the one that I use to distinguish between
nets and constructions (a possibly artificial distinction, but to my
mind a useful one).
Bill Saidel
Dept. of Biology
Rutgers University
Camden, NJ 08102
(609) 225-6336 (phone)
saidel@clam.rutgers.edu (email)
----------------------------------------------------------------------------
EDITED REPRINT OF NEWSPOST WITH ADDITIONAL REFERENCE
From: Scott E. Fahlman <sef@sef-pmax.slisp.cs.cmu.edu>
I would just point out that adding or subtracting neurons from the
functional net does not necessarily correspond to adding, destroying, or
moving any physical neurons. If a physical neuron (or functional group of
neurons) has a lot of excess connections, invisible changes in the
synapses can effectively wire it into the net in a large number of
different ways, or effectively remove it.
Something like my dynamic (additive) Cascade-Correlation model can be
implemented by a sort of phase change, rather than a re-wiring of the net:
A candidate unit has a lot of trainable inputs, but it produces no output
(or all the potential recipients ignore its output). After a unit is
tenured, a specific pattern of input weights is frozen in, but now the
neuron does produce effective outputs. I don't know if such a phase
transition has been observed in biological neurons -- it would be
interesting to find out. Note that what I'm calling a "unit" might
correspond to a group of biological neurons rather than a single one.
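A minimal sketch of that phase change (an editorial illustration, not
Fahlman's implementation): the only event at tenure is that the unit's
input weights freeze and its output switches on; no connection is
physically added or removed.
# Editorial sketch of the "phase change" described above: a candidate
# unit trains its input weights while recipients see no output;
# tenure freezes the weights and switches the output on.
class Unit:
    def __init__(self, n_inputs):
        self.w = [0.0] * n_inputs    # trainable while a candidate
        self.tenured = False

    def activation(self, inputs):
        return sum(w * x for w, x in zip(self.w, inputs))

    def output(self, inputs):
        if not self.tenured:
            return 0.0               # recipients effectively ignore it
        return self.activation(inputs)

    def tenure(self):
        self.tenured = True          # the phase change: weights freeze...
        self.w = tuple(self.w)       # ...and the output becomes effective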
Scott E. Fahlman Internet: sef+@cs.cmu.edu
Senior Research Scientist Phone: 412 268-2575
School of Computer Science Fax: 412 681-5739
Carnegie Mellon University Latitude: 40:26:33 N
5000 Forbes Avenue Longitude: 79:56:48 W
Pittsburgh, PA 15213
REFERENCE
Hoehfeld, M., and Fahlman, S.E. (1992); Learning with Limited Numerical
Precision Using the Cascade-Correlation Algorithm, IEEE Transactions on
Neural Networks, Vol. 3, No. 4, pp. 602-611.
------------------------------------------------------------------------
From: Paul Verschure <verschur@ifi.unizh.ch>
The work we're doing would fit pretty well in your approach. Let me give
you some references:
Verschure, P.F.M.J., Krose, B, Pfeifer, R.(1992) Distributed Adaptive
Control: The self-organization of structured behavior. Robotics and
Autonomous Systems, 9, 181-196. (Control architectures for Autonomous
agents based on self-organizing model for classical conditioning).
Verschure, P.F.M.J., & Coolen, T. (1991) Adaptive Fields: Distributed
representations of classically conditioned associations. Network, 2,
189-206. (Two neural models for reinforcement learning which do NOT rely
on supervised learning and local representations and incorporate some
general properties of neural functioning, the models are analyzed using
techniques from statistical physics and not with simulations)
Verschure, P.F.M.J. (1992) Taking connectionism seriously: The vague
promise of subsymbolism and an alternative. In Proc. 14th Ann. Conf. of the
Cog. Sci. Soc., 653-658, Hillsdale, N.J.: Erlbaum.
Paul Verschure AI lab Department of Computer Science
University Zurich-Irchel Tel + 41 - 1 - 257 43 06
Winterthurerstrasse 190 Fax + 41 - 1 - 363 00 35
CH - 8057 Zurich, Switzerland verschur@ifi.unizh.ch
------------------------------------------------------------------
For after all what is man in nature? A nothing in relation to infinity,
all in relation to nothing, a central point between nothing and all, and
infinitely far from understanding either. The ends of things and their
beginnings are impregnably concealed from him in an impenetrable secret.
He is equally incapable of seeing the nothingness out of which he was
drawn and the infinite in which he is engulfed.
Blaise Pascal (1623-1662)
--------------------------------------------------------------------------
Paul Fawcett | Internet: paulf@manor.demon.co.uk
London, UK. | tenec@westminster.ac.uk
--------------------------------------------------------------------------
------------------------------
End of Neuron Digest [Volume 11 Issue 13]
*****************************************
Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
5442; Mon, 01 Mar 93 18:50:17 EST
Received: from noc2.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
TCP; Mon, 01 Mar 93 18:50:13 EST
Received: from CATTELL.PSYCH.UPENN.EDU by noc2.dccs.upenn.edu
id AA01999; Mon, 1 Mar 93 18:43:34 -0500
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
id AA01365; Fri, 26 Feb 93 02:39:52 EST
Posted-Date: Fri, 26 Feb 93 02:39:14 EST
From: "Neuron-Digest Moderator" <neuron-request@cattell.psych.upenn.edu>
To: Neuron-Distribution:;
Subject: Neuron Digest V11 #14 (discussion + jobs)
Reply-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
X-Errors-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
Organization: University of Pennsylvania
Date: Fri, 26 Feb 93 02:39:14 EST
Message-Id: <1359.730712354@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Friday, 26 Feb 1993
Volume 11 : Issue 14
Today's Topics:
neural net applications to fixed-income security markets
connectionist models summer school -- final call for applications
RE: Neuron Digest V11 #8 (discussion + reviews + jobs)
Re: Speaker normalization and adaptation
A computationally efficient squashing function
BP network paralysis
Re: pattern recognition (practical database considerations)?
Re: pattern recognition (practical database considerations)
Computational Biology Degree Programs
postdoctoral traineeships available
Postdoc position in computational/biological vision (learning)
Position for Programmer/Analyst with Neural Networks (YALE)
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: neural net applications to fixed-income security markets
From: danlap@internet.sbi.com (Dan LaPushin)
Date: Mon, 08 Feb 93 18:26:43 -0500
[[ Editor's Note: Once again, neural nets have reached Wall Street, but
this time from quite a different angle. A cursory search in past issues
of the Digest turned up nothing of relevance. Perhaps one of our
faithful readers might either lend a helpful ear or get interested
in this as a new project via email! -PM ]]
To The Editor,
I am new to the field of neural networks but have a strong background in
mathematics, economics, and some computer programming. I work at a large
Wall St. firm and am interested in applying neural network technology to
the field of fixed-income research. Such instruments include bonds,
mortgage-backed securities and the like. There seems to be, as far as I
can tell, little research into neural net application to such markets. I
suspect this is because the data is hard to come by for those not in the
field, but I'm not sure. Could you direct me to any research in this
area so that I don't inadvertently reinvent the wheel? Thanks for your
help!
Dan LaPushin
I'm on your mailing list as danlap@sp_server.sbi.com
------------------------------
Subject: connectionist models summer school -- final call for applications
From: "Michael C. Mozer" <mozer@dendrite.cs.colorado.edu>
Date: Thu, 11 Feb 93 22:10:05 -0700
FINAL CALL FOR APPLICATIONS
CONNECTIONIST MODELS SUMMER SCHOOL
The University of Colorado will host the 1993 Connectionist
Models Summer School from June 21 to July 3, 1993. The purpose
of the summer school is to provide training to promising young
researchers in connectionism (neural networks) by leaders of the
field and to foster interdisciplinary collaboration. This will
be the fourth such program in a series that was held at
Carnegie-Mellon in 1986 and 1988 and at UC San Diego in 1990.
Previous summer schools have been extremely successful and we
look forward to the 1993 session with anticipation of another
exciting event.
The summer school will offer courses in many areas of
connectionist modeling, with emphasis on artificial intelligence,
cognitive neuroscience, cognitive science, computational methods,
and theoretical foundations. Visiting faculty (see list of
invited faculty below) will present daily lectures and tutorials,
coordinate informal workshops, and lead small discussion groups.
The summer school schedule is designed to allow for significant
interaction among students and faculty. As in previous years, a
proceedings of the summer school will be published.
Applications will be considered only from graduate students
currently enrolled in Ph.D. programs. About 50 students will be
accepted. Admission is on a competitive basis. Tuition will be
covered for all students, and we expect to have scholarships
available to subsidize housing and meal costs, but students are
responsible for their own travel arrangements.
Applications should include the following materials:
* a vita, including mailing address, phone number, electronic
mail address, academic history, list of publications (if any),
and relevant courses taken with instructors' names and grades
received;
* a one-page statement of purpose, explaining major areas of
interest and prior background in connectionist modeling and
neural networks;
* two letters of recommendation from individuals familiar with
the applicants' work (either mailed separately or in sealed
envelopes); and
* a statement from the applicant describing potential sources of
financial support available (department, advisor, etc.) for
travel expenses.
Applications should be sent to:
Connectionist Models Summer School
c/o Institute of Cognitive Science
Campus Box 344
University of Colorado
Boulder, CO 80309
All application materials must be received by March 1, 1993.
Admission decisions will be announced around April 15. If you
have specific questions, please write to the address above or
send e-mail to "cmss@cs.colorado.edu". Application materials
cannot be accepted via e-mail.
Organizing Committee
Jeff Elman (UC San Diego)
Mike Mozer (University of Colorado)
Paul Smolensky (University of Colorado)
Dave Touretzky (Carnegie Mellon)
Andreas Weigend (Xerox PARC and University of Colorado)
Additional faculty will include:
Yaser Abu-Mostafa (Cal Tech)
Sue Becker (McMaster University)
Andy Barto (University of Massachusetts, Amherst)
Jack Cowan (University of Chicago)
Peter Dayan (Salk Institute)
Mary Hare (Birkbeck College)
Cathy Harris (Boston University)
David Haussler (UC Santa Cruz)
Geoff Hinton (University of Toronto)
Mike Jordan (MIT)
John Kruschke (Indiana University)
Jay McClelland (Carnegie Mellon)
Ennio Mingolla (Boston University)
Steve Nowlan (Salk Institute)
Dave Plaut (Carnegie Mellon)
Jordan Pollack (Ohio State)
Dean Pomerleau (Carnegie Mellon)
Dave Rumelhart (Stanford)
Patrice Simard (AT&T Bell Labs)
Terry Sejnowski (UC San Diego and Salk Institute)
Sara Solla (AT&T Bell Labs)
Janet Wiles (University of Queensland)
The Summer School is sponsored by the American Association for
Artificial Intelligence, the National Science Foundation, Siemens
Research Center, and the University of Colorado Institute of
Cognitive Science.
Colorado has recently passed a law explicitly denying protection
for lesbians, gays, and bisexuals. However, the Summer School
does not discriminate in admissions on the basis of age, sex,
race, national origin, religion, disability, veteran status, or
sexual orientation.
------------------------------
Subject: RE: Neuron Digest V11 #8 (discussion + reviews + jobs)
From: rkeller@academic.cc.colorado.edu
Date: Sun, 14 Feb 93 12:18:03 -0700
James L. McClelland and David E. Rumelhart provide code for a Boltzmann
Machine in "Explorations in Parallel Distributed Processing: A Handbook
of Models, Programs and Exercises". The code is written for a PC. This
is a Bradford Book, available from The MIT Press, and was designed to
provide exercises to accompany their two-volume book "Parallel
Distributed Processing."
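[[ Editor's note: For readers without the Handbook, the heart of a
Boltzmann Machine is short enough to sketch. The C fragment below is
purely illustrative (the sizes, zero weights, and annealing schedule
are arbitrary placeholders of mine), not the McClelland & Rumelhart
code: each unit is set to 1 with probability 1/(1+exp(-dE/T)), where
dE is its energy gap and T the temperature, which is lowered
gradually. -PM ]]
    #include <math.h>
    #include <stdlib.h>
    #define N 16                /* number of units (arbitrary)    */
    double w[N][N];             /* symmetric weights, w[i][i] = 0 */
    double b[N];                /* biases                         */
    int    s[N];                /* binary unit states             */
    /* One stochastic update of unit i at temperature T. */
    void update_unit(int i, double T)
    {
        double gap = b[i];      /* energy gap: sum_j w[i][j]*s[j] + b[i] */
        int j;
        for (j = 0; j < N; j++)
            gap += w[i][j] * s[j];
        s[i] = ((double)rand() / RAND_MAX < 1.0 / (1.0 + exp(-gap / T)));
    }
    int main(void)
    {
        double T;
        int i, sweep;
        for (i = 0; i < N; i++)               /* random initial states */
            s[i] = rand() & 1;
        for (T = 2.0; T > 0.05; T *= 0.9)     /* simulated annealing   */
            for (sweep = 0; sweep < 10; sweep++)
                for (i = 0; i < N; i++)
                    update_unit(i, T);
        return 0;
    }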
------------------------------
Subject: Re: Speaker normalization and adaptation
From: Yaakov Stein <stein@galaxy.huji.ac.il>
Date: Wed, 17 Feb 93 07:51:07 +0200
Nico Weymaere (WEYMAERE@lem.rug.ac.be) asked for references on speaker
normalization and adaptation. While the idea of exploiting the self
organization and learning capabilities of neural networks for this task
seems quite natural, I have not seen much in the proceedings of NN
conferences. The related questions of speaker independent recognition and
speaker identification / verification have been far more thoroughly
treated in this literature. In the speech and signal processing
conferences more has appeared. A quick search through my article file
turned up the following :
Bridle JS and Cox SJ, RecNorm: Simultaneous Normalization and
Classification Applied to Speech Recognition, NIPS-3, 234-40 (1991)
Cox SJ and Bridle JS, Simultaneous Speaker Normalization and Utterance
Labelling Using Bayesian/Neural Net Techniques,
ICASSP-90 article S3.8, vol 1, 161-4 (1990)
Hampshire JB II and Waibel AH, The Meta-Pi Network: Connectionist Rapid
Adaptation for High Performance Multi-Speaker Phoneme Recognition,
ICASSP-90 article S3.9, vol 1, 165-8 (1990)
Fukuzawa K and Komori Y, A Segment-based Speaker Adaptation Neural Network
Applied to Continuous Speech Recognition, ICASSP-92, I 433-6 (1992)
Huang X, Speaker Normalization for Speech Recognition, ICASSP-92,
I 465-8 (1992)
Iso K, Asogawa M, Yoshida K and Watanabe T, Speaker Adaptation Using
Neural Network, Proc. Spring Meeting of Acoust. Soc. of Japan,
I 6-16 (March 1989) [in Japanese, quoted widely but I don't have it]
Konig Y and Morgan N, GDNN: A Gender-Dependent Neural Network for
Continuous Speech Recognition, IJCNN-92(Baltimore), II 332-7 (1992)
Montacie C, Choukri K and Chollet G, Speech Recognition using Temporal
Decomposition and Multilayer Feedforward Automata,
ICASSP-89 article S8.6, vol 1, 409-12 (1989)
Nakamura S and Akabane T, A Neural Speaker Model for Speaker Clustering,
ICASSP-91 article S13.6, vol 2, 853-856 (1991)
Nakamura S and Shikano K, Speaker Adaptation Applied to HMM and
Neural Networks, ICASSP-89 article S3.3, vol 1, 89-92 (1989)
Nakamura S and Shikano K, A Comparative Study of Spectral Mapping for
Speaker Adaptation, ICASSP-90 article S3.7, vol 1, 157-160 (1990)
Schmidbauer O and Tebelskis J, An LVQ Based Reference Model for Speaker
Adaptive Speech Recognition, ICASSP-92, I 441-4 (1992)
Witbrock M and Hoffman P, Rapid Connectionist Speaker Adaptation,
ICASSP-92, pp. I 453-6 (1992)
Hope this helps.
Yaakov Stein
------------------------------
Subject: A computationally efficient squashing function
From: "Michael P. Perrone" <mpp@cns.brown.edu>
Date: Thu, 18 Feb 93 15:42:34 -0500
Recently on the comp.ai.neural-nets bboard, there has been a discussion
of more computationally efficient squashing functions. Some colleagues
of mine suggested that many members of this mailing list may not have
access to the comp.ai.neural-nets bboard, so I have included a summary
below.
Michael
------------------------------------------------------
David L. Elliott mentioned using the following neuron activation function:
x
f(x) = -------
1 + |x|
He argues that this function has the same qualitative properties as the
hyperbolic tangent function but is faster to compute in practice.
I have suggested a similar speed-up for radial basis function networks:
1
f(x) = -------
1 + x^2
which avoids the transcendental calculation associated with gaussian RBF
nets.
I have run simulations using the above squashing function in various
backprop networks. The performance is comparable (sometimes worse,
sometimes better) to the usual training with hyperbolic tangents. I also
found that the performance of networks varied very little when the
activation functions were switched (i.e., two networks with identical
weights but different activation functions have comparable performance
on the same data). I tested these results on two databases: the NIST OCR
database (preprocessed by Nestor Inc.) and the Turk and Pentland human
face database.
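[[ Editor's note: Both functions above, and their derivatives, are
rational and so avoid transcendental calls entirely. A minimal C
comparison (the function names and sample grid are my own) is appended
for the curious; tanh() and the gaussian are included only as the
baselines being replaced. -PM ]]
    #include <stdio.h>
    #include <math.h>
    /* Elliott's rational sigmoid: f(x) = x/(1+|x|), f'(x) = 1/(1+|x|)^2 */
    double elliott(double x)    { return x / (1.0 + fabs(x)); }
    double elliott_d(double x)  { double d = 1.0 + fabs(x); return 1.0 / (d * d); }
    /* Perrone's rational basis: g(x) = 1/(1+x^2), g'(x) = -2x/(1+x^2)^2  */
    double ratbasis(double x)   { double d = 1.0 + x * x; return 1.0 / d; }
    double ratbasis_d(double x) { double d = 1.0 + x * x; return -2.0 * x / (d * d); }
    int main(void)
    {
        double x;
        for (x = -3.0; x <= 3.0; x += 1.0)
            printf("x=%5.1f  f=%7.4f  tanh=%7.4f  g=%7.4f  gauss=%7.4f\n",
                   x, elliott(x), tanh(x), ratbasis(x), exp(-x * x));
        return 0;
    }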
------------------------------------------------------------------------------
Michael P. Perrone                         Email: mpp@cns.brown.edu
Institute for Brain and Neural Systems     Tel: 401-863-3920
Brown University                           Fax: 401-863-3934
Providence, RI 02912
------------------------------
Subject: BP network paralysis
From: slablee@mines.u-nancy.fr
Date: Fri, 19 Feb 93 13:55:35 +0700
Dear Netters,
My English is a bit Frenchy (!), so please excuse some poor
sentences!
I'm trying to use a NN to detect the start of a sampled signal.
I'm using a 7520 x 150 x 70 x 1 backpropagation network.
My problem is: whatever parameters I choose (learning rate,
momentum...), the network stops learning with a rather high error
(about 0.4 for each unit, on a [0,1] range!).
I thought of two possible causes:
- network "paralysis" (as described by Rumelhart), caused by weights
that are too large (which drive activations near 0 or 1, preventing
the weights from changing: the changes are proportional to a(1-a)).
But the weights of my network always keep moderate values...
- local minima. But a large value for the learning rate seems to
change nothing. I've tried adding noise to the input units, without
any success. I've also tried changing the number of hidden units,
but the local minima are always there, even if lower.
Who can help me escape from this problem?
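[[ Editor's note: The a(1-a) factor Stephane mentions is easy to
inspect directly; this little C program (mine, purely illustrative)
prints how the backprop weight-change factor collapses as a logistic
unit saturates. -PM ]]
    #include <stdio.h>
    #include <math.h>
    int main(void)
    {
        double net, a;
        /* a = logistic(net); backprop weight changes scale with
           a(1-a), which vanishes as a approaches 0 or 1 -- the
           "paralysis" regime.                                   */
        for (net = 0.0; net <= 12.0; net += 3.0) {
            a = 1.0 / (1.0 + exp(-net));
            printf("net=%5.1f  a=%8.6f  a(1-a)=%10.8f\n",
                   net, a, a * (1.0 - a));
        }
        return 0;
    }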
--------------------------------------------------
| Stephane Lablee |
| ***** |
| Ecole des Mines de Nancy |
| Parc de Saurupt |
| 54042 Nancy Cedex |
| France |
| ***** |
| E-mail : slablee@mines.u-nancy.fr |
--------------------------------------------------
------------------------------
Subject: Re: pattern recognition (practical database considerations)?
From: gray@itd.nrl.navy.mil (Jim Gray)
Date: Fri, 19 Feb 93 09:33:19 -0500
Duane A. White writes:
> I am interested in pattern recognition. In my particular application I
> would like to compare a 2D monochrome bitmap image with those in a
> database. The program should determine if there is a match, and if not
> then add it to the database. Most of the literature I've read on pattern
> matching networks use a relatively small set of classification patterns
> (such as letters of the alphabet, numbers). In my case it wouldn't seem
> practical to train a single network to identify every entry in the
> database (on the order of hundreds or thousands of entries). Is there
> something fundamental in the approach that I'm missing?
You might try looking at Adaptive Resonance Theory (ART).
A good place to start is the book:
Carpenter and Grossberg, eds., Pattern Recognition by Self-Organizing
Neural Networks, The MIT Press, Cambridge, MA (1991)
ISBN 0-262-03176-0
I'm not sure whether ART networks can be applied to "thousands of entries"
in practice, but the basic operation is as you describe: the network
determines if there is a match, and if not, then adds it to the database.
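[[ Editor's note: Schematically, that match-or-add step looks like the
C fragment below. This is a bare-bones caricature of ART-1 (all sizes
are arbitrary, and categories are scanned in storage order rather than
by a choice function), not Carpenter and Grossberg's full algorithm. -PM ]]
    #include <string.h>
    #define NBITS  64         /* length of a binary input pattern */
    #define MAXCAT 1000       /* room for stored templates        */
    static unsigned char cat[MAXCAT][NBITS];   /* learned templates */
    static int ncat = 0;
    /* Accept category j if |input AND template| / |input| clears the
       vigilance level; otherwise store the input as a new category.
       Returns the index of the matching or newly created category.  */
    int art_classify(const unsigned char in[], double vigilance)
    {
        int i, j;
        double nin = 0.0;
        for (i = 0; i < NBITS; i++) nin += in[i];
        for (j = 0; j < ncat; j++) {
            double overlap = 0.0;
            for (i = 0; i < NBITS; i++)
                overlap += (double)(in[i] & cat[j][i]);
            if (nin > 0.0 && overlap / nin >= vigilance) {
                for (i = 0; i < NBITS; i++)   /* resonance: fast learning */
                    cat[j][i] &= in[i];
                return j;
            }
        }
        if (ncat >= MAXCAT) return -1;        /* database full           */
        memcpy(cat[ncat], in, NBITS);         /* no match: add new entry */
        return ncat++;
    }
    int main(void)
    {
        unsigned char p[NBITS] = {0};
        p[0] = p[1] = p[2] = 1;
        return art_classify(p, 0.8) != 0;     /* creates category 0 */
    }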
> Also the program should, to a small degree, be rotation and translation
> invariant.
I'm not sure whether ART networks have been applied to this type of
problem, but you might try looking at:
Hinton, "A Parallel Computation that assigns Canonical Object-Based
Frames of Reference", in Proceedings of the International Joint
Conference on Artificial Intelligence, 1981, pp. 683-685.
Jim Gray.
------------------------------
Subject: Re: pattern recognition (practical database considerations)
From: shsbishp@reading.ac.uk
Date: Tue, 23 Feb 93 11:26:45 +0000
>I am interested in pattern recognition. In my particular application I
>would like to compare a 2D monochrome bitmap image with those in a
>database. The program should determine if there is a match, and if not
>then add it to the database. Most of the literature I've read on pattern
>matching networks use a relatively small set of classification patterns
>(such as letters of the alphabet, numbers). In my case it wouldn't seem
>practical to train a single network to identify every entry in the
>database (on the order of hundreds or thousands of entries). Is there
>something fundamental in the approach that I'm missing?
>
>Also the program should, to a small degree, be rotation and translation
>invariant.
Having just perused today's Neuron Digest (Vol. 11, No. 12), I noticed the
above plea for help. Having been unable to email the sender directly, I
enclose the following information for the list.
As part of my doctoral research I developed a neural architecture (the
Stochastic Search Network) for use on this type of problem: Anarchic
Techniques for Pattern Classification, PhD thesis, 1989, University of
Reading, UK. A recent reference on this work is: Bishop, J.M. & Torr, P.,
in Lingard, R., Myers, D.J. & Nightingale, C. (eds), Neural Networks for
Vision, Speech & Natural Language, Chapman Hall, pp. 370-388.
For further information please email to (shsbishp@uk.ac.rdg) or write to
Dr. J.M.Bishop, Department of Cybernetics, University of Reading,
Berkshire, UK.
------------------------------
Subject: Computational Biology Degree Programs
From: georgep@rice.edu (George Phillips)
Organization: Rice University
Date: 05 Feb 93 15:38:31 +0000
The W.M. Keck Center for Computational Biology offers studies in
Computational Biology through three partner institutions: Rice
University, Baylor College of Medicine, and the University of Houston.
Science and engineering are being transformed by the power of new
computing technologies. Our goal is to train a new kind of
scientist--one poised to seize the advantages of national computational
prowess in solving important problems in biology.
The program emphasizes algorithm development, computation, and
visualization in biology, biochemistry, and biophysics. The program draws
on the intellectual and technological resources of the Center for Research
on Parallel Computation at Rice, the Human Genome Center at Baylor
College of Medicine, and the Institute for Molecular Design at the
University of Houston, among others.
The research groups involved in the W.M. Keck Center for Computational
Biology are at the forefronts of their respective areas, and their
laboratories are outstanding settings for the program.
A list of participating faculty and application information can be
obtained by sending email to georgep@rice.edu.
======================================+=======================================
Prof. George N. Phillips, Jr., Ph.D. | InterNet: georgep@rice.edu
Dept. of Biochemistry and Cell Biology|
Rice University, P.O. Box 1892 | Phone: (713) 527-4910
Houston, Texas 77251 | Fax: (713) 285-5154
------------------------------
Subject: postdoctoral traineeships available
From: "John K. Kruschke" <KRUSCHKE@ucs.indiana.edu>
Date: Tue, 09 Feb 93 09:45:45 -0500
POST-DOCTORAL FELLOWSHIPS AT INDIANA UNIVERSITY
Postdoctoral Traineeships in MODELING OF COGNITIVE PROCESSES
Please call this notice to the attention of all interested parties.
The Psychology Department and Cognitive Science Programs at Indiana
University are pleased to announce the availability of one or more
Postdoctoral Traineeships in the area of Modeling of Cognitive
Processes. The appointment will pay rates appropriate for a new PhD
(about $18,800), and will be for one year, starting after July 1,
1993. The duration could be extended to two years if a training grant
from NIH is funded as anticipated (we should receive final
notification by May 1).
Post-docs are offered to qualified individuals who wish to further
their training in mathematical modeling or computer simulation
modeling, in any substantive area of cognitive psychology or Cognitive
Science.
We are particularly interested in applicants with strong
mathematical, scientific, and research credentials. Indiana University
has superb computational and research facilities, and faculty with
outstanding credentials in this area of research, including Richard
Shiffrin and James Townsend, co-directors of the training program, and
Robert Nosofsky, Donald Robinson, John Castellan, John Kruschke,
Robert Goldstone, Geoffrey Bingham, and Robert Port.
Trainees will be expected to carry out original theoretical and
empirical research in association with one or more of these faculty
and their laboratories, and to interact with other relevant faculty
and the other pre- and postdoctoral trainees.
Interested applicants should send an up-to-date vita, a personal
letter describing their specific research interests, relevant
background, goals, and career plans, and reference letters from two
individuals. Relevant reprints and preprints should also be sent.
Women, minority group members, and handicapped individuals are urged
to apply. PLEASE NOTE: The conditions of our anticipated grant
restrict awards to US citizens or current green card holders. Awards
will also have a 'payback' provision, generally requiring awardees to
carry out research or teach for an equivalent period after termination
of the traineeship. Send all materials to:
Professors Richard Shiffrin and James Townsend,
Program Directors
Department of Psychology, Room 376B
Indiana University
Bloomington, IN 47405
We may be contacted at:
812-855-2722;
Fax: 812-855-4691
email: shiffrin@ucs.indiana.edu
Indiana University is an Affirmative Action Employer
------------------------------
Subject: Postdoc position in computational/biological vision (learning)
From: "John G. Harris" <harris@ai.mit.edu>
Date: Tue, 16 Feb 93 18:50:28 -0500
One (or possibly two) postdoctoral positions are available for one or two
years in computational vision starting September 1993 (flexible). The
postdoc will work in Lucia Vaina's laboratory at Boston University,
College of Engineering, to conduct research on learning of direction in
global motion. The researchers currently involved in this project are
Lucia M. Vaina, John Harris, Charlie Chubb, Bob Sekuler, and Federico
Girosi.
Requirements are a PhD in CS or a related area, with experience in visual
modeling or psychophysics. Knowledge of biologically relevant neural
models is desirable. The stipend ranges from $28,000 to $35,000 depending
upon qualifications. The deadline for application is March 1, 1993. Two
letters of recommendation, a description of current research, and an
up-to-date CV are required.
In this research we combine computational psychophysics, neural network
modeling, and analog VLSI to study visual learning specifically applied to
direction in global motion. The global motion problem requires estimation
of the direction and magnitude of coherent motion in the presence of
noise. We are proposing a set of psychophysical experiments in which the
subject, or the network, must integrate noisy, spatially local motion
information from across the visual field in order to generate a response.
We will study the classes of neural networks which best approximate the
pattern of learning demonstrated in psychophysical tasks. We will explore
Hebbian learning, multilayer perceptrons (e.g. backpropagation),
cooperative networks, and Radial Basis Function and Hyper Basis Function
networks.
The various strategies and their implementation will be evaluated on the
basis of their performance and their biological plausibility.
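[[ Editor's note: The global-motion stimulus itself is simple to
simulate; here is a toy C version (entirely my own, with made-up
parameters, not the lab's code) in which a fraction of the local
measurements move coherently, the rest are random, and the global
direction is recovered by vector summation. -PM ]]
    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>
    #define NLOCAL 500                    /* local motion measurements */
    #define PI     3.14159265358979
    int main(void)
    {
        double coherence = 0.2;           /* fraction moving together  */
        double true_dir  = 1.0;           /* radians                   */
        double sx = 0.0, sy = 0.0;
        int i;
        srand(1);
        for (i = 0; i < NLOCAL; i++) {
            /* coherent samples move in true_dir; the rest are uniform noise */
            double th = ((double)rand() / RAND_MAX < coherence)
                        ? true_dir
                        : 2.0 * PI * rand() / RAND_MAX;
            sx += cos(th);
            sy += sin(th);
        }
        /* pooling the noisy local signals recovers the global direction */
        printf("true direction %.3f, estimate %.3f\n", true_dir, atan2(sy, sx));
        return 0;
    }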
For more details, contact Prof. Lucia M. Vaina at vaina@buenga.bu.edu or
lmv@ai.mit.edu.
------------------------------
Subject: Position for Programmer/Analyst with Neural Networks (YALE)
From: Anand Rangarajan <rangarajan-anand@CS.YALE.EDU>
Date: Thu, 18 Feb 93 13:18:40 -0500
Programmer/Analyst Position
in Artificial Neural Networks
The Yale Center for Theoretical
and Applied Neuroscience (CTAN)
and the
Department of Computer Science
Yale University, New Haven, CT
We are offering a challenging position in software engineering in support of
new techniques in image processing and computer vision using artificial neural
networks (ANNs).
1. Basic Function:
Designer and programmer for computer vision and neural network
software at CTAN and the Computer Science department.
2. Major duties:
(a) To implement computer vision algorithms using a Khoros (or similar)
type of environment.
(b) To use the aforementioned tools and environment to run and analyze
computer experiments in specific image processing and vision application
areas.
(c) To facilitate the improvement of neural network algorithms and
architectures for vision and image processing.
3. Position Specifications:
(a) Education:
BA, including linear algebra, differential equations, and calculus;
mathematical optimization helpful.
(b) Experience:
programming experience in C++ (or C) under UNIX.
some of the following: neural networks, vision or image processing
applications, scientific computing, workstation graphics,
image processing environments, parallel computing, computer algebra
and object-oriented design.
Preferred starting date: March 1, 1993.
For information or to submit an application, please write:
Eric Mjolsness
Department of Computer Science
Yale University
P. O. Box 2158 Yale Station
New Haven, CT 06520-2158
e-mail: mjolsness-eric@cs.yale.edu
Any application must also be submitted to:
Jeffrey Drexler
Department of Human Resources
Yale University
155 Whitney Ave.
New Haven, CT 06520
-Eric Mjolsness and Anand Rangarajan
(prospective supervisors)
------------------------------
End of Neuron Digest [Volume 11 Issue 14]
*****************************************
Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
5802; Sat, 27 Feb 93 18:47:54 EST
Received: from noc2.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
TCP; Sat, 27 Feb 93 18:47:51 EST
Received: from CATTELL.PSYCH.UPENN.EDU by noc2.dccs.upenn.edu
id AA04341; Sat, 27 Feb 93 18:42:56 -0500
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
id AA09209; Sat, 27 Feb 93 17:09:40 EST
Posted-Date: Sat, 27 Feb 93 17:09:10 EST
From: "Neuron-Digest Moderator" <neuron-request@cattell.psych.upenn.edu>
To: Neuron-Distribution:;
Subject: Neuron Digest V11 #15 (discussion, software, & jobs)
Reply-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
X-Errors-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
Organization: University of Pennsylvania
Date: Sat, 27 Feb 93 17:09:10 EST
Message-Id: <9204.730850950@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu
Neuron Digest Saturday, 27 Feb 1993
Volume 11 : Issue 15
Today's Topics:
Applying Standards to Neural Networks
Sheet of neurons simulation
Re: Sheet of neurons simulation
Re: Sheet of neurons simulation
NNET model choice.
Handbook of Neural Algorithms
COMPUTER STANDARDS & INTERFACES addendum
Position Available at JPL
lectureship
Industrial Position in Artificial Intelligence and/or Neural Networks
lectureship in cognitive science
Microsoft Speech Research
Neural Computation & Cognition: Opening for NN Programmer
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Applying Standards to Neural Networks
From: erwin@trwacs.fp.trw.com (Harry Erwin)
Organization: TRW Systems Division, Fairfax VA
Date: 12 Feb 93 16:54:57 +0000
I was asked to review a proposal concerning the standardization of
vocabulary for machine learning and neural networks. This is being
distributed by the U.S. Technical Advisory Group to ANSI (JTC1 TAG). X3K5
is coordinating and developing a recommended position to JTC1 TAG for
approval for submission to ISO/IEC JTC 1. This recommendation has to be
returned to the JTC1 TAG Administrator no later than 1 March, 1993. The
contact person is the
JTC1 TAG Administrator
Computer and Business Equipment Manufacturers Association (CBEMA)
1250 Eye Street NW, Suite 200
Washington, DC 20005-3922
phone: 202-737-8888 (Press 1 Twice)
The vocabulary items whose definitions are being standardized include:
"knowledge acquisition"
"learning strategy"
"concept"
"concept learning"
"conceptual clustering"
"taxonomy formation"
"machine discovery"
"connectionist model"
"massively parallel processing"
"connection machine"
"connection system"
"neural network"
"connectionist network"
"neurocomputer"
"learning task"
"concept description"
"chunking"
"discrimination network"
"characteristic description"
"discriminant description"
"structural description"
"concept formation"
"partially learned concept"
"version space (of a concept)"
"description space"
"instance space (of a concept)"
"(concept) generalization"
"consistent generalization"
"constraint-based generalization"
"similarity-based generalization"
"complete generalization"
"specialization"
"caching (in machine learning)"
"concept validation"
"confusion matrix"
"rote learning"
"adaptive learning"
"advice taking"
"learning by being told"
"learning from instruction"
"incremental learning"
"supervised learning"
"inductive learning"
"learning from induction"
"deductive learning"
"analytic learning"
"explanation-based learning"
"operationalization"
"learning by analogy"
"associative learning"
"learning from observation and discovery"
"learning without a teacher"
"unsupervised learning"
"learning from examples"
"positive example"
"negative example"
"near-miss"
"credit/blame assignment"
"causal analysis"
"unit (in neural networks)"
"link (in neural networks)"
"stable coalition"
"hidden layer"
"back propagation"
"transfer function"
For example, a "neural network" or "connectionist network" is defined as
"A network of neuron-like processors, each of which performs some simple
logical function, typically a logic threshold function. NOTE A neural
network completes a computation when its units have finished exchanging
messages and updating their potential, and settle into a stable state."
A "hidden layer" is defined as "An object-oriented software layer which
contains the method of instruction delivery among different programs run
by different types of data. NOTE Every processor is told to block out any
program that does not apply to the data object stored in it. From the
user's point of view however it appears that different types of processors
run different programs."
My recommendation on this proposal to the TRW representative to this
standardization body is to vote no, since it is highly premature to
standardize terminology while the underlying concepts remain the subject
of such active research.
Cheers,
Harry Erwin
Internet: erwin@trwacs.fp.trw.com
------------------------------
Subject: Sheet of neurons simulation
From: fburton@nyx.cs.du.edu (Francis Burton)
Organization: University of Denver, Dept. of Math & Comp. Sci.
Date: 18 Feb 93 18:11:27 +0000
On behalf of a colleague, I am looking for software that can be used to
simulate large networks of connected neurons. The individual elements
would have fairly unsophisticated (possibly identical) input/output
properties. The topology of the network would be a flat sheet with random
local interconnections, but later he may want to extend it to several
layers. The program should run on a PC - preferably freeware, but he
would be willing to pay for a commercial product (though I don't imagine
there would be much of a market for such a program).
I suspect that typical programs for neural-nets are not well suited to
this particular problem -- please correct me if I am mistaken.
Thank you for any pointers or advice.
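[[ Editor's note: As a starting point while readers suggest packages, a
toy version of exactly this -- a flat sheet of identical threshold
units with random local interconnections, updated synchronously -- fits
in a page of portable C. All sizes and the connection rule below are my
own arbitrary choices, meant only as a sketch. -PM ]]
    #include <stdio.h>
    #include <stdlib.h>
    #define W 32                    /* sheet width                       */
    #define H 32                    /* sheet height                      */
    #define R 2                     /* connection radius (local only)    */
    #define K ((2*R+1)*(2*R+1))     /* neighbourhood size                */
    #define PCONN 0.25              /* probability of a local connection */
    static double wgt[W * H][K];    /* weight 0.0 means "no connection"  */
    static int state[W * H], next[W * H];
    int main(void)
    {
        int x, y, dx, dy, t, i, k, count = 0;
        srand(1);
        for (i = 0; i < W * H; i++) {          /* random local wiring */
            for (k = 0; k < K; k++)
                wgt[i][k] = ((double)rand() / RAND_MAX < PCONN)
                            ? 2.0 * rand() / RAND_MAX - 1.0 : 0.0;
            state[i] = rand() & 1;
        }
        for (t = 0; t < 100; t++) {            /* synchronous updates */
            for (y = 0; y < H; y++)
                for (x = 0; x < W; x++) {
                    double sum = 0.0;
                    k = 0;
                    for (dy = -R; dy <= R; dy++)
                        for (dx = -R; dx <= R; dx++, k++) {
                            int nx = (x + dx + W) % W;   /* wrap edges */
                            int ny = (y + dy + H) % H;
                            sum += wgt[y * W + x][k] * state[ny * W + nx];
                        }
                    next[y * W + x] = (sum > 0.0);       /* threshold  */
                }
            for (i = 0; i < W * H; i++)
                state[i] = next[i];
        }
        for (i = 0; i < W * H; i++) count += state[i];
        printf("active units after 100 steps: %d of %d\n", count, W * H);
        return 0;
    }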
Francis Burton Physiology, Glasgow University, Glasgow G12 8QQ, Scotland.
041 339 8855 x8085 | JANET: F.L.Burton@glasgow.ac.uk !net: via mcsun & uknet
"A horse! A horse!" | INTERNET: via nsfnet-relay.ac.uk BITNET: via UKACRL
------------------------------
Subject: Re: Sheet of neurons simulation
From: hunter@work.nlm.nih.gov (Larry Hunter)
Organization: National Library of Medicine
Date: 18 Feb 93 22:48:18 +0000
Francis Burton asks:
On behalf of a colleague, I am looking for software that can be used to
simulate large networks of connected neurons.
Well, there are many public domain (or nearly so) neural network
simulators out there that can do arbitrary topologies and update rules,
at least with a little bit of programming. IMHO, by far the best, both
in terms of what comes with the system and how easy it is to program to
meet specific needs, is the Xerion system from University of Toronto. It
has wonderful graphical interfaces (X windows) and runs on practically
any Unix/X platform. It was originally designed for machine learning
work on artificial neural nets, but I think it could be adapted to
natural neural network simulation. Also, the author of the program,
Drew van Camp, is pretty accessible.
It is available by anonymous ftp from the host ai.toronto.edu in the
directory /pub/xerion.
Here's a snippet from the README file:
Xerion is a Neural Network simulator developed and used by the
connectionist group at the University of Toronto. It contains libraries of
routines for building networks, and graphically displaying them. As well
it contains an optimization package which can train nets using several
different methods including conjugate gradient. It is written in C and
uses the X window system to do the graphics. It is being given away free
of charge to Canadian industry and researchers. It comes with NO warranty.
This distribution contains all the libraries used to build the simulators
as well as several simulators built using them (Back Propagation,
Recurrent Back Propagation, Boltzmann Machine, Mean Field Theory, Free
Energy Manipulation, Kohonnen Net, Hard and Soft Competitive Learning).
Also included are some sample networks built for the individual
simulators.
There are man pages for the simulators themselves and for many of the C
language routines in the libraries. As well, xerion has online help
available once the simulators are started. There is a tutorial on using
Xerion in the 'doc' directory.
I hope this does what you want.
Larry
Lawrence Hunter, PhD.
National Library of Medicine
Bldg. 38A, MS-54
Bethesda, MD 20894 USA
tel: +1 (301) 496-9300
fax: +1 (301) 496-0673
internet: hunter@nlm.nih.gov
encryption: PGP 2.1 public key via "finger hunter@work.nlm.nih.gov"
------------------------------
Subject: Re: Sheet of neurons simulation
From: senseman@lucy.brainlab.utsa.edu (David M. Senseman)
Organization: University of Texas at San Antonio
Date: 19 Feb 93 13:32:04 +0000
In article <HUNTER.93Feb18144818@work.nlm.nih.gov> Hunter@nlm.nih.gov writes:
>IMHO, by far the best, both in terms of what
>comes with the system and how easy it is to program to meet specific needs,
>is the Xerion system from University of Toronto.
The original posting asked for something to run "on a PC." Xerion seems
unlikely to run on a PC, even one running an X server.
However, if you can get hold of a UNIX-based workstation (Sparc, SGI,
HP, IBM, etc.), you might want to check out the Caltech neurosimulator
called "GENESIS". GENESIS also sports a very nice X-windows based
front-end called "XODUS" (what else :).
Unlike Xerion, which was primarily designed for "non-biological" neural
networks (i.e. back-propagation, etc.), GENESIS was designed from the
beginning to model REAL neurons. In fact GENESIS has a group of commands
that generates "sheets of neurons" and synaptically connects them to
other sheets. Real HHK action potentials, calcium channels, dendritic
spines, etc., etc...
I'm at home so I don't have all the details here, but if anyone
is interested, they can contact me by e-mail.
Again this program MUST be run on a UNIX box that supports
X-Windows. If all you have is a PC, then this isn't for you.
David M. Senseman, Ph.D. | Imagine the Creator as a low
(senseman@lonestar.utsa.edu) | comedian, and at once the world
Life Sciences Visualization Lab | becomes explicable.
University of Texas at San Antonio | H.L. Mencken
------------------------------
Subject: NNET model choice.
From: "Don" <schiewer@pa310m.inland.com>
Date: Fri, 19 Feb 93 14:34:56 -0600
I need some help selecting a NNET model to use for a classification
problem which involves looking at 20 thermocouples continuously over a
period of 10-20 samples. The idea is to respond to a fault condition
(fault/no-fault).
I am considering Grossberg's STC (spatio-temporal classifier) model.
We will be implementing on NeuralWare's NeuralWorks Professional II.
Does anyone know of other models, or have info on how best to make this
work?
Thanks in advance.
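[[ Editor's note: Whatever model Don settles on, a common front end for
this kind of problem is a sliding temporal window: the 20 channels over
the last K samples are flattened into one input vector per scan cycle.
A minimal C sketch follows; the window length and all names are my own
assumptions, not part of any particular package. -PM ]]
    #include <string.h>
    #define NCHAN 20    /* thermocouples                      */
    #define NSTEP 15    /* samples kept in the sliding window */
    static double window[NSTEP][NCHAN];   /* most recent sample last */
    /* Push one new reading of all channels and flatten the window
       into a single NCHAN*NSTEP input vector for the classifier.   */
    void push_sample(const double reading[NCHAN], double input[NCHAN * NSTEP])
    {
        memmove(window[0], window[1], sizeof(double) * NCHAN * (NSTEP - 1));
        memcpy(window[NSTEP - 1], reading, sizeof(double) * NCHAN);
        memcpy(input, window, sizeof(double) * NCHAN * NSTEP);
    }
    int main(void)
    {
        double reading[NCHAN] = {0.0}, input[NCHAN * NSTEP];
        push_sample(reading, input);      /* call once per scan cycle */
        return 0;
    }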
Don Schiewer | Internet schiewer@pa881a.inland.com | Onward Great
Inland Steel | UUCP: !uucp!pa881a.inland!schiewer | Stream...
------------------------------
Subject: Handbook of Neural Algorithms
From: "Sean Pidgeon" <pidgeon@a1.relay.upenn.edu>
Date: Thu, 25 Feb 93 11:58:01 -0500
I would like to thank all those who took the trouble to respond to the
questionnaire posted in the 23 September 1992 issue by my colleague Tamara
Isaacs-Smith. The level of interest in our proposed Handbook has been
gratifying. A focus group was convened in Philadelphia on February 23 to
discuss the best way forward for the project, and our editorial plan is now
quite well developed.
All those interested in learning more about the Handbook project are
invited to contact me directly or to visit the IOP Publishing booth at the
World Congress on Neural Networks in Portland. Again, thanks for your
support.
------------------------------
Subject: COMPUTER STANDARDS & INTERFACES addendum
From: John Fulcher <john@cs.uow.edu.au>
Date: Fri, 26 Feb 93 13:55:36 -0500
COMPUTER STANDARDS & INTERFACES (North-Holland)
Forthcoming Special Issue on ANN Standards
ADDENDUM TO ORIGINAL POSTING
Prompted by enquiries from several people regarding my original Call for
Papers posting, I felt I should offer the following additional information
(clarification).
By ANN "Standards" we do not mean exclusively formal standards (in the ISO,
IEEE, ANSI, CCITT etc. sense), although naturally enough we will be
including papers on activities in these areas.
"Standards" should be interpreted in its most general sense, namely as
standard APPROACHES (e.g. the backpropagation algorithm & its many
variants). Thus if you have a paper on some (any?) aspect of ANNs,
provided it is prefaced by a summary of the standard approach(es) in that
particular area, it could well be suitable for inclusion in this special
issue of CS&I. If in doubt, post, fax, or email a copy by April 30th to:
John Fulcher,
Department of Computer Science,
University of Wollongong,
Northfields Avenue,
Wollongong NSW 2522,
Australia.
fax: +61 42 213262
email: john@cs.uow.edu.au.oz
------------------------------
Subject: Position Available at JPL
From: Padhraic Smyth <pjs@bvd.Jpl.Nasa.Gov>
Date: Thu, 18 Feb 93 11:49:36 -0800
We currently have an opening in our group for a new PhD graduate
in the general area of signal processing and pattern recognition.
While the job description does not mention neural computation per
se, it may be of interest to some members of this mailing list.
For details see below.
Padhraic Smyth, JPL
RESEARCH POSITION AVAILABLE
AT THE
JET PROPULSION LABORATORY,
CALIFORNIA INSTITUTE OF TECHNOLOGY
The Communications Systems Research Section at JPL has an immediate
opening for a permanent member of technical staff in the area of
adaptive signal processing and statistical pattern recognition.
The position requires a PhD in Electrical Engineering or a closely
related field and applicants should have a demonstrated ability
to perform independent research.
A background in statistical signal processing is highly desirable.
Background in information theory, estimation and detection, advanced
statistical methods, and pattern recognition would also be a plus.
Current projects within the group include the use of hidden Markov
models for change detection in time series, and statistical methods
for geologic feature detection in remotely sensed image data. The
successful applicant will be expected to perform both basic and
applied research and to propose and initiate new research projects.
Permanent residency or U.S. citizenship is not a strict requirement
- however, candidates not in either of these categories should be
aware that their applications will only be considered in
exceptional cases.
Interested applicants should send their resume (plus any supporting
background material such as recent relevant papers) to:
Dr. Stephen Townes
JPL 238-420
4800 Oak Grove Drive
Pasadena, CA 91109.
(email: townes@bvd.jpl.nasa.gov)
------------------------------
Subject: lectureship
From: Tony_Prescott <tony@aivru.shef.ac.uk>
Date: Fri, 19 Feb 93 10:59:46 +0000
LECTURESHIP IN COGNITIVE SCIENCE
University of Sheffield, UK.
Applications are invited for the above post tenable from 1st October 1993
for three years in the first instance but with expectation of renewal.
Preference will be given to candidates with a PhD in Cognitive Science,
Artificial Intelligence, Cognitive Psychology, Computer Science, Robotics,
or related disciplines.
The Cognitive Science degree is an integrated course taught by the departments
of Psychology and Computer Science. Research in Cognitive Science was highly
evaluated in the recent UFC research evaluation exercise, special areas of
interest being vision, speech, language, neural networks, and learning.
The successful candidate will be expected to undertake research vigorously.
Supervision of programming projects will be required, hence considerable
experience with Lisp, Prolog, and/or C is essential.
It is expected that the appointment will be made on the Lecturer A scale
(13,400-18,576 pounds(uk) p.a.) according to age and experience but enquiries
from more experienced staff able to bring research resources are welcomed.
Informal enquiries to Professor John P Frisby 044-(0)742-826538 or e-mail
jpf@aivru.sheffield.ac.uk. Further particulars from the director of Personnel
Services, The University, Sheffield S10 2TN, UK, to whom all applications
including a cv and the names and addresses of three referees (6 copies of all
documents) should be sent by 1 April 1993.
Short-listed candidates will be invited to Sheffield for interview for which
travel expenses (within the UK only) will be funded.
Current permanent research staff in Cognitive Science at Sheffield include:
Prof John Frisby (visual psychophysics),
Prof John Mayhew (computer vision, robotics, neural networks)
Prof Yorick Wilks (natural language understanding)
Dr Phil Green (speech recognition)
Dr John Porrill (computer vision)
Dr Paul McKevitt (natural language understanding)
Dr Peter Scott (computer assisted learning)
Dr Rod Nicolson (human learning)
Dr Paul Dean (neuroscience, neural networks)
Mr Tony Prescott (neural networks, comparative cog sci)
------------------------------
Subject: Industrial Position in Artificial Intelligence and/or Neural Networks
From: Jerome Soller <soller@asylum.cs.utah.edu>
Date: Fri, 19 Feb 93 14:09:43 -0700
I have just been made aware of a job opening in artificial
intelligence and/or neural networks in southeast Ogden, UT. This
company maintains strong technical interaction with existing industrial,
U.S. government laboratory, and university strengths in Utah. Ogden
is a half-hour to 45-minute drive from Salt Lake City, UT.
For further information, contact Dale Sanders at 801-625-8343 or
dsanders@bmd.trw.com . The full job description is listed below.
Sincerely,
Jerome Soller
U. of Utah Department of Computer Science
and VA Geriatric, Research, Education and
Clinical Center
Knowledge engineering and expert systems development. Requires
five years formal software development experience, including two years
expert systems development. Requires experience implementing
at least one working expert system. Requires familiarity with expert
systems development tools and DoD specification practices. Experience with
neural nets or fuzzy logic systems may qualify as equivalent experience
to expert systems development. Familiarity with Ada, C/C++, database design,
and probabilistic risk assessment strongly desired. Requires strong
communication and customer interface skills. Minimum degree: BS in
computer science, engineering, math, or physical science. M.S. or Ph.D.
preferred. U.S. Citizenship is required. Relocation funding is limited.
------------------------------
Subject: lectureship in cognitive science
From: Martin Cooke <M.Cooke@DCS.SHEFFIELD.AC.UK>
Date: Tue, 23 Feb 93 12:54:29 +0000
To Dan: thanks, and all the best for the auditory list.
To the list: a job possibility
Martin
------------------------------
LECTURESHIP IN COGNITIVE SCIENCE
University of Sheffield, UK.
Applications are invited for the above post tenable from 1st October
1993 for three years in the first instance but with expectation of
renewal. Preference will be given to candidates with a PhD in
Cognitive Science, Artificial Intelligence, Cognitive Psychology,
Computer Science, Robotics, or related disciplines.
The Cognitive Science degree is an integrated course taught by the
departments of Psychology and Computer Science. Research in Cognitive
Science was highly evaluated in the recent UFC research evaluation
exercise, special areas of interest being vision, speech, language,
neural networks, and learning. The successful candidate will be
expected to undertake research vigorously. Supervision of programming
projects will be required, hence considerable experience with Lisp,
Prolog, and/or C is essential.
It is expected that the appointment will be made on the Lecturer A
scale (13,400-18,576 pounds(uk) p.a.) according to age and experience
but enquiries from more experienced staff able to bring research
resources are welcomed.
Informal enquiries to Professor John P Frisby 044-(0)742-826538 or
e-mail jpf@aivru.sheffield.ac.uk. Further particulars from the
director of Personnel Services, The University, Sheffield S10 2TN,
UK, to whom all applications including a cv and the names and
addresses of three referees (6 copies of all documents) should be
sent by 1 April 1993.
Short-listed candidates will be invited to Sheffield for interview
for which travel expenses (within the UK only) will be funded.
Current permanent research staff in Cognitive Science at Sheffield
include:
Prof J P Frisby (visual psychophysics),
Prof J E W Mayhew (computer vision, robotics, neural networks)
Prof Y Wilks (natural language understanding, from June 93)
Dr P D Green (speech recognition)
Dr J Porrill (computer vision)
Dr P McKevitt (natural language understanding)
Dr P Scott (computer assisted learning)
Dr R I Nicolson (human learning)
Dr P Dean (neuroscience, neural networks)
Dr M P Cooke (auditory modelling)
Dr G J Brown (auditory modelling)
Mr A J Prescott (neural networks, comparative cog sci)
------------------------------
Subject: Microsoft Speech Research
From: Xuedong Huang <xueh@microsoft.com>
Date: Tue, 23 Feb 93 22:19:47 -0800
As you may know, I've started a new speech group here at Microsoft. For
your information, I have enclosed the full advertisement we have been
using to publicize the openings. If you are interested in joining MS,
I strongly encourage you to apply, and we look forward to following
up with you.
------------------------------------------------------------
THE FUTURE IS HERE.
Speech Recognition. Intuitive Graphical Interfaces.
Sophisticated User Agents. Advanced Operating Systems.
Robust Environments. World Class Applications.
Who's Pulling It All Together?
Microsoft. We're setting the stage for the future of
computing, building a world class research group and
leveraging a solid foundation of object based technology
and scalable operating systems.
What's more, we're extending the recognition
paradigm, employing advanced processor and RISC-based
architecture, and harnessing distributed networks to
connect users to worlds of information.
We want to see more than just our own software
running. We want to see a whole generation of users
realize the future of computing.
Realize your future with a position in our
Speech Recognition group.
Research Software Design Engineers, Speech Recognition.
Primary responsibilities include designing and developing
User Interface and systems level software for an advanced
speech recognition system. A minimum of 3 years demonstrated
microcomputer software design and development experience
in C is required. Knowledge of Windows programming, speech
recognition systems, hidden Markov model theory, statistics,
DSP, or user interface development is preferred. A BA/BS
in computer science or related discipline is required. An
advanced degree (MS or Ph.D.) in a related discipline is
preferred.
Researchers, Speech Recognition.
Primary responsibilities include research on stochastic
modeling techniques to be applied to an advanced speech
recognition system. A minimum of 4 years demonstrated
research excellence in the area of speech recognition
or spoken language understanding systems is required.
Knowledge of Windows and real-time C programming for
microcomputers, hidden Markov model theory, decoder
systems design, DSP, and spoken language understanding
is preferred. An MA/MS in CS or a related discipline is
required. A PhD degree in CS, EE, or related discipline
is preferred.
Make The Most of Your Future.
At Microsoft, our technical leadership and strong
Software Developers and Researchers stay ahead of the
times, creating vision and turning it into reality.
To apply, send your resume and cover letter, noting
"ATTN: N5935-0223" to:
Surface:
Microsoft Recruiting
ATTN: N5935-0223
One Microsoft Way
Redmond, WA 98052-6399
Email:
ASCII ONLY
y-wait@microsoft.com.us
Microsoft is an equal opportunity employer working to
increase workforce diversity.
------------------------------
Subject: Neural Computation & Cognition: Opening for NN Programmer
From: gluck@pavlov.rutgers.edu (Mark Gluck)
Date: Mon, 22 Feb 93 08:04:28 -0500
POSITION AVAILABLE: NEURAL-NETWORK RESEARCH PROGRAMMER
At the Center for Neuroscience at Rutgers-Newark, we have an opening
for a full or part-time research programmer to assist in developing
neural-network simulations. The research involves integrated
experimental and theoretical analyses of the cognitive and neural bases
of learning and memory. The focus of this research is on understanding
the underlying neurobiological mechanisms for complex learning
behaviors in both animals and humans.
Substantial prior experience and understanding of neural-network
theories and algorithms is required. Applicants should have a high
level of programming experience (C or Pascal), and familiarity with
Macintosh and/or UNIX. Strong English-language communication and
writing skills are essential.
*** This position would be particularly appropriate for a graduating
college senior who seeks "hands-on" research experience prior to
graduate school in the cognitive, neural, or computational sciences ***
Applications are being accepted now for an immediate start-date or for
starting in June or September of this year. NOTE TO N. CALIF.
APPLICANTS: Interviews for applicants from the San Francisco/Silicon
Valley area will be conducted at Stanford in late March. The
Neuroscience Center is located 20 minutes outside of New York City in
northern New Jersey.
For further information, please send an email or hard-copy letter
describing your relevant background, experience, and career goals to:
______________________________________________________________________
Dr. Mark A. Gluck
Center for Molecular & Behavioral Neuroscience
Rutgers University
197 University Ave.
Newark, New Jersey 07102
Phone: (201) 648-1080 (Ext. 3221)
Fax: (201) 648-1272
Email: gluck@pavlov.rutgers.edu
------------------------------
End of Neuron Digest [Volume 11 Issue 15]
*****************************************