.\" Use -mm macros
.ds Ed Stephen R. Walli <stephe@usenix.org>
.ds Wd U\s-3SENIX\s0 Standards Watchdog Committee
.TL
The Five Great Myths of Open Systems Standards
.AF "\*(Ed, Report Editor"
.AU "\*(Wd"
.MT 4
.sp
.P
I recently read a column where the author described computer people
at cocktail parties as the doctors of the '90s.
Instead of everyone wanting to discuss their aches and pains with some
poor medical practitioner who's just trying to sip scotch and nibble
hors d'oeuvres,
computer people are plagued with the latest chat from computer-literate
business people.
.P
No longer are you merely cornered by DOS know-it-alls;
now you get to deal with the sweeping issues of the GUI Wars,
and whether UNIX will displace DOS on the desktop.
Open systems are in vogue.
Standards are ``sexy''.
.P
With all of this comes the new ``Open Systems'' know-it-all.
These are people who can spell POSIX,
but can't pronounce it.
They've all been taken to lunch recently by their favourite
marketing rep from one of those lavish companies whose name is
a regulation three-letter acronym; let's call them TLA for short.
.P
I started discerning certain patterns in all of this idle gossip and chatter,
and now present to you the Five Great Myths of Open Systems Standards:
.P
.HU "Myth #1"
.P
``Vendor TLA IS the standard.''
This is the traditional mix-up between \fIde jure\fP standards
and \fIde facto\fP standards.
Or REAL standards and market share.
De jure standards are built by accredited standards development bodies.
There is a fair process involved
to ensure all points of view are heard.
It is a consensus process,
not a majority one.
.P
De facto standards are mostly under the limited control of a single organization.
They are often trademarked.
If they are available at all outside of their controlling organization,
the technology is often licensed.
The organization holding the rights effectively controls where it takes the
technology.
It may accept input from some form of user constituency,
but ultimately it runs the show.
I look at this as the difference between a POSIX standard interface
and a UNIX operating system.
.P
.HU "Myth #2"
.P
``Vendor TLA is part of the standards development group,
and they're donating this technology to the standard.''
Always a knee-slapper.
As if all it took to make a standard was for a vendor to donate part of its
technology,
obviously out of the goodness of its heart for mankind.
These people have not participated in the excitement of the Threads Wars,
or the current painful GUI Wars.
.P
Many vendors would love to have their specification as a standard.
It gives them an instant product to sell into the hot ``standards''
market.
They just have to get past the rest of the standards working group,
whose members bring various backgrounds and biases to the table.
.P
Then comes a balloting group,
a superset of the working group.
These people haven't necessarily had the benefit of participating in the
discussions that led to a decision.
The growing practice of publishing the rationale for decisions helps
alleviate this problem,
but not always.
There will always be people in a balloting group who know their solution is the
technically correct one.
It's a whole lot easier to disagree with the committee when balloting a draft
you didn't help write
than in working group sessions, where the talking is done face to face.
.P
Other vendors don't \fIwant\fP their technology to become a publicly controlled
standard.
They lose control of their own specification.
If they have a large market share,
i.e., they're a \fIde facto\fP standard,
they may want nothing to do with becoming a \fIde jure\fP standard.
.P
.HU "Myth #3"
.P
``Vendor TLA sells a POSIX conforming system.''
Wrong.
No one sells a ``POSIX'' conforming system.
Indeed, POSIX conformance is the real myth here.
.P
POSIX.3 is the standard that defines the test methodology used to measure
conformance to POSIX.
It was recently approved as IEEE 1003.3-1991.
An accompanying document,
still in the balloting process and therefore unstable,
is POSIX.3.1.
This document contains the actual test methods for POSIX.1
(the base system interface standard),
which everyone refers to as ``POSIX''.
.P
By definition,
POSIX.3.1 is not yet a standard,
hence no standard POSIX.1 conformance test suite actually exists.
.P
There is a United States government procurement profile of POSIX.1 called
FIPS 151-1,
or in today's open systems circus,
simply ``THE FIPS.''
FIPS 151-1 chooses certain options within the standard.
It even defines certain behaviour that the standard leaves
implementation-defined.
It was written against the original POSIX.1 standard,
IEEE 1003.1-1988,
not the current one (IEEE 1003.1-1990).
In fact, it was written before the standard was completed.
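.P
To make the option game concrete,
here's a minimal sketch of my own (not drawn from the FIPS or any
conformance suite) showing how a POSIX.1 application can ask at run time
whether an optional facility, job control in this case, is present.
A profile like FIPS 151-1 simply nails such options down.
.DS I
.ft CW
#include <unistd.h>
#include <stdio.h>

/*
 * sysconf() reports on POSIX.1 options at run time;
 * -1 means the implementation does not provide the option.
 */
int
main(void)
{
    if (sysconf(_SC_JOB_CONTROL) == -1)
        puts("job control: not supported");
    else
        puts("job control: supported");
    return 0;
}
.ft
.DE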
.P
In theory,
nothing changed in POSIX.1 between 1988 and 1990,
except for the reformatting to make it acceptable to ISO,
and ``bug fixes''.
The removal of \fIcuserid()\fP was a ``bug fix''.
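.P
A minimal sketch (my own, assuming a 1003.1-1990 environment) of the
sort of rewrite that particular ``bug fix'' forces:
code that called \fIcuserid()\fP can recover the same name from the
effective user ID instead.
.DS I
.ft CW
#include <sys/types.h>
#include <unistd.h>
#include <pwd.h>
#include <stdio.h>

/*
 * Where 1003.1-1988 code called cuserid(), look up the
 * name associated with the effective user ID instead.
 */
int
main(void)
{
    struct passwd *pw = getpwuid(geteuid());

    if (pw != NULL)
        puts(pw->pw_name);
    return pw == NULL;
}
.ft
.DE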
.P
Because of the obvious buying power of the U.S. government,
most major vendors are implementing FIPS 151-1.
It is a profile or subset of POSIX.1.
.P
Test suites exist to test conformance against FIPS 151-1.
These must use the test methods described in POSIX.3.1 (still in ballot).
One of them was written to an early draft of POSIX.3.1.
Another was written by using the AT&T UNIX System V Verification Suite (SVVS)
as a base.
SVVS dependencies are still being discovered and weeded out of this one.
It is quite possible to implement something different from the FIPS,
which would fail the FIPS test suites miserably,
yet would technically conform to the standard.
(If only there were a way to prove it.)
.P
.HU "Myth #4"
.P
``POSIX isn't important \(em it's source code portability that's important.''
Well, no and yes.
One vendor is notorious for this game.
.P
Yes, absolutely, source code portability is what it's all about.
This is typically one of the banners that's waved around
in many people's definitions of open systems.
.P
POSIX is a family of standards designed to provide source code portability.
The interface was derived from the many UNIX system interfaces that existed.
UNIX was, and is, a \fIde facto\fP standard operating system in many arenas.
Many vendors are implementing the POSIX interface on proprietary
operating systems that are not UNIX derivatives.
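.P
By way of illustration,
here's a trivial sketch of my own of a program written strictly to the
POSIX.1 interface;
it should compile and run unchanged on any implementation of the
standard, UNIX-derived or not.
.DS I
.ft CW
#include <sys/types.h>
#include <unistd.h>

/*
 * Copy standard input to standard output
 * using only POSIX.1 interfaces.
 */
int
main(void)
{
    char buf[1024];
    ssize_t n;

    while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0)
        if (write(STDOUT_FILENO, buf, (size_t)n) != n)
            return 1;
    return n < 0 ? 1 : 0;
}
.ft
.DE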
.P
No, POSIX is not UNIX.
Many UNIX developers mourn and despise what has happened to the UNIX
interface.
They shouldn't.
First of all,
the base technology,
close enough to what they already know,
is becoming available on a huge installed base of machines.
The demand will far outstrip the supply of technologists familiar with it.
Second, nothing prevents them from continuing on in their current
preferred environment,
developing software as they always have.
It's just not as portable.
.P
There are other software development environments which ensure
software portability.
VMS on a VAX architecture guarantees portability of source (and executables)
across the entire line of VAX hardware.
This is fine if that's where your business lies.
Likewise, IBM's SAA will provide similar source portability benefits across
disparate IBM architectures.
IBM is really muddying the waters by also implementing some of the other
``open system'' interfaces on the SAA platforms.
Again, it all depends where you as a software developer want to draw the
portability line.
POSIX is becoming the path to widest portability.
.P
.HU "Myth #5"
.P
``Open systems technologies will revolutionize the way software is
developed.''
Yet another silver-bullet contestant.
Remember the marketing hype around 4GLs? CASE?
These are all good, useful technologies.
They simply need to be applied in their proper forum.
They do not remove the responsibility of thought,
i.e., creative design, careful development, and inventive testing
of a problem's solution.
.P
The current ``promise'' of open systems technologies has us living
in a completely networked corporation of resources.
Applications running wherever the optimal processing resource is.
Information available everywhere at once,
both properly protected and with its true location completely irrelevant.
All of it interfaced via some wonderful, intuitive graphical user interface.
.P
I do believe this is where we're going.
The technology is often commercially available already,
but with some very real constraints on it.
Often these constraints involve how new the technology is,
and the lack of standardization.
.P
It is a great vision,
but before it's available in completely heterogeneous networked environments,
the technology has to stabilize enough for standards to be created.
No matter how dazzling the technology seems to be,
a standard cannot be wrestled onto it too early,
or it becomes a straitjacket on the creative forces shaping it.
.P
Networked system administration at this level is in its infancy.
A corporation's information and application architecture is often
weighed down by a heavy history of legacy systems.
(That's if the corporation can even draw its architectures!)
These are a couple of the ``minor'' problems that need to be dealt with
before marketing sells the ``promise'' too fully.
.P
.HU "Conclusions"
.P
So there they are.
My five favourite myths of open systems standards.
I'm sure this is just the beginning.
(I don't get to a lot of cocktail parties. I have small children.)
.P
I'd love to hear other additions to this.
No matter how outrageous.