Towards an Astrophysical Cyberspace: The Evolution of User Interfaces
This is a preprint and is subject to change.
The completed paper will be published in the
ADASS Conference Proceedings.
Our thesis is that trends in
graphical user interfaces (e.g. NCSA's Mosaic),
in networks & distributed computing
(e.g. information discovery systems like gopher & WAIS,
& hypermedia networks like the WWW),
& in virtual reality technology
will synergize into a powerful astrophysics environment.
In sum, then, it would be unwise to ignore the design of cyberspace
itself while we are engaged in the myriad considerations of particular
GUI and VR implementations. The design of cyberspace is, after all, the
design of another life-world, a parallel universe, offering the
intoxicating prospect of actually fulfilling -- with a technology very
nearly achieved -- a dream thousands of years old: the dream of
transcending the physical world, fully alive, at will, to dwell in some
Beyond -- to be empowered or enlightened there, alone or with others,
and to return.
Although the `desktop' metaphor has served tolerably well
for working with the 2d world of workstation/PC screens,
it's certainly not very appropriate for interacting with 3d worlds.
Recent research in human-computer interaction has spurred the
development of new peripherals for interacting with such worlds.
Instead of keyboard & mouse input, use is made of voice,
& of hand gesture & manipulation;
output is to displays aimed at creating the illusion of 3d,
simulating operator presence inside these worlds.
We are accustomed to thinking of user interfaces in terms of screens,
& relatively local applications.
Our user conducts a dialog with our application through the mechanisms
we provide; at their simplest,
these are just reads & writes built into our
chosen programming language.
With the advent of bit-mapped graphical workstations,
seamless network integration,
& several other exciting new
technologies, we are going to witness a profound change in this view.
The user interface mediates in the processes
of data acquisition,
manipulation, navigation, analysis, archiving, visualization,
& in the collaboration with others.
The trend is to increasing transparency,
so that the user interface will become less & less of an obvious
intermediary, & more of a `looking glass' into astrophysical
datascapes & cyberspaces.
The user interface must place the scientist in direct contact
with these processes,
without superimposing its own idiosyncrasies.
To this end,
our traditional view of the user interface as screen, keyboard,
intervening between the user & application will evolve to accommodate
emerging technologies & concepts, especially
graphical user interfaces & GUI builders,
hypertext & multimedia, virtual reality, & cyberspace.
Graphical user interfaces are an improvement over command
languages, but the next generation of user interfaces is on the way.
Future GUI's will be dynamic, spatial, 3-dimensional, virtual,
pervasive, gestural, colorful, frequently auditory,
& sometimes immersive.
GUI builders offer a large repertoire of user interaction devices,
allowing the designer to present information & accept user input
with structures that match application & ergonomic needs most closely.
E.g. many applications have a hierarchical command
structure -- with pull-down menus, you can readily see the overall
command hierarchy. These devices are well-illustrated in
HEASARC's forthcoming StarTrax interface (Richmond et al. 1993).
Cyberspace is a shared information universe:
the network of interacting computers & their data,
& their users --
the World Wide Web, NCSA's Mosaic,
hypermedia, the Internet.
VR & cyberspace are not synonyms.
VR has to do with visualization & manipulation;
cyberspace with navigation & collaboration.
One day, VR may provide the most effective portal to cyberspace.
The World Wide Web,
a distributed hypertext-based information system developed at CERN,
is a globally interconnected network of hypermedia information:
the Internet, +
a protocol for transmission of hypermedia documents, +
a set of servers that respond to requests from browsers
(or clients) for those documents.
Hypermedia (or, more loosely, hypertext) documents
contain hyperlinks to other documents, anywhere on the Web.
A hyperlink is a segment of text, or an inline image, that
refers to another document (text, sound, image, movie) elsewhere on the
Web. When a hyperlink is selected, the referenced document
is fetched from the Internet, and is displayed appropriately.
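The fetch-&-display step just described can be sketched as a short program. The following is an illustrative Python fragment, not part of any browser discussed here; `follow_hyperlink` is a hypothetical helper name, & the media type reported by the server is what tells a client how to display the document.

```python
from urllib.request import urlopen

def follow_hyperlink(url):
    """Fetch the document a hyperlink refers to, & report its media
    type so a browser can display it appropriately (text, image,
    sound, movie). `follow_hyperlink` is an illustrative name only."""
    with urlopen(url) as response:
        media_type = response.headers.get_content_type()  # e.g. 'text/html'
        body = response.read()
    return media_type, body
```

A browser would then dispatch on the media type: render `text/html` inline, hand an image or sound to a suitable viewer, & so on.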
NCSA Mosaic is a multi-platform GUI hypermedia browser that
helps you to find, fetch, & display documents & data from the Internet.
Although there have been browsers for the Web since 1991, Mosaic shot
rapidly to prominence because it seamlessly integrates a great deal
of useful functionality in a very pleasant & easy-to-use user interface.
Incorporating Web, gopher, WAIS, NNTP, & FTP protocols,
NCSA Mosaic talks natively to these servers, & can gateway to others.
Even so, navigating this wealth of distributed information
can be difficult.
As the use of hypermedia expands,
the navigation problems will become more evident.
The spatial & structural connections which may exist inside hypermedia
are not well represented by the desktop metaphor & the associated tools.
New metaphors for hyperspace, e.g. `cities',
will provide users with a uniform visual portrayal of such structures.
The technologies known as `virtual reality' will include text either
directly as documents or
indirectly as menus or organizational structures.
These text objects will be presented in stereoscopic 3D,
due to the nature of the display technologies used.
The Navigating in Information Space project (NCSA)
is mapping information -- textual data -- in three dimensions
for navigation using virtual reality technologies.
The information is processed statistically to extract relations among
terms or documents in a database.
Items which are more closely related are placed closer to each other in
the `information space'.
The user employs the DataGlove or mouse and the NCSA VR lab's
projection screen to display and move through the 3D information space.
Users can reach out and `grab' the documents of interest.
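The statistical processing behind such a layout can be sketched as follows. This is a minimal illustration, not NCSA's actual method: documents are reduced to term counts, cosine similarity measures how related two items are, & a hypothetical `target_distances` helper turns similarity into the separations a 3d layout would try to honour (closely related items get small distances).

```python
from collections import Counter
from math import sqrt

def cosine_similarity(doc_a, doc_b):
    """Similarity of two documents, each given as a list of terms."""
    a, b = Counter(doc_a), Counter(doc_b)
    dot = sum(count * b[term] for term, count in a.items())
    norm = (sqrt(sum(v * v for v in a.values()))
            * sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def target_distances(docs):
    """Pairwise separations for the 3d information space: the more
    closely two items are related, the smaller their target distance."""
    n = len(docs)
    return {(i, j): 1.0 - cosine_similarity(docs[i], docs[j])
            for i in range(n) for j in range(i + 1, n)}
```

A layout routine (e.g. a spring-model relaxation) would then place the items in 3d so that actual separations approximate these targets.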
Immersion refers to the illusion of being inside a computer-generated world.
The user must be persuaded to suspend disbelief in the
reality of what is being observed -- until he/she can ignore the
interface, & concentrate on the application, VR will remain a novelty
rather than a genuine scientific visualization tool.
Navigation refers to the ability to `move around' the virtual world, e.g.
to examine large data sets containing spatial & temporal information.
The user can employ experience to comprehend the data,
by exploring it in the same way as physical space.
This enhances the researcher's perception of its characteristics.
Computational science often studies multi-dimensional physical phenomena;
in these systems, a computer model generates data representing the
behaviour of the model.
VR can help scientists explore the multi-dimensional graphical
representation of their data at various levels of scale.
Spectral-line data cubes can be viewed interactively as
three-dimensional cubes, or even walked through in a virtual reality
system (Norris 1993).
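The two natural cuts through such a cube can be sketched in a few lines. This is only an illustration; the indexing convention `cube[channel][y][x]` & the helper names are assumptions, not any particular package's API.

```python
def channel_map(cube, channel):
    """One velocity channel of a spectral-line data cube: a 2d image
    of the sky. Assumes the indexing convention cube[channel][y][x]."""
    return cube[channel]

def spectrum_at(cube, x, y):
    """The spectrum along the velocity axis at one sky position."""
    return [plane[y][x] for plane in cube]
```

An interactive viewer steps through channel maps to animate the cube, while a VR walk-through renders the whole volume & lets the user probe spectra at positions of interest.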
You can use virtual reality technology to fly through 3 dimensional
environments that represent extremely complex data, &
reach out & manipulate these representations with your hands.
These peripherals engage visual, auditory, tactile, & haptic (muscular)
senses to support user interaction with computer applications & resources.
Commercially available systems support hand-tracking, head-mounted
stereo displays, 3d audio, speech recognition & synthesis.
Research is ongoing on force-feedback, tactile gloves, & eye-tracking.
Head-mounted displays (HMDs),
typically weighing 4 lbs,
are perhaps the most popular (or well-known) VR interface,
comprising a stereo pair of small displays covering the eyes
(e.g. VPL's EyePhone, or `facesucker').
Tracking devices like the Polhemus Isotrack, & Ascension Technology's
Flock of Birds,
allow the computer to determine the location & orientation of a person.
A head-tracking device provides location & orientation of the viewer to
simulate the user's viewpoint.
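The essential computation is a change of coordinates driven by the tracker reading. The sketch below is a simplification, using yaw only (a full system would also use the reported pitch & roll), & its axis convention (x right, y up, -z ahead) is an assumption for illustration.

```python
from math import cos, sin

def world_to_eye(point, head_pos, yaw):
    """Transform a world-space point into the viewer's eye space,
    given the tracker's reported head position & yaw (rotation about
    the vertical y axis, in radians). Simplified: pitch & roll from
    the tracker are ignored here."""
    # translate so the head sits at the origin
    x = point[0] - head_pos[0]
    y = point[1] - head_pos[1]
    z = point[2] - head_pos[2]
    # rotate by -yaw so the view direction lines up with the eye axes
    c, s = cos(-yaw), sin(-yaw)
    return (c * x + s * z, y, -s * x + c * z)
```

Re-running this transform for every rendered point, each time the tracker reports a new pose, is what makes the simulated viewpoint follow the user's head.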
VPL's DataGlove tracks hand & finger movements.
The Binocular Omni Orientation Monitor
(Fake Space Labs)
places small CRT displays (2.5" x 2.5"; 1280x500 color pixels)
before the eyes.
The BOOM is suspended from an articulated arm,
which measures precisely its position & orientation in space,
& counterbalances its mass -- when you let go it remains there.
The user looks into it & moves it by handles.
The Harvard-Smithsonian Center for Astrophysics
& NCSA have used virtual reality to explore astronomical data in
The Redshift Survey.
Their principal objective was to map the positions of all the galaxies
within about 500 million light years. Seeing all the data on a 2d medium is almost
meaningless. But looked at in VR, patterns start to become clear.
The researchers (Margaret Geller & John Huchra) used a BOOM connected
to a high-power graphics workstation. This enabled them to view
the galaxy structures from any angle or perspective, giving insight
into the structural layout & placement of the galaxies.
This method of visualization significantly enhances the
traditional methods of statistical analysis (Aukstakalnis 1992).
The Electronic Visualization Laboratory (EVL)
at the University of Illinois at Chicago
also collaborates with the NCSA on a
number of scientific visualization projects, e.g. in cosmology.
The CAVE is a room built from large displays on which the graphics are
projected on three walls & the floor, to be viewed with stereo glasses.
As a viewer wearing
a head & hand tracking system
moves inside its display area,
the stereo perspectives of the
environment are updated,
& the image moves with & encloses the viewer.
The Cosmic Explorer, a CAVE application, is a research
tool for exploring the stages of the evolution of the Universe.
It visualizes the result of numerical simulations to allow the
exploration of the formation of the universe, astrophysical jets,
& colliding galaxies.
Thanks to Nick White for encouragement & the opportunity to speculate a little.
- Aukstakalnis, S., & Blatner, D. 1992,
Silicon Mirage: The Art & Science of Virtual Reality,
Peachpit Press Inc.
- Benedikt, M., ed., 1992,
Cyberspace: First Steps,
- Hardin, J. 1993,
Human Collaboration Technologies for the Internet --
NCSA Mosaic & NCSA Collage,
- Norris, R. P., 1993,
The Challenge of Astronomical Visualisation,
- Richmond, A., et al. 1993,
StarTrax -- The Next Generation User Interface,