Talks
Visualization-driven science
By: Miguel A. Aragon-Calvo
Abstract: Data visualization is a great tool for presenting scientific results to colleagues and the general public, and it can also serve as a driver for scientific discovery. By combining state-of-the-art visualization techniques with faster graphics hardware, experimental technologies such as 3D printing, natural interfaces, and virtual reality, as well as recent advances in artificial intelligence, we can gain insight into complex processes that resist traditional approaches. Visualization-driven science exploits the unmatched ability of our brain to identify complex patterns and find new relations in the data, given a proper data representation. I will show several examples of how visualization-driven science can lead to often unexpected discoveries, along with the technology developed for this purpose.
Aesthetics & Astronomy
By: Kimberly Kowal Arcand
Abstract: Images have the potential to convey powerful messages to the public. In astronomy, images of space are particularly important as they can serve as powerful tools of communication, education, and inspiration. However, the public’s understanding and trust in the veracity of these images is paramount. In an age of easy manipulation of images, and frequent separation of the image from its context, the question arises as to what is perceived as “real” by those viewing images for science communications. Research that seeks to study the concept of "image as truth" in astronomy and related science communications, which also affects representation of such data in educational environments, is a critical component of any forward-looking strategic plan or vision. We argue it is essential in understanding how the astronomical community can use these images as vehicles to deliver the concepts of scientific discovery and the associated excitement to the widest possible audiences.
Visualising hydrodynamics: 2D, 3D and VR
By: Pawel Biernacki
Abstract: Hydrodynamical simulations provide rich datasets, but can they be easily visualised? I will describe the development of movie capabilities in the hydrodynamical grid code RAMSES, discuss the capabilities and limitations of these in-house, in-code efforts, and present a few example results.
Audiovisual standards for astroviz
By: Lars Lindberg Christensen
Abstract: The astroviz community relies on collaboration and sharing. Technical incompatibilities, however, constitute boundaries. To improve this situation, standards for various types of audiovisual assets have been defined and implemented:
- Astronomy Visualization Metadata (AVM) standard
- Data2Dome standard
- Fulldome Master Show File IPS Standard Adoption
- IMERSA AFDI Dome Master metadata standard
Co-Authors: Robert Hurt, Mark Subbarao
Learning at a Glance
By: Nickolas Conant
Abstract: Certain topics in astronomy are taught conceptually and audiences are left to imagine the results. The digital era has provided sophisticated ways to teach these mentally burdensome principles. Now, we can show audiences and students of astronomy how stellar parallax, proper motion, and other previously tedious subjects work on astronomical scales. With a single glance, entire enigmatic concepts in astronomy can be learned.
Co-Authors: Dr. John Keller
Exploring Planetary Surfaces with NASA's Solar System Treks
By: Brian Hamilton Day
Abstract: NASA's Solar System Treks Project provides a suite of online, web-based visualization and analysis portals. These portals allow planetary scientists, mission planners, students, and the general public to explore the surfaces of a growing number of planetary bodies as seen through the eyes of many different instruments aboard a wide range of spacecraft. Visualizations and analyses generated with these portals can be used to tell diverse stories, including anticipated views and waypoints along planned traverses, examining and comparing surface landforms, and investigating dynamic processes altering planetary surfaces. In this talk, we'll present some example short stories with visualizations generated using Solar System Treks.
Co-Authors: Emily Law
When 2D isn’t enough and 3D is too much
By: Joseph DePasquale
Abstract: This year marks twenty-eight years since the launch of the Hubble Space Telescope. In recognition of this achievement, and in keeping with past tradition, the Office of Public Outreach at the Space Telescope Science Institute assembled an impressive package of visualization materials based on recent observations of the Lagoon Nebula made with HST’s Wide Field Camera 3 in both visible and near-infrared light. In preparing these materials, and also in keeping with past tradition, we had set a goal of producing a 3D “fly-through” video, but on a much shorter time-scale than we’ve had in the past. Given the time constraints and the nature of the source data, a full three-dimensional volumetric rendering of the Lagoon was not possible. We opted instead for a 3D decoupage (or as we like to think of it “2.5D”) zoom and pan treatment of the visible light image. I will discuss our thought processes in creating this visualization as well as highlight key decisions made in an effort to balance scientific integrity with a captivating visualization.
Co-Authors: G. Bacon, D. Player, F. Summers, Z. Levay
OpenSpace: A new NASA supported interactive viz engine for networking and STEM
By: Carter Emmart
Abstract: OpenSpace is a NASA SMD funded open-source interactive software development effort aimed toward planetariums, classrooms, and personal use. Its focus is to visualize science for STEM needs, showing current data but also how it is gathered. Beginning with the American Museum of Natural History (AMNH) in collaboration with Sweden’s Linkoping University, this effort now involves the University of Utah and New York University. At its core, it is a visualization engine for the AMNH Digital Universe 3D Atlas, NASA NAIF-SPICE mission articulation coupled with image projection, GDAL-based planetary globe browsing with multiple levels of terrain detail including rover-based surface exploration, and space physics volumetric and field-line rendering. OpenSpace supports 3D stereo, streaming to YouTube360, and multiple display synchronization with remote networking between multiple sites. Future work intends coordination with the AAS-supported WorldWide Telescope and ESA Sky by use of embedded HTML windowing.
Co-Authors: Carter Emmart, Rosamond Kinzler, Denton Ebel, Mordecai-Mark Mac Low, Vivian Trakinski (AMNH); Anders Ynnerman, Emil Axelsson, Kalle Bladin, Eric Broberg, Michal Marcinkowski (Linkoping University); Alexander Bock, Jonathas Costa, Claudio Silva (Tandon School of Engineering, New York University); Charles Hansen, Gene Payne, Matthew Territo (Scientific Computing and Imaging Institute, University of Utah); Maria Kuznetsova, Leila Mays, Asher Pembroke (NASA GSFC)
Visualizing Gaia DR2 Data
By: Jacqueline K Faherty
Abstract: On April 25, 2018, the European Space Agency (ESA) released the second catalog of the Gaia mission. Contained in these data are nearly 1.4 billion parallaxes and proper motions, over 7 million radial velocities, photometric data in Gaia’s three bands (G, BP, and RP), variability information, and effective temperatures for a subset of objects. The Gaia results provide a unique opportunity for astronomers and data visualizers. Stellar positions and velocities enable us to map the Milky Way and examine the dynamics of stellar streams, co-moving companions, hypervelocity stars, nearby moving groups, and solar system encounters. From a visualization perspective, real-time rendering of Gaia data is a challenge. In this contributed presentation, we will show the results of our visualization efforts with the Gaia catalog at the American Museum of Natural History. The visuals generated for this talk isolate scientifically rich data and stories, which can lead to scientific discovery and will illuminate Gaia data for the general public.
Co-Authors: Jackie Faherty (AMNH), Brian Abbott (AMNH), Nathan Leigh (AMNH)
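As a minimal illustration of the basic relation underlying 3D maps built from Gaia parallaxes (a sketch only; the function name is illustrative and not part of any Gaia toolchain, and real Gaia analyses must handle parallax uncertainties carefully):

```python
def parallax_to_distance_pc(parallax_mas):
    """Convert a parallax in milliarcseconds to a distance in parsecs.

    Uses the textbook relation d [pc] = 1000 / parallax [mas].
    """
    if parallax_mas <= 0:
        raise ValueError("non-positive parallax cannot be inverted")
    return 1000.0 / parallax_mas

# A star with a 10 mas parallax lies at 100 pc.
print(parallax_to_distance_pc(10.0))  # → 100.0
```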
WorldWide Telescope at the AAS
By: Jonathan E Fay
Abstract: WorldWide Telescope is a multi-platform astronomy, Earth, and space visualization system that runs everywhere from a WebGL browser to immersive VR and full-dome projection. It is free and open source, and it has tools that allow sharing of image visualizations and animations through a built-in authoring and playback system.
Originally created and funded by Microsoft Research, WWT now calls the American Astronomical Society home.
Co-Authors: Ron Gilchrist, Phil Rosenfield
Montage Image Processing and Visualization
By: John Conrad Good
Abstract: The Montage toolkit has, since 2003, been used to aggregate FITS images into mosaics for science analysis. It is now finding further application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built using standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts and downsampled versions of large images that can then be visualized on desktops or in browsers; it includes a fast reprojection algorithm intended specifically for visualization; and it resamples and reprojects images to a common grid for subsequent multi-wavelength visualization.
All projections are supported, including HEALPix and TOAST (WWT), and there are even tools for building WWT pyramids. For very large areas, tiling is supported while maintaining uniform backgrounds and scaling. Full-color composites can be made across disparate image sets, and the visualization engine supports various overlays, drawn on the sky rather than in pixel space. Montage is available both as a library and as a set of command-line tools and has been used extensively on everything from laptops to parallel cloud workflows. It runs on Linux, Mac OS X, and Windows, and as a Python binary extension package. Montage was designed with science research in mind but has also been used for mission planning, pipeline mission data reduction, all-sky planetarium presentations and science center murals, and as an exemplar application for high-performance computing.
Co-Authors: G. Bruce Berriman
Firefly: An Interactive Web-Based Particle Visualization
By: Alex Gurvich
Abstract: Web-based visualization applications are powerful tools for reaching wide audiences both within and outside the scientific community. Whereas installation of visualization software can present a barrier for some users, the accessibility of web applications makes them immediately approachable and greatly enhances the user experience. Interactive visualizations capture the attention of their users and give powerful intuition about trends in the data. When designed well, they guide users through the important aspects of the data by offering a toolset that highlights the important qualities. WebGL provides a framework for three-dimensional scene rendering within the browser with built-in cross-platform compatibility. Modern browsers provide implementations of WebGL that allow the same code to be used without worrying about platform-dependent software. As datasets grow in size, both in volume and in hard-drive space, as is the case with some modern numerical simulation data, it becomes untenable to share datasets in their entirety. Web-based interactive visualizations can be used to explore, filter, and extract information from datasets that are too large to share through traditional means, enabling analysis and collaboration. We present Firefly, a web-based interactive visualization, built with WebGL and Javascript, that allows users to fly through 3D particle data, customized with features suited for the FIRE galaxy formation simulation suite. Because at its core it is a webpage, its interface and toolset can be easily changed with basic knowledge of HTML and CSS, allowing it to be easily refactored for different audiences. As scientific datasets and the prevalence of the internet continue to grow, web-based visualizations will become a dominant force in the years to come.
Co-Authors: Aaron Geller
yt: A Toolkit for Astronomical Data Visualization
By: Cameron Hummels
Abstract: yt is a mature, open-source, python-based toolkit for analysis and visualization. Originally designed by researchers working with 3D astronomical simulation datasets, it has now grown to work with observational data and support for communities beyond astronomy. Of particular interest to the visualization community is the ability to work directly with scientific data and produce renderings and movies appropriate for flat screens, stereoscopic setups, and planetarium domes. I will summarize yt's capabilities and demonstrate some of its high-level viz products.
Co-Authors: Matt Turk, Nathan Goldbaum
Adapting TRAPPIST-1 for VR: Lessons We Are Learning
By: Robert Hurt
Abstract: The discovery of 7 Earth-sized worlds in the nearby TRAPPIST-1 system has captured the public imagination in a way that few science results have. Having developed a variety of visual assets to support the press releases on this system, it seemed natural to adapt them as a virtual reality experience. The development of this tour has led us to examine a great number of intriguing questions on just how VR can be employed effectively for science education/engagement/inspiration. TRAPPIST-1 is almost as interesting as a VR case study as it is a system of exoplanets.
Co-Authors: Keith Miller, Tim Pyle, Janice Lee, Gordon Squires (Caltech/IPAC)
Visualizing Webb Concepts: from Process to Product
By: Leah Hustak
Abstract: Visualizing abstract concepts and complex ideas in a clear and concise manner makes them more accessible to the public and the scientific community. This process involves iteration, communication, understanding, and cooperation between writers, artists, educators, and scientists to create visual assets that tell a story. Through this creative process, a story can be told on a scientific topic that is accurate, visually compelling, and less intimidating to the viewer.
During this presentation, I will share the process of creating a product about the James Webb Space Telescope’s field of regard from the perspective of an artist, and explain how a team works together to create a work about a topic that is hard to visualize without a point of reference and the technical details it involves. I will also share how a project goes from a script, through storyboards, draft animations, and voice-overs to a final product, and techniques to allow for information scaffolding without breaking the visual flow.
Touching the Stars: Lessons learned in 3D printing parts of the high energy Universe
By: April J Jubett
Abstract: Scientists, amateur astronomers, and communicators have been using Chandra's two-dimensional images, along with Chandra videos and podcasts, as an aid in discussing X-ray astronomy with non-experts since Chandra's launch in 1999. With new research techniques, however, we have begun mapping X-ray data (along with other multiwavelength data) of astronomical objects in three dimensions. Holding a 3D print can be an invaluable tool to learn about otherwise unreachable phenomena and data for people of all ages and of varying interests and abilities. In this talk, we will detail our lessons learned in 3D printing, from best practices for reaching blind and visually impaired audiences, to technical advice on print resolution and size, to our new adventures in powder-based printing in full color.
Co-Authors: April Jubett, Nancy Wolk, Kimberly Arcand
VR Experience with Solar System Treks
By: Emily Law
Abstract: The Solar System Treks web portals allow users to enjoy an immersive virtual reality experience of traversing surface paths on planetary bodies. We will present the design and usage of this feature.
ViewSpace: Leveraging NASA Visualizations to Tell Compelling New Stories in Informal Learning Venues
By: Brandon Lawton
Abstract: NASA's Universe of Learning creates and delivers science-driven, audience-driven, learning-driven resources and experiences designed to engage and immerse learners of all ages and backgrounds in exploring the universe for themselves. The project is the result of a unique partnership between the Space Telescope Science Institute, Caltech/IPAC, Jet Propulsion Laboratory, Smithsonian Astrophysical Observatory, and Sonoma State University, and is one of 27 competitively-selected cooperative agreements within the NASA Science Mission Directorate Science Activation program. The NASA's Universe of Learning team draws upon cutting-edge science; works closely with Subject Matter Experts (scientists and engineers) from across the NASA Astrophysics Physics of the Cosmos, Cosmic Origins, and Exoplanet Exploration themes; and includes experts in education, visualization, writing, and design. As one example, NASA’s Universe of Learning program is uniquely able to empower informal learning venues with a direct connection to the science visualizations of NASA via the ViewSpace platform, which enables science learning and aims to advance scientific literacy via visual storytelling.
ViewSpace is a modular multimedia exhibit where people explore the latest discoveries, and the scientific process, in our quest to understand the universe. Hours of awe-inspiring video content connect users’ lives with an understanding of our planet and the wonders of the universe. The ViewSpace shows provide access to cutting-edge discoveries and imagery across astronomy and earth science. This experience is rooted in principles of informal learning, and each ViewSpace show is designed with specific learning objectives related to NASA Earth and space science themes, content, and how we know what we know. Scientists and educators are intimately involved in the production of ViewSpace material. ViewSpace engages visitors of varying backgrounds and experience at museums, science centers, planetariums, and libraries across the United States. In addition to creating content, the Universe of Learning team is updating the ViewSpace platform to provide for additional functionality, including the introduction of digital interactives to make ViewSpace a multi-modal learning experience.
During this presentation we will share the ViewSpace platform, explain the process by which we create new stories within ViewSpace by starting with learning goals and then leveraging visualizations and images created by NASA Astrophysics missions, and how a team of designers, scientists, educators, and writers work closely together to make sure that the stories we tell meet the needs of informal learning venues and their audiences.
Co-Authors: Timothy Rhue II (STScI), John Godfrey (STScI), Denise A. Smith (STScI), Gordon Squires (Caltech/IPAC), Anya Biferno (JPL), Kathleen Lestition (Smithsonian Astrophysical Observatory), Lynn Cominsky (Sonoma State University)
Data for storytelling in music
By: Shane A Myrbeck
Abstract: Abstract storytelling has long been part of instrumental music. Since the “program music” of the Romantic Era, we have taken the intellectual leap of faith that instrumental music can have an explicit narrative behind it. Data can be this narrative. The process of interpreting/sonifying select data sets can reveal patterns of storytelling previously unrevealed to our musical understanding. This talk will cover several projects that explore this, including the Orbit Pavilion at the Huntington Gardens. This piece spatially tracks several JPL satellite missions over a hemispherical speaker system that allows you to hear where they are located in real time. The composition evokes their missions, studying our land, sky and seas, while the form of the piece reveals their physical presence.
NASA's Museum Alliance
By: Jeffrey Nee
Abstract: NASA's Museum Alliance is a private, members-only site for museum professionals and NASA visualizers to find collaborators/distributors, share working assets, and get feedback. Come discuss how the Alliance can better serve you and the needs of your products and missions.
Public Conceptions and Misconceptions Regarding Astronomical Images
By: Travis Rector
Abstract: When presenting astronomical imagery to the public it is important to consider their preconceived notions and concerns. These include questions regarding color, misunderstandings about how telescopes work, as well as some skepticism regarding the veracity of the images. When preparing images for public outreach it is important to be aware of the public's expectations and their level of understanding. In this talk I will describe common misconceptions and expectations that people often have. I will also describe effective methods for addressing these concerns so as to improve comprehension of and confidence in astronomical imagery and the associated science.
Co-Authors: Kimberly Arcand, Megan Watzke
Don’t Forget to Have Fun!
By: Morgan Rehnberg
Abstract: As computers become more powerful, it’s tempting to make ever more complex visuals with that capability. But will that help audiences understand our message? In this talk, I’ll share a few examples of how we used visuals to simplify our story in a recent production at the Noble Planetarium. By focusing on concepts over reality, we believe the end result is a broader reach and a more memorable experience.
Getting our virtual arms around virtual reality
By: Douglas Roberts
Abstract: Virtual reality (VR) is now part of an alphabet soup (VR/AR/MR) of technologies that people all over the world are exposed to via inexpensive platforms such as Google Cardboard VR and phone-based augmented reality (AR). Content creators need to know how to create mixed-reality content and where to publish it. Additionally, planetariums, science centers, and schools need to understand how to use these technologies to present that content to large, potentially heterogeneous populations. I will discuss work we have done at the Fort Worth Museum of Science and History to bring these technologies to our guests and identify the most critical challenges we have faced.
Co-Authors: Morgan Rehnberg
SYSTEM Sounds - The Universe Through New Eyes, and With New Ears
By: Matt Russo
Abstract: SYSTEM Sounds is a science-art outreach project that translates the rhythms and harmonies of the cosmos into music and sound. From the remarkable resonances of TRAPPIST-1 to the wild waves in Saturn's rings, our work has reached a large audience who are eager to explore the universe through sound. In this talk I'll share some of our creations and describe how we're using sonification to help make astronomy more accessible for the visually impaired.
Co-Authors: Andrew Santaguida, Dan Tamayo
Astronomy Data Sonification
By: Greg Salvesen
Abstract: Data sonification — the use of sound to portray data — is a multi-dimensional medium that can complement traditional visual representations of a dataset and allow people with visual impairments to experience the excitement of astronomy. In this talk, I will show off a new website called Astronomy Sound of the Month (AstroSoM.com), highlight some astronomy data sonifications, and solicit collaborations with the astronomy visualization community.
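As a toy illustration of the kind of data-to-sound mapping sonification relies on (this is a hypothetical sketch, not code from AstroSoM; the function name and the two-octave 220–880 Hz range are arbitrary choices for the example), data values can be mapped linearly onto audio frequencies:

```python
def sonify_values(values, f_min=220.0, f_max=880.0):
    """Map data values linearly onto audio frequencies in Hz.

    The smallest value maps to f_min and the largest to f_max;
    real sonifications tune such mappings to each dataset.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

print(sonify_values([0.0, 0.5, 1.0]))  # → [220.0, 550.0, 880.0]
```

The resulting frequencies could then be fed to any synthesis backend to produce audible tones.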
Reimagining Astropix
By: Gordon Squires
Abstract: The "About" section of the Astropix web interface (astropix.ipac.caltech.edu) reads: "Welcome to a new 'one-stop shopping' experience that makes finding the right astronomy image easier than ever! AstroPix offers access to the public image galleries of many of the leading astronomical observatories under a single unified interface. Assets will include a full range of astronomical observations, artwork, and charts spanning the field of astronomy."
To me, this raises the question: "what is Astropix?" When we first imagined it, many years ago, there were dreams of being "the Google of astronomy imagery." It would solve the problem of searching for "M16" and finding pages of images of automatic weapons. It would feed WWT, planetarium shows, and app development. It would become the authoritative aggregator of public astronomy imagery.
But has it?
In this discussion, I would like to reexamine our vision for Astropix. While I believe we remain committed to the concept, and remain convinced it is needed and valuable, I'd like to collectively consider what Astropix is, what it could be, and what it should be.
Co-Authors: Robert Hurt (Caltech/IPAC)
3D modeling and visualization with SHAPE
By: Wolfgang Steffen
Abstract: SHAPE is an interactive astrophysical 3D laboratory that works in a similar way to well-known 3D animation packages such as Blender, 3D Studio Max, or Maya. It uses 3D meshes and compressible hydrodynamics to generate images, 2D and 3D spectra, light curves, and other types of output that can be compared with astrophysical observations. The software is not only suitable for astrophysical research; its photorealistic rendering allows its application for outreach as well. We will show a few samples of research and outreach applications and a short live demonstration of the software's workflow.
Co-Authors: Nico Koning, University of Calgary, Canada
User Interface and User Experience in Virtual Reality: Changing Perspectives
By: Bryan Eric Stephenson
Abstract: Developing for Virtual Reality requires the knowledge and understanding of many different media types; and some of the same techniques used by traditional industry professionals can also benefit Virtual Reality developers. However, there are several factors that completely change the way we need to think about user experiences and user interfaces in an immersive environment. I will be discussing similarities and differences in UI and UX from other industries as they relate to Virtual Reality development. We will also take a look at some of the aspects of Virtual Reality that are new and require a completely different perspective.
Cosmic Pointillism
By: Frank Summers
Abstract: In the late 1800s, Georges Seurat was a leader of the particularly masochistic painting style called "pointillism". Drawing immense numbers of individual points on vast canvases, he created masterworks such as the 60-square-foot painting "A Sunday Afternoon on the Island of La Grande Jatte".
Today, with the help of computer graphics, drawing individual points is much easier and, for astronomy visualization, can provide a diverse range of uses. From stars to nebulae to galaxies, the individual splat and point cloud techniques can emulate a variety of looks with detail and precision. The use of millions of semi-transparent points allows for both emission and absorption with natural depth and resolution.
Standard general-purpose computer graphics packages can handle billions of particles (pixel-sized sprites), which makes them suitable for modeling astronomical objects at a distance. However, they have not yet provided adequate speed and precision for close-up point clouds. A custom C code, called pointillism, has been slowly developed and refined to handle this realm of astronomy visualization. I will discuss the applications, frustrations, limitations, and consummations of using this code for educational, press release, documentary, IMAX, planetarium, and virtual reality film sequences.
Co-Authors: Greg Bacon, Zolt Levay, Joseph DePasquale
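A minimal sketch of the emission-and-absorption idea the abstract describes, using back-to-front "over" compositing of semi-transparent points (this is an illustrative toy in Python, not the pointillism C code itself):

```python
def composite_points(points, background=0.0):
    """Back-to-front 'over' compositing of semi-transparent points.

    Each point is an (emission, alpha) pair, with points sorted
    far-to-near, so closer points both emit light and absorb a
    fraction (alpha) of the light arriving from behind them.
    """
    value = background
    for emission, alpha in points:
        value = emission * alpha + value * (1.0 - alpha)
    return value

# A fully opaque near point hides everything behind it.
print(composite_points([(0.2, 0.5), (1.0, 1.0)]))  # → 1.0
```

Summing millions of such low-alpha contributions is what yields the natural depth cueing described above.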
STEM Stories: A Cognitive, Humanistic Perspective on Visual Storytelling
By: Michelle Viotti
Abstract: Effective visual storytelling depends on knowing how the human brain processes information and makes sense and meaning from static, dynamic, and immersive representations of facts and phenomena. The power of storytelling in memory and learning is also well documented, with recent neurobiological studies offering brain-based evidence of its effects. Combining "show" AND "tell" in the creation of visual STEM stories can contribute not only to STEM literacy and visual literacy, but also a return to wonder, curiosity, and deeper connections with the natural world.
Roundtable/Interactive
X-Particles - A powerful particle system for Cinema 4D
By: Luis Miguel Gonçalves Calçada
X-Particles is an advanced particle-system plugin for Cinema 4D featuring powerful data-import options. It can be used to quickly import, display, and render tens of thousands of data points from CSV files. Data can include position, rotation, velocity, color, and more.
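A minimal sketch, using only Python's standard library, of generating a point-data CSV of the kind such importers consume (the column names and file name here are illustrative assumptions; consult the X-Particles documentation for its expected layout):

```python
import csv
import math

# Illustrative columns: position, velocity, and a grayscale color.
header = ["x", "y", "z", "vx", "vy", "vz", "color"]

rows = []
n = 1000
for i in range(n):
    angle = 2.0 * math.pi * i / n
    rows.append([math.cos(angle), math.sin(angle), 0.0,   # ring position
                 -math.sin(angle), math.cos(angle), 0.0,  # tangential velocity
                 i / (n - 1)])                            # color ramp 0..1

with open("particles.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(rows)
```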
Space Engine - Free Universe Simulator and data visualization tool
By: Luis Miguel Gonçalves Calçada
Space Engine is a free, versatile universe simulator developed by Vladimir Romanyuk. It is capable of flat, fisheye, stereoscopic, and VR rendering. It features both real database objects (Hipparcos and soon Gaia, exoplanets, NEOs, etc.) and infinite procedurally generated ones. Additionally, objects and datasets can be imported, and systems with realistic orbital elements can be created. The fulldome version is currently being developed in collaboration with ESO.
Cycles - GPU renderer for Cinema 4D with fulldome capabilities
By: Luis Miguel Gonçalves Calçada
Cycles is an unbiased GPU renderer for Cinema 4D, which provides near-realtime rendering capabilities. Cycles features a fulldome camera allowing for an interactive fisheye view inside Cinema4D.
The Community-driven Planetarium Content Archives at ESO and ESA/Hubble
By: Lars Lindberg Christensen
ESO’s Education and Public Outreach Department (ePOD) has ramped up its production of planetarium content for our own use in the ESO Supernova Planetarium & Visitor Centre (supernova.eso.org). The planetarium community has shown considerable support for ESO’s and ESA/Hubble’s free online distribution of fulldome clips and shows. Many members of the community have offered materials for distribution, and the archives have grown to almost 500 individual clips and shows. The archives on eso.org and spacetelescope.org have been expanded to also host VR movies for download, as well as an extensive music archive and more. The free material is especially important for planetariums without large licensing budgets.
The ESO Supernova, status after opening
By: Lars Lindberg Christensen
In 2013 ESO was presented with an interesting dilemma: a donation of a world-class planetarium and science centre, but without the corresponding funding to operate it, just 250,000 euros/year from ESO. An inventive operational model was set up that integrated the centre into ESO outreach and streamlined all processes to enable operation with minimal manpower (2 extra staff). I will tell the story and provide a status overview of how this all went after the official opening.
OpenSpace: Exploring the Universe of Astronomy Datasets
By: Carter Emmart
Hands-on demo of the many OpenSpace datasets.
Firefly: An interactive and web-based 3d visualization tool
By: Alex Gurvich
Firefly is a portable web-based 3D visualization platform for any 3D data (either coordinate or phase space) that requires little to no installation: all you need is a working Python installation; clone the GitHub repository and you're done, as it will work out of the box on any platform. The secret sauce is that the computation and visualization are all done in a web browser (so we let Chrome/Firefox/Safari deal with the compatibility issues). This has the added benefit of letting you host your customized version of Firefly on the web (https://ageller.github.io/Firefly/) for anyone to access. You can also embed the Firefly canvas within a Jupyter notebook (for those of you that use them) and get immediate 3D visualization inline.
Blender as a Visualization Tool
By: Kevin Healy
The free animation package Blender includes many features that make it a powerful tool for astronomy visualization. It provides a Python environment with a library to access nearly every property within a scene, along with many methods for importing data and using it to control visual properties such as color, density, or brightness. I will present an example of a 3D supernova simulation rendered in Blender.
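The data-driven property control described above can be sketched in plain Python. The snippet below is an illustrative transfer function (the function name and the blue-to-red ramp are hypothetical, not from the talk); in Blender itself you would assign the resulting color to a scene property, such as a material color, through the bpy module.

```python
# Illustrative sketch: map a normalized data value (e.g., density or
# brightness) onto an RGB color. In Blender, the returned tuple would
# be assigned to a material or object property via the bpy API.
def value_to_rgb(v):
    """Map v in [0, 1] onto a blue-to-red ramp; out-of-range input is clamped."""
    v = min(max(v, 0.0), 1.0)  # clamp to [0, 1]
    return (v, 0.0, 1.0 - v)   # red channel rises with v, blue falls

print(value_to_rgb(0.0))  # (0.0, 0.0, 1.0) -- fully blue
print(value_to_rgb(1.0))  # (1.0, 0.0, 0.0) -- fully red
```

The same pattern extends to any visual property Blender exposes: compute the value from your data, then write it to the scene via Python.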
FITS Liberator: Still Helping Get the Most out of Astro Images
By: Robert Hurt
The FITS Liberator is an application designed for one task: creating high-quality imagery from FITS data by offering exquisite control over dynamic range and contrast. It is still the go-to tool for many of us making public imagery, despite the fact that the code base has not been updated since 2012. The code is, however, open source and with luck could be adopted by developers looking for fun visualization projects that would benefit the art/education/science communities.
Interactive teaching with planetarium visualizations
By: Briana Ingermann
There are many fantastic ways in which visualizations are used in a planetarium to teach audiences, but the vast majority of them are passive. Are there ways to transform this space into a more active learning environment, where audience members have the chance to engage with the visuals and have a hand in creating their own understanding? At Fiske Planetarium, we have received very positive feedback on using our dome capabilities to facilitate active-learning undergraduate labs, and we are interested in developing this model for more topics and broader audiences. We would also like to hear whether other planetariums have started using visualizations to engage audiences in non-standard ways.
Sound Planetarium - Multisensory Displays of Astronomical Data
By: John M. Keller
Fiske Planetarium is partnering with Critical Media Practices Professor Tara Knight at CU Boulder to innovate and develop Sound Planetarium. Bridging the disciplines of media performance, audio installation, big-data sonification, and astrophysics, Sound Planetarium is being designed both as an artistic instrument for orchestrating thousands of sounds that can be mapped to stars and as a tool for articulating data collected through astronomical research. During this round-table discussion, new Planetarium Director John Keller will describe progress on this and other projects recently initiated at Fiske Planetarium.
Cesium - 3D Globe platform
By: Emily Law
Cesium is an open-source JavaScript library for 3D globes and maps. We will discuss why and how Solar System Treks web portals leverage this platform.
Solar System Treks - Interactive Visualization
By: Emily Law
Interactive visualization is one of the most impactful ways to communicate information and a great storytelling tool. It enables intuitive understanding of data in a variety of visualization formats and media (e.g., virtual reality, collaborative visualization), and it empowers users to explore data based on their own inputs and actions. At the show-and-tell session, we will demonstrate the interactive visualization and analytic features of the Solar System Treks web portals, including Moon Trek (https://moontrek.jpl.nasa.gov) and Mars Trek (https://marstrek.jpl.nasa.gov), with a highlight of the VR experience. Co-author: Brian Day
VR Visit to TRAPPIST-1
By: Keith Miller
Experience what you might see were you to visit the incredible TRAPPIST-1 system. Using art assets developed for the recent press releases, we have implemented an accurately scaled fly-through of this system, where the 7 Earth-sized planets orbit so close to one another that they are easily resolved in one another's skies. The VR experience is optimized for public event viewing: it provides a memorable cinematic experience in a linear narrative, avoiding the learning-curve overhead associated with interactive controls.
Interactive, Social Learning in immersive / VR / 360 environments
By: Jeffrey Nee
With the deluge of new online 360 content, how do you leverage those assets and resources to promote genuine, impactful learning in your dome? Come discuss how planetarium education can become even more relevant in this new age of Virtual Reality. We'll explore which established techniques still apply and which new techniques we need to further engage and educate our audiences in the big, wide universe.
How Do We Know a Planetarium is Enhancing Learning?
By: Travis Rector
It would seem obvious that teaching astronomical topics in the fulldome environment naturally enhances learning over a normal classroom, but that is not necessarily the case. A planetarium's ability to do things that are unrealistic (e.g., move time backwards, travel at superluminal speeds) can create misconceptions. When teaching in the dome, what steps should we take to make sure we're not doing more harm than good?
How to expand the reuse of WWT
By: Douglas Roberts
One of the powerful aspects of AAS WorldWide Telescope (WWT) is its ability to present both data (for freeform exploration) and story (for narrative) in the same tool. In addition, the same WWT tour can be presented on the web, in a planetarium, as video, and in virtual reality. WWT also provides a way to share content among users. Given all of the ways that WWT can be and is used, it would be good to talk over the opportunities and obstacles for this re-use model.
Our Musical Universe
By: Matt Russo
Hear about the planetarium show that takes people of all sight levels on a tour of the rhythms and harmonies of the cosmos from the night sky to the edge of the observable universe.
The Hubble Ultra Deep Field in Light and Sound
By: Greg Salvesen
The Hubble Ultra Deep Field is among the most iconic astronomical images of all time. By mapping sound to distance for each of the 10,000 galaxies in this puny patch of sky, I converted a two-dimensional image into a three-dimensional interactive experience, answering one of the most common questions the public asks about astronomical images: "How far away is that?" Move your mouse over any of the 10,000 galaxies in the Hubble Ultra Deep Field image and a note will play telling you how far away it is from us: high notes are nearby, low notes are distant. What's the most distant galaxy you can find?
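The distance-to-pitch mapping described above can be sketched as follows. The distance bounds, frequency range, and logarithmic scaling here are illustrative assumptions for the sketch, not the project's actual values.

```python
import math

# Illustrative sonification mapping: nearer galaxies get higher notes,
# more distant ones lower. All numerical ranges below are hypothetical.
def distance_to_freq(d_mpc, d_min=10.0, d_max=13000.0,
                     f_high=880.0, f_low=110.0):
    """Map a distance (in Mpc) logarithmically onto [f_low, f_high], inverted
    so that small distances produce high frequencies."""
    t = (math.log10(d_mpc) - math.log10(d_min)) / \
        (math.log10(d_max) - math.log10(d_min))
    t = min(max(t, 0.0), 1.0)               # clamp to the chosen range
    return f_high + t * (f_low - f_high)    # t=0 -> high note, t=1 -> low note

print(distance_to_freq(10.0))     # nearest bound -> 880.0 Hz
print(distance_to_freq(13000.0))  # farthest bound -> 110.0 Hz
```

The logarithmic scale keeps the huge spread of distances audible; a linear mapping would crowd almost every galaxy into a narrow band of pitches.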
Make your own space music with SYSTEM Sounds
By: Andrew Santaguida
Try out our set of interactive web applets that let you play the rings of Saturn like harp strings, rev Jupiter's moons like a Ferrari, or compose a new song with the intricate rhythms and heavenly harmonies of TRAPPIST-1.
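One standard trick behind turning orbits into notes (a sketch under stated assumptions, not SYSTEM Sounds' actual code) is to double an orbital frequency by octaves until it lands in the audible range; this preserves the frequency ratios between planets that produce the harmonies. The orbital period used below is approximately TRAPPIST-1b's.

```python
# Illustrative octave-shifting sonification of an orbital period.
def orbit_to_pitch(period_days, low=220.0):
    """Raise an orbital frequency by octaves until it reaches [low, 2*low)."""
    f = 1.0 / (period_days * 86400.0)  # orbital frequency in cycles/second
    octaves = 0
    while f < low:
        f *= 2.0      # each doubling raises the pitch by one octave
        octaves += 1
    return f, octaves

f, n = orbit_to_pitch(1.51)  # TRAPPIST-1b, ~1.51-day orbit
print(f, n)                  # roughly 257 Hz, 25 octaves up
```

Because every planet is shifted by whole octaves, two planets in a 3:2 orbital resonance keep a 3:2 (perfect-fifth-like) pitch relationship after the shift.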
WebbVR: A Virtual Exploration of the James Webb Space Telescope
By: Chad Smith
WebbVR is a fully immersive, interactive experience focused on the James Webb Space Telescope and the science themes that Webb will be investigating. Users can explore and learn about Webb at Lagrange point 2, travel throughout the Solar System, and dive into curated lessons on the science of protoplanetary disks and galaxies. The product is developed in the Unity game engine and is currently slated for release on the HTC Vive, with Oculus support coming shortly.
The interactive virtual astrophysical laboratory SHAPE
By: Wolfgang Steffen
Brief demonstration of the general workflow of interactive model and visualization building in SHAPE.
Domecasting from Adler Planetarium
By: Mark SubbaRao
Planetariums have the opportunity to use new technologies and domecasting techniques to ignite curiosity and to engage broader audiences. Adler Planetarium is working with scientists to develop new domecast events that reveal the process of discovery. Using visualizations, domecasting, and VR, we are connecting with audiences in new ways. Learn how the planetarium community produces, implements, distributes, and receives real-time dome content for public engagement.
VR 360 Movies: Thinking Inside the Box
By: Frank Summers
Virtual Reality offers the opportunity to bring immersive, planetarium-style experiences to home audiences via web-based distribution. The mode of production for VR sequences includes all of the challenges of dome production and more. With full cube map rendering, there is, quite literally, no place for your flaws to hide! This presentation will discuss some complications and solutions encountered during our forays into VR 360 movies, and showcase the results.
AstroViz at the California Academy of Sciences
By: Ryan Jason Wyatt
Astronomy visualization provides the foundation for a variety of programming at the California Academy of Sciences, both inside and outside of Morrison Planetarium. Our planetarium shows include pre-rendered astroviz-based storytelling, but we also offer a variety of planetarium programming—including our longstanding Benjamin Dean Astronomy Lecture Series—that uses real-time visualization software to engage audiences. In our stereoscopic theater and other venues, we take guests on more intimate, presenter-led voyages through data-based visualizations. And in the spirit of reducing, reusing, and recycling, we repackage and reutilize visualizations for online engagement with teachers and social media.