Final Presentations

May 3, 2010

in News

The winning time slot (all but 3 people can make it) is Thursday, May 13th, 10am-noon. I have reserved room 4310 for the presentations (if you haven’t been to 4310 before, it can be tricky to find). It’s much nicer than 1207.

The three people who could not make this time slot will get other arrangements. At least one (probably two) will give presentations on Thursday, May 6th.

Details on what you need to turn in will be discussed in class, and explained here soon.

We’ve made it to the end – 2 last lectures. Make sure you’ve noted your availability for final demos on the doodle poll.

  • Tuesday (May 4) – We’ll talk about some of the projects I am doing (probably molecular motion illustration) as a way of reviewing some of the things we went through this semester. We’ll also talk about the pragmatics of final projects. We’ll also do the official course evaluations. So please come to class to do those (they really do matter).
  • Thursday (May 6) – Final projects are officially due (we’ll talk about unofficial late deadlines on Tuesday in class). We’ll do the first of the project presentations (for people who prefer to do it at this time, rather than during the exam week time slot). And in the time left, we’ll summarize what we’ve done over the course of the semester.

There will be no reading assignments for this week, but you are expected to be reading for your projects. Details about how the projects will be turned in, and what’s expected will be posted soon.

Final Project Presentations

April 29, 2010

in News

Each project team will need to do a 10 minute presentation.

We could do this the last day of class (and run late), or we could do it some time during exam week.

If you don’t want to do it during exam week, you can do it the last day of class.

I have set up a doodle poll. Please specify your availability. Only check the last day of class if you prefer to do things then. Otherwise, select all of the times that you can make a presentation session the following week.

Please fill out the doodle poll ASAP.

The data is interesting. I’m not too keen on the circular design, but the information here is good food for thought:

http://www.informationisbeautiful.net/visualizations/colours-in-cultures/

First, a reminder that status reports for your final project are due on both the 23rd and 30th (both Fridays). It’s better to send them late than never. For a reminder on what’s expected, see the project description page.

Again, this week we’ll let people focus on readings for their project. Expect to be asked to talk about something you’ve read – it’s good practice to be able to explain a paper quickly.

In lecture, we’ll continue to talk about “traditional” scientific visualization.

Description

The project involves comparing climate observations from across North America to downscaled climate predictions for Wisconsin using 15 different climate models.  There are two time spans, 2046-2065 and 2081-2100, for each set of predictions.  The visualization will also take into consideration three different carbon emission scenarios.  The state of Wisconsin will be divided into cells (.1 degree latitude by .1 degree longitude) with an analog displayed somewhere within North America.

Desired outcomes

An interactive map and accompanying geo-visual analytic tool will be produced.  The visualizations will be designed for display online, with the data residing in a database accessible through standard web requests.

Reading List
Shneiderman, B. 1994. Dynamic Queries for Visual Information Seeking. IEEE Softw. 11, 6 (Nov. 1994), 70-77.

Eick, S. G. 1994. Data visualization sliders. In Proceedings of the 7th Annual ACM Symposium on User interface Software and Technology (Marina del Rey, California, United States, November 02 – 04, 1994). UIST ’94. ACM, New York, NY, 119-120.

Nakhimovsky, Y., Miller, A. T., Dimopoulos, T., and Siliski, M. 2010. Behind the scenes of Google Maps Navigation: enabling actionable user feedback at scale. In Proceedings of the 28th International Conference on Human Factors in Computing Systems, Extended Abstracts (Atlanta, Georgia, USA, April 10 – 15, 2010). CHI EA ’10. ACM, New York, NY, 3763-3768.

Timetable
Week 1: Gather all of the data from the Jack Williams lab and create a relational database for the different carbon emission scenarios, climate models and time spans.
Week 2: Create an interactive map capable of defining each cell within Wisconsin.  The map should have the same basic navigational capabilities as Google Maps, Yahoo Maps, or Bing Maps.
Week 3: Begin integrating the climate data with the maps of Wisconsin and North America.  Every cell on either map should have the capability of displaying the relevant data for a given location.
Week 4: Build a geo-visual analytic tool that shows the original Wisconsin data point and the distance to all of the related points for North America.  I’ll have a better idea what this will look like after working with the data.
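As a rough sketch of the Week 1 database step, the layout below uses SQLite with hypothetical table and column names (the actual schema will depend on how the data arrives from the Williams lab):

```python
# Sketch of a possible relational layout for the climate data.
# Table and column names are hypothetical placeholders.
import sqlite3

conn = sqlite3.connect("climate.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS scenario (
    id INTEGER PRIMARY KEY,
    name TEXT            -- one of the three carbon emission scenarios
);
CREATE TABLE IF NOT EXISTS model (
    id INTEGER PRIMARY KEY,
    name TEXT            -- one of the 15 climate models
);
CREATE TABLE IF NOT EXISTS prediction (
    cell_lat REAL,       -- 0.1-degree Wisconsin cell (lower-left corner)
    cell_lon REAL,
    scenario_id INTEGER REFERENCES scenario(id),
    model_id INTEGER REFERENCES model(id),
    span TEXT,           -- '2046-2065' or '2081-2100'
    analog_lat REAL,     -- location of the best analog in North America
    analog_lon REAL
);
""")
conn.commit()
```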

Description
The final visualization will contain two maps side-by-side.  The map on the left will allow the user to navigate the state of Wisconsin in order to select a cell for comparison.  Once a cell has been selected, the map on the right will show all of the results in relation to the Wisconsin cell.  Lines will be drawn between the original cell and all of the results, with the ability to mouse-over each line to display the distance and bearing.  Drop-down menus will allow the user to select different scenarios, models and time spans for unique database queries.  I’ll add the ability to switch base tiles so that a user can evaluate the landscape using aerial photos, road networks or a hybrid of both.  A tool for analyzing the relationship between the original cell and each result will reside below the set of maps.
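For the mouse-over distance and bearing readout, something like the following could work; it is a minimal sketch using the standard haversine and initial-bearing formulas on cell-center coordinates (the function name is a placeholder):

```python
import math

EARTH_RADIUS_KM = 6371.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) and initial bearing (degrees, clockwise
    from north) from a Wisconsin cell center to its analog cell center."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)

    # Haversine distance
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    dist = 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # Initial bearing
    y = math.sin(dlam) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing
```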

What is Volume Data? (Scalar Fields)

What is a voxel (point samples, interpolation, reconstruction, …)
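As a quick illustration of the “point samples, interpolation, reconstruction” idea, here is a minimal trilinear interpolation sketch (assuming a NumPy volume indexed as vol[x, y, z] with unit voxel spacing):

```python
import numpy as np

def trilinear(vol, x, y, z):
    """Reconstruct a scalar value at a continuous point (x, y, z)
    from the 8 surrounding voxel samples."""
    x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
    x1, y1, z1 = x0 + 1, y0 + 1, z0 + 1
    fx, fy, fz = x - x0, y - y0, z - z0

    # Interpolate along x, then y, then z
    c00 = vol[x0, y0, z0] * (1 - fx) + vol[x1, y0, z0] * fx
    c10 = vol[x0, y1, z0] * (1 - fx) + vol[x1, y1, z0] * fx
    c01 = vol[x0, y0, z1] * (1 - fx) + vol[x1, y0, z1] * fx
    c11 = vol[x0, y1, z1] * (1 - fx) + vol[x1, y1, z1] * fx
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz
```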

Hierarchy of Methods

  1. 2D Methods (slices) (note: not an X-Ray)
  2. Surface construction
  3. Direct Volume Rendering

General Graphics Points

  • Projection (orthographic vs. perspective)
  • Use of interaction (cutting planes, other “volume widgets”)

Surface Construction Approaches

  1. Cubies
  2. Contour tracking / connecting
  3. Marching Cubes

Applies when you are looking at distinct structures
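As a hedged example of surface construction, recent versions of scikit-image ship a Marching Cubes implementation; the sketch below (assuming scikit-image is installed) extracts an isosurface from a synthetic distance-field volume:

```python
import numpy as np
from skimage import measure  # assumes scikit-image is available

# Synthetic volume: distance from the center of a 64^3 grid
coords = np.mgrid[0:64, 0:64, 0:64]
vol = np.sqrt(((coords - 32) ** 2).sum(axis=0))

# Extract the isosurface at radius 20 (a sphere-like triangle mesh)
verts, faces, normals, values = measure.marching_cubes(vol, level=20.0)
print(verts.shape, faces.shape)  # vertex positions and triangle indices
```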

Direct Volume Rendering

Transfer functions (definitions, basic concepts, issues)

  • Concepts of what can/cannot be done
  • Idea of classification, dealing with boundaries
  • Potential for “realism” vs. interpretability
  • Volumes vs. Solids
  • Using normal as gradient (for lighting)
  • Using normal as boundary detection (for “surface” creation)
  • local vs. non-local
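To make the transfer function idea concrete, here is a minimal sketch of a 1D transfer function that maps scalar values to color and opacity by piecewise-linear interpolation over a few control points (the control points are made up for illustration):

```python
import numpy as np

# Control points: scalar value -> (R, G, B, A); values are illustrative only
ctrl_vals = np.array([0.0, 0.3, 0.5, 1.0])
ctrl_rgba = np.array([
    [0.0, 0.0, 0.0, 0.0],   # empty space: fully transparent
    [0.8, 0.6, 0.4, 0.05],  # soft material: nearly transparent
    [1.0, 0.9, 0.8, 0.4],   # boundary region: semi-opaque
    [1.0, 1.0, 1.0, 0.9],   # dense material: nearly opaque
])

def transfer(scalars):
    """Map (normalized) scalar values to RGBA via the 1D lookup above."""
    scalars = np.asarray(scalars, dtype=float)
    return np.stack(
        [np.interp(scalars, ctrl_vals, ctrl_rgba[:, c]) for c in range(4)],
        axis=-1,
    )
```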

Basic projections (X-ray model):

  • Maximum intensity projections
  • Accumulation through volume model
  • transparent volume model
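For example, a maximum intensity projection and an X-ray-style accumulation are essentially one-liners over a NumPy volume; a minimal sketch, assuming an orthographic view with axis-aligned rays:

```python
import numpy as np

def mip(vol, axis=2):
    """Maximum intensity projection: keep the largest sample along each ray."""
    return vol.max(axis=axis)

def accumulate(vol, axis=2):
    """X-ray-like accumulation: average all samples along each ray."""
    return vol.mean(axis=axis)
```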

Make sure everyone understands “volume rendering integral”
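For reference, one common emission-absorption form of the volume rendering integral along a ray, and its usual discrete compositing approximation, can be written as:

```latex
% Emission-absorption volume rendering integral along a ray s(t), 0 <= t <= D
C = \int_0^{D} c\big(s(t)\big)\,\tau\big(s(t)\big)\,
    \exp\!\left(-\int_0^{t} \tau\big(s(u)\big)\,du\right) dt

% Discrete approximation with per-sample color c_i and opacity \alpha_i
C \approx \sum_{i=0}^{n-1} c_i\,\alpha_i \prod_{j=0}^{i-1} (1 - \alpha_j)
```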

Basic Algorithms

  • Ray Casting
  • Splatting
  • Compositing
  • Shear-Warp
  • Fixed Slices (2D texture mapping)
  • Arbitrary planes (3D texture mapping)

Proxy geometry

Non-uniformity in sampling (correct for different ray lengths)
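Tying several of these pieces together, here is a minimal ray-casting sketch: orthographic rays march through a pre-classified volume (already pushed through a transfer function), opacity is corrected for the sampling step, and samples are composited front to back. The step-size parameters are placeholders:

```python
import numpy as np

def raycast(rgba_vol, dt=0.5, dt_ref=1.0):
    """Front-to-back compositing down the z axis of a pre-classified volume.

    rgba_vol : array of shape (nx, ny, nz, 4), per-voxel color (RGB) and opacity (A)
    dt       : actual sampling step along the ray
    dt_ref   : step size the opacities were defined for; opacity is corrected
               so different sampling rates / ray lengths give consistent results
    """
    nx, ny, nz, _ = rgba_vol.shape
    color = np.zeros((nx, ny, 3))
    alpha = np.zeros((nx, ny))

    for k in range(nz):  # march front to back along z
        c = rgba_vol[:, :, k, :3]
        a = rgba_vol[:, :, k, 3]
        # Opacity correction for non-uniform sampling
        a = 1.0 - (1.0 - a) ** (dt / dt_ref)
        # Front-to-back "over" compositing
        color += (1.0 - alpha)[..., None] * (c * a[..., None])
        alpha += (1.0 - alpha) * a
    return color, alpha
```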

More on Transfer Functions

Two steps:

  • Data classification / feature identification
  • Optical properties

What to identify?

  • Materials (classification)
  • Boundaries / Geometric Features
  • Phenomena (fronts, structures, zones)

How to identify?

  • Manual segmentation
  • Automatic / Learning / …
  • Geometric features (edges) similar to 2D

Local vs. global decision making

Inputs to transfer function

  • Values
  • Gradients
  • Curvature
  • Feature info

Determining opacities

  • 0 in empty space – high “inside”
  • what about “murky regions” – not much to do, still need slicing and interaction

Try to have “thin shells” of “constant thickness” that are opaque (Levoy)

  • Value + gradient can identify a boundary, so the gradient is important in figuring opacity (sketch below)
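A sketch in the spirit of Levoy’s iso-contour opacity assignment: opacity falls off linearly with distance from the target value, scaled by the gradient magnitude so the opaque shell has roughly constant thickness (parameter names here are my own):

```python
import numpy as np

def iso_contour_opacity(vol, f_v, alpha_v=0.8, r=1.5):
    """Per-voxel opacity for a 'thin shell' around iso-value f_v.

    vol     : scalar volume (NumPy array)
    f_v     : target iso-value
    alpha_v : opacity at the iso-surface itself
    r       : desired shell thickness in voxels
    """
    gx, gy, gz = np.gradient(vol.astype(float))
    gmag = np.sqrt(gx**2 + gy**2 + gz**2)

    alpha = np.zeros_like(vol, dtype=float)

    # Where the gradient is zero, only exact matches of the iso-value are opaque
    flat = gmag == 0
    alpha[flat & (vol == f_v)] = alpha_v

    # Elsewhere, fade linearly with |f_v - f| / (r * |grad f|), clipped to [0, 1]
    steep = ~flat
    t = np.abs(f_v - vol[steep]) / (r * gmag[steep])
    alpha[steep] = alpha_v * np.clip(1.0 - t, 0.0, 1.0)
    return alpha
```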

Still an active research area (transfer functions)

  • illustration inspired techniques
  • ways to simulate transparency and make it perceptually useful
  • integrating classification and automation in transfer function design
  • make different materials and their boundaries obvious

Adding Lighting

  • Have volume “emitting” light?
  • Fake lighting (use gradient as normal)
  • Direct lighting models
  • Global / Light Transport
  • Do the reverse of the rendering process to determine how much light gets to each voxel
  • More complex models require fancier integrals (over spheres, …)
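A minimal sketch of the “fake lighting” idea from the list above: use the normalized local gradient as a surface normal and apply simple Lambertian shading per voxel (the light direction and ambient term are arbitrary choices):

```python
import numpy as np

def gradient_lighting(vol, light_dir=(0.0, 0.0, 1.0), ambient=0.2):
    """Per-voxel diffuse shading using the local gradient as the normal."""
    gx, gy, gz = np.gradient(vol.astype(float))
    normals = np.stack([gx, gy, gz], axis=-1)
    norm = np.linalg.norm(normals, axis=-1, keepdims=True)
    normals = normals / np.maximum(norm, 1e-8)  # avoid divide-by-zero in flat regions

    light = np.asarray(light_dir, dtype=float)
    light = light / np.linalg.norm(light)

    # Lambertian term N . L, clamped; the gradient may need negating depending on
    # whether the feature of interest is brighter or darker than its surroundings
    diffuse = np.clip((normals * light).sum(axis=-1), 0.0, 1.0)
    return ambient + (1.0 - ambient) * diffuse
```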

Resources

http://www.vis.uni-stuttgart.de/vis03_tutorial/

http://www.siggraph.org/education/materials/HyperVis/vistech/volume/volume.htm

http://en.wikipedia.org/wiki/Volume_rendering

www.cse.ohio-state.edu/~hwshen/788/volume.ppt

This week in 838, we’ll let people stay focused on their projects. There will be no required readings, but people will be asked to talk in class about something they’ve read (we’ll pick people randomly – so be prepared to talk for 3-5 minutes about something you’ve read for your project!)

We’ll also talk about some of the more traditional forms of visualizations, starting with volumes.

http://www.papress.com/html/book.details.page.tpl?isbn=9781568987637

Topic:

I want to explore the use of textons, textures, color blending, and their combination as a means to clearly show relationships in a general Euler diagram.

Desired outcomes:

I want to develop a technique, or a set of techniques, that can be used to style an Euler diagram so that the relationship of any subregion to the rest of the diagram is obvious and requires little cognitive effort to understand. Part of this goal will be to create visually pleasing illustrations, so things such as color harmonies will be taken into account.
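As a starting point for the overlap regions, the sketch below contrasts the two simplest options compared by Hagh-Shenas et al. on the reading list: straight color blending (averaging the parent set colors) versus a checkerboard-style weave that keeps each parent color intact. The colors and tile size are placeholders:

```python
import numpy as np

def blend(colors):
    """Color blending: average the RGB colors of the sets that share a region."""
    return np.mean(np.asarray(colors, dtype=float), axis=0)

def weave(colors, height, width, tile=4):
    """Color weaving: fill a height x width patch with small tiles that
    alternate between the parent colors, so each color stays identifiable."""
    colors = np.asarray(colors, dtype=float)
    patch = np.zeros((height, width, 3))
    for i in range(0, height, tile):
        for j in range(0, width, tile):
            k = ((i // tile) + (j // tile)) % len(colors)
            patch[i:i + tile, j:j + tile] = colors[k]
    return patch
```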

Initial reading list:

Cohen-Or, D., Sorkine, O., Gal, R., Leyvand, T., and Xu, Y. 2006. Color harmonization. ACM Trans. Graph. 25, 3 (Jul. 2006), 624-630. DOI= http://doi.acm.org/10.1145/1141911.1141933

Hagh-Shenas, H., Interrante, V., Healey, C., and Kim, S. 2006. Weaving versus blending: a quantitative assessment of the information carrying capacities of two alternative methods for conveying multivariate data with color. In Proceedings of the 3rd Symposium on Applied Perception in Graphics and Visualization (Boston, Massachusetts, July 28 – 29, 2006). APGV ’06, vol. 153. ACM, New York, NY, 164-164. DOI= http://doi.acm.org/10.1145/1140491.1140541

Papers on texture synthesis from: http://graphics.cs.cmu.edu/people/efros/research/synthesis.html

Time Table:

  • Week 1
    • Compile and begin reading the reading list
    • Visualize color harmonies in several color spaces
  • Week 2
    • Finish reading list
    • Complete harmony visualization
    • Formulate optimization framework for finding colors
    • Decide on an approach for texture synthesis
  • Week 3
    • Implement all approaches, including some combinations
    • Perform initial testing
    • Begin write up
  • Week 4
    • Finish write up
    • Prepare presentation
    • Additional testing

End result visualization:

Given an Euler diagram, the aim will be to color it using the approaches described in such a way that it can be understood easily. Something like this, but in a way that scales to more complicated relationships and regions.