Alfredo Covaleda,
Bogota, Colombia
Stephen Guerin,
Santa Fe, New Mexico, USA
James A. Trostle,
Trinity College, Hartford, Connecticut, USA
We've been pushing to get news sites to appreciate — and employ — the value of developing timelines for a couple years now (and have the rejected grant proposals to show for it). But thanks to Nathan at FlowingData, we now have an example of what's at hand.
Timelines, much like calendars, can be used to show changes over time in a straightforward way. When you have a bunch of events that occurred at certain times, mark them on a timeline, and you quickly get a sense of what's going on. Take the timeline of the 10 largest data breaches, for example. You can see the breaches become denser as time goes by.
Wrap this idea into web application form, and you get Dippity. There have been similar timeline applications, but Dippity does it a bit better with a primary focus on telling stories with timelines and a good interface. Zoom in, zoom out, drag, and get alternative views as flipbook, list, and map.
Below is a little bit of context for my gas price chart. Check out the full version for a better idea of what Dippity offers.
[Thanks, Canna]
A new version of Flare, the data visualization toolkit for ActionScript (which means it runs in Flash), was released just yesterday with a number of major improvements over the previous version. The toolkit was created and is maintained by the UC Berkeley Visualization Lab and was one of the first bits of ActionScript that I got my hands on. The effort-to-output ratio was pretty satisfying, so if you want to learn ActionScript for data visualization, check out Flare. The tutorial is a good place to start.
Here are some sample applications created with Flare:
This week, Steve Duenes, graphics director for The New York Times, is answering readers' questions along the lines of “How did you do that?” “Why did you do that?” and “What are the NYT's designers proud of?”
The NYTimes is arguably the best in the mainstream media at delivering infographics, in the broadest definition of the term, both in print and on the web. (Yes, WIRED does a good job, too, but with longer lead times.) Its 30 or so designer-journalists can produce everything from illustrations of the scenes of news stories to visualizations of large data sets, e.g., a day's results from the world's major stock markets.
Check out “Talk to the Newsroom: Graphics Director Steve Duenes” at http://tinyurl.com/2qb6z8
This is a relatively new and finely crafted piece of code with display potential for many data-rich files. To fully appreciate it, be sure to go to the bottom of this page, then download and run the Flash presentation.
Voyagers and Voyeurs: Supporting Asynchronous Collaborative Information Visualization
Jeffrey Heer, Fernanda B. Viégas, Martin Wattenberg
Abstract
This paper describes mechanisms for asynchronous collaboration in the context of information visualization, recasting visualizations as not just analytic tools, but social spaces. We contribute the design and implementation of sense.us, a web site supporting asynchronous collaboration across a variety of visualization types. The site supports view sharing, discussion, graphical annotation, and social navigation and includes novel interaction elements. We report the results of user studies of the system, observing emergent patterns of social data analysis, including cycles of observation and hypothesis, and the complementary roles of social navigation and data-driven exploration.
The sense.us collaborative visualization system. (a) An interactive visualization applet, with a graphical annotation for the currently selected comment. The visualization is a stacked time-series visualization of the U.S. labor force, broken down by gender. Here the percentage of the work force in military jobs is shown. (b) A set of graphical annotation tools. (c) A bookmark trail of saved views. (d) Text-entry field for adding comments. Bookmarks can be dragged onto the text field to add a link to that view in the comment. (e) Threaded comments attached to the current view. (f) URL for the current state of the application. The URL is updated automatically as the visualization state changes.
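The URL-as-application-state idea in (f) is what makes view sharing work: every view of the visualization corresponds to a URL that can be pasted into a comment or an email. A minimal sketch of the mechanism in Python (the parameter names are hypothetical, not sense.us's actual scheme):

```python
from urllib.parse import urlencode, parse_qs, urlparse

def state_to_url(base, state):
    """Encode a visualization view state as a shareable URL."""
    return base + "?" + urlencode(sorted(state.items()))

def url_to_state(url):
    """Recover the view state from a shared URL."""
    qs = parse_qs(urlparse(url).query)
    return {k: v[0] for k, v in qs.items()}

# Hypothetical view state for the labor-force visualization
state = {"series": "military", "sex": "both", "norm": "percent"}
url = state_to_url("http://example.org/vis", state)
assert url_to_state(url) == state  # round-trips losslessly
```

Because the whole view is captured in the query string, a bookmark, a comment link, and the browser's address bar can all point at exactly the same state.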
Research Paper
PDF (998K)
Video Figure
Flash (20M)
ACM Human Factors in Computing Systems (CHI), 2007
We've been a fan of the dashboard approach for a long time because dashboard graphics can give readers a quick snapshot of multiple sets of dynamic data. Charley Kyd, who studied journalism some years back, has developed a nifty plug-and-play package, Dashboard Kit #1, to generate these. And below is a recent and relevant posting from Jorge Camoes that gives us some good tips on the topic.
10 tips to improve your Excel dashboard
Posted: 26 Jan 2008 06:42 PM CST
Excel is a great (but underrated) BI tool. Several BI vendors gave up fighting it and offer Excel add-ins as front-ends for their BI solutions. So, if you want to create a dashboard you should consider Excel, since it really offers better functionalities than many other applications for a fraction of the cost and development time. I know that Excel is not a one-size-fits-all solution, but first you should be sure that your requirements are not met by Excel. Let me share with you some random tips from my experience with the Demographic Dashboard.
But shouldn’t I just ask my IT department to create the dashboard? This is a fact: many IT departments hate Excel. IT spends millions on BI solutions and users keep using Excel. Why? Because they know it, they like it, they feel in control and can do whatever they want with the data. Ask your BI manager to replicate the image above using an expensive BI solution and he’ll come back six months later with something you didn’t ask for, to answer a need you don’t have anymore (I know, I’m oversimplifying…). Do you know Master Foo Defines Enterprise Data?
1. Get to the point, solve a business need So, you have your idea for a dashboard, you’ve discussed the project with the users (right?) and you are ready. But where to start? Remember this: a graph, a table, the entire dashboard are merely instruments to solve a business need. It’s about insights, not about data, not about design.
2. Don’t use formulas Yes, I know, this is Excel, and it is supposed to have formulas. What I am telling you is that you should aim to minimize the number of independent formulas, and this should be a fundamental constraint on your overall strategy. Too often I see Excel used as a database application. It is not one; it is a spreadsheet (not everyone finds this obvious).
Over the years I have had my share of “spreadsheet hell”: a lookup formula in the middle of nowhere would reference a wrong range for no apparent reason. An update cycle adds a new column and suddenly there are errors all over the place. You leave the project for a week and when you come back you don’t know what all those formulas mean. Even if everything goes smoothly, the auditing department wants to trace every single result.
But how do you minimize the use of formulas? If your data table resides in an Excel sheet you’ll have to rely heavily on lookup formulas, and that’s one of the highways to spreadsheet hell. Instead, get the data from an external source (Access, an OLAP cube…) and bring it into Excel. Calculations should be performed at the source. After removing all the formulas you can, the remaining ones should be as clear as possible.
3. Abuse Pivot Tables Every object (graph, table) in the Demographic Dashboard is linked to a pivot table. Let me give you an example. One of the charts shows population growth over the years, using 1996 as the reference. Pivot tables can calculate that directly; I don’t need to add a new layer of complexity (formulas to calculate the actual values and lookup formulas to retrieve them).
The population table has 200,000 records, so I couldn’t fit it into the Excel limit of 65 thousand rows (yes, that changed in Excel 2007, but it is debatable whether a table with a million rows in a spreadsheet application can be considered good practice). By using a pivot table I can overcome that limit.
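The growth-relative-to-1996 calculation that the pivot table performs can be sketched outside Excel. A minimal Python analogue (the records are invented for illustration): group rows by year, then index each year's total against the 1996 baseline, exactly the aggregate-then-compare step the pivot table does in one pass:

```python
from collections import defaultdict

# Hypothetical population records: (year, municipality, population)
records = [
    (1996, "A", 100), (1996, "B", 200),
    (2000, "A", 110), (2000, "B", 220),
    (2004, "A", 120), (2004, "B", 240),
]

# "Pivot" step: aggregate population by year
totals = defaultdict(int)
for year, _, pop in records:
    totals[year] += pop

# Index each year against the 1996 reference (1996 = 100)
base = totals[1996]
growth = {year: round(100 * t / base) for year, t in sorted(totals.items())}
print(growth)  # {1996: 100, 2000: 110, 2004: 120}
```

The point of the tip survives translation: the aggregation happens in one declarative step over the raw table, with no per-cell lookup formulas to go stale.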
4. Use named ranges Self-documenting formulas (“=sales-costs” is much simpler to understand than “=$D$42-$F$55”) are one of several uses of named ranges. But named ranges are also the building blocks of interaction with the user, and they make your Excel dashboard more robust.
5. Use as many sheets as you need, or more You don’t have to pay for each additional sheet you use in a workbook, so use as many as you need. Each cell in your dashboard report sheet should point to some other sheet where you actually perform the calculations. You should have at least three groups of sheets: a sheet with the dashboard report itself, sheets with the base data, and another group with supporting data, definitions, parameters, etc. Also add a glossary sheet and a help sheet.
6. Use autoshapes as placeholders Once you know what you need, start playing with the dashboard sheet. Use autoshapes to test alternative layouts or, better yet, use real objects (charts, tables…) linked to some dummy data.
7. Get rid of junk There are two ways to wow your users: by designing a dashboard that actually answers their needs, or by planting gauges and pie charts all over the place (the latter can guarantee you a promotion in some dubious workplaces, but it will not help you in the long run). In the series on Xcelsius Dashboards you can see how difficult it is to create something beyond the most basic and irrelevant charts.
So, get rid of Excel defaults (take a look at this before/after example) and just try to make your dashboard as clean and clear as possible. You’ll find many tips around here to improve your charts, so I’ll not repeat myself.
8. Do you really need that extra-large chart? Charts are usually larger than they should be. What really matters in a chart is the pattern, not the individual values, and the pattern can be seen even in a very small chart.
9. Implement some level of interaction A dashboard is not an exploratory tool; it is something that should give you a clear picture of what is going on. But I believe that at least a basic level of interaction should be provided. Users like to play with the tools, and they can learn a lot more than by just looking at a static image.
10. Document your work Please, please, structure and document your workbook. Excel is a very flexible environment, but with flexibility comes responsibility… I am not a very organized person myself, but from time to time I take the tourist point of view: I pretend I have never seen the file in my life and try to understand it. If I can’t, or it takes me too long, either I must redesign it or write a document that explains the basic structure and flow.
Bonus tip: there is always something missing… Once you have a prototype, users will come up with new ideas. Some of them can be implemented; others will ruin your project, and if you accept them you’ll have to restart from scratch. So make sure the specifications are understood and approved, and that the consequences of a radical change are clear.
This is far too incomplete, but I’ll try to improve it. Will you help? Do you have good tips specific to the design of Excel dashboards? Please share them in the comments.
Who says radio can't do stories on something as image-rich as maps? See this from NPR: http://www.npr.org/templates/story/story.php?storyId=17173936&ps=bb2
Listen Now [16 min 56 sec]
Talk of the Nation, December 12, 2007 · Vincent Virga's Cartographia is a rare collection of 250 color maps and illustrations drawn from the world's largest cartographic collection at the Library of Congress. The collection spans everything from maps of ancient Mesopotamia, to maps of Columbus' discoveries, to contemporary satellite images and maps of the human genome.
Virga says that maps are like time machines — they reveal as much about the society that created them as they do about the geography of the places they describe.
Virga discusses the collection, which he culled from the Library of Congress' millions of maps and tens of thousands of atlases.
“Maps always have and always will help us communicate our physical, mental, and spiritual journeys,” Virga says.
Last week, O'Reilly's Radar posted an interesting account of a project to scan historic photos of Philadelphia and link them to Google Maps, so the reader can see the picture and then relate it to the photo's original location. Most newspapers have photo archives. Many of these shots are not just of people, but of events that have a geographic location. It might be difficult to tie every picture to a specific location, but some will be possible. So why don't newspapers start scanning those photos and putting them on the paper's web site, a la “Mapping Philly”? Doing so builds a reporter's sense of place in the community's timeline, the photos will attract a certain audience to the web site (one that could in turn attract specific advertisers), and the photos would be preserved by the scanning.
Yes, it would require an investment in time and money, but hey, instead of just cutting expenses by laying off staff, how 'bout a little investment in the future of the enterprise?
Source: http://radar.oreilly.com/archives/2007/11/mapping_philly.html
Mapping Philly
Posted: 08 Nov 2007 06:12 PM CST
By Peter Brantley
One of the most engaging sessions at the Digital Library Federation Fall Forum meeting in Philadelphia this week was a panel discussing a georeference-supportive project from the City of Philadelphia itself. We were thrilled to have representatives from Philadelphia's Department of Records, who have been gradually developing a project called PhillyHistory.org with several technology partners including Avencia, a firm in Philadelphia; it is Avencia's presentation [pdf] that I highlight in this entry.
The Department of Records in Philadelphia has one of the best historical image archives in the country, with over two million photographs. To date, some 47,000 pictures have been digitized with descriptive metadata; the Department is digitizing photos at a rate of approximately 2,000 each month. The most critical information associated with the images is the locational data that facilitate mapping and georeference services.
An image search can be delimited by time period and location, and relevant results are returned as thumbnails with brief descriptions. Advanced search operations on many other metadata fields are also available. Location-based searches are mapped, and presented as tiles on a nearest-to-furthest scale. Clicking on an image's descriptive information brings up a screen of detailed metadata, and clicking the image itself produces a higher-resolution version of the picture.
The most attractive features of the site are social; images can be shared with others (via email, right now, although theoretically it would be possible to export out to other social environments or provide internal community social site features, such as neighborhood blogs). Images can also be collected in a Favorites list.
PhillyHistory also has a mobile interface, so one of the things that I've most wanted to see in a metropolitan image archive application — standing on a street corner, and being able to retrieve both historical and contemporary information about the location — is within reach of this project. PhillyHistory is not integrated into the mobile stack, and so a location must be manually entered, but it is still pretty cool.
PhillyHistory also has a blog, where interesting archival images are discussed, as well as general application updates and news. The site also provides advanced sections where it provides detailed information on how to construct url query strings against specific metadata fields, such as location or time period. Searches can be named (“bookmarked” in the site's nomenclature) and then made available as an RSS. Using GeoRSS, a set of images can be easily displayed within Google Maps.
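The GeoRSS-to-Google-Maps step works because GeoRSS is just ordinary RSS with a `georss:point` element per item carrying "lat lon". A hedged sketch of generating such a feed with Python's standard library (the feed contents are invented for illustration, not PhillyHistory's actual output):

```python
import xml.etree.ElementTree as ET

GEORSS = "http://www.georss.org/georss"
ET.register_namespace("georss", GEORSS)

# A minimal RSS 2.0 feed with one geotagged item
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Archive photo search"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "City Hall, 1910"
ET.SubElement(item, "link").text = "http://example.org/photo/123"
# georss:point carries "latitude longitude", space-separated
ET.SubElement(item, f"{{{GEORSS}}}point").text = "39.9526 -75.1652"

xml = ET.tostring(rss, encoding="unicode")
print(xml)
```

Any consumer that understands GeoRSS, Google Maps included, can then drop each item onto the map at its point, which is why a named search exposed as an RSS feed becomes a map layer for free.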
In a terrifically cool new feature just added this November, the first 100 image search results from any query can be mapped into Google Earth. Clicking on any of the result markers pops open a window with the original archival image. This is fantastic.
PhillyHistory's sustainability model is straightforward, financed in part by taxes, and through the sale of quality image prints (e.g., $20.00 for an 8 x 10 color print).
The app has generated a tremendous amount of enthusiasm in Philly. The locally based Editor of the City Paper, Duane Swierczynski, said in a post, “I've become a PhillyHistory.org junkie … This is the best use of taxpayer money I've heard of in a long time. I'd even be willing pay more taxes … “
We don't normally think of city governments as maintaining currency in software application design, but it happens more often than we realize. At the meeting, someone from NYC was nearly jumping up and down with excitement, at the hope that it would be possible to migrate the application north.
Perhaps west, as well.
We knew this was coming, but missed the announcement in July of Ricoh's GPS WiFi camera. This strikes us as something that can become a high-impact journalism tool. Imagine how it could be applied for covering mass demonstrations or even sporting events. It could also be great for travel stories — everything from walking tours through Scotland to pub crawling in New Orleans — when linked to Google Maps.
The opening-day price is about $1,100. Not too much, we think, as an investment for a newsroom's digital R&D person/team. (Those do exist, don't they?)
Anyway, check out the link below.
Posted Jul 16th, 2007 by Chief Gadgeteer
The continuing growing popularity of mapping (particularly Google Maps, Google Earth and their street views) and GPS solutions means that consumers will want more products that automatically tie those things together. Enter the Ricoh 500SE Digital Camera, which is GPS enabled. Take a photo with the 500SE and it automatically embeds the position info into the photo. In a year or so, this will probably become a pretty standard feature on digital cameras and camcorders, or at least a highly coveted one.
The Ricoh 500SE is no slouch in the camera department either. It has an 8 megapixel CCD, 3x optical zoom, a large 2.5″ TFT LCD monitor screen, an SD card slot, camera shake blur reduction and a 28mm wide-angle zoom lens. It also comes with 802.11b/g WiFi and Bluetooth 2.0 connectivity.
Now back to the GPS stuff. Just imagine how cool it would be to embed your photos automatically in the right spot on a map by adding them as layers to existing maps that have GIS capabilities. Well, never mind the last part if you don’t get that. Think about how cool it would be if you could pull up your pics in Flickr, Gallery or whatever, and then display a map alongside showing where the pic was taken.
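Cameras that geotag photos store position in EXIF GPS tags as degrees, minutes and seconds plus N/S and E/W reference letters, while map APIs want signed decimal degrees. The conversion is simple arithmetic; a small Python sketch (coordinate values invented for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # South latitudes and west longitudes are negative
    return -value if ref in ("S", "W") else value

# Example: 39 deg 57' 9.36" N, 75 deg 9' 54.72" W
lat = dms_to_decimal(39, 57, 9.36, "N")
lon = dms_to_decimal(75, 9, 54.72, "W")
print(round(lat, 4), round(lon, 4))  # 39.9526 -75.1652
```

Once every frame carries coordinates in this form, plotting a day's shoot on a map is a bulk operation rather than a hand-captioning job, which is exactly the newsroom payoff.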
We were pleased to see last week (via the NICAR listserv) that multiple newspapers, at least in the U.S., have discovered they can get public records databases, create specialized look-up tools for their front ends and post them on their web sites. Let's keep on keeping on with this. It seems quite possible that the next phase of bringing bits and bytes to the people might well be in the realm of 3D, mapping and simulation modeling. To that end, take a look at the “Terrain Tools & Software Packages” jumpstation, a nifty collection of commercial and open-source apps that can make your job easier and more interesting.