What newspapers can — should? — be doing in the coming decade
Oct 25th, 2005 by Tom Johnson

Our associate Vince Giuliano had some words of wisdom last week for the Innovation International Media Consulting Group and its audience in Cambridge, Mass., at the “WHAT'S NEXT: THE NEW MEDIA LANDSCAPE” conference.

Vince's PowerPoint presentation (no audio) deals with basic trends likely to have profound effects on our lives over the coming 10 years – and with the key implications of those trends for newspaper companies. You can find it in HTML format on the new Electronic Publishing Group website:

www.epublishinggroup.com



GIS Development magazine: "FEMA's official flood maps called obsolete"
Oct 24th, 2005 by Tom Johnson

This comes in from the Houston Chronicle, via the GIS Development online magazine:



“FEMA's official flood maps called obsolete



“Official maps that are supposed to guide homeowners and communities on
areas prone to flooding are obsolete and unreliable, a federal
investigation found. Despite a multi-year modernization effort, 70
percent of the maps are more than 10 years old, the inspector general
for the U.S. Department of Homeland Security concluded in a 63-page
report, which also found that many of the flood plains on the maps were
hand-drawn and are difficult to update. The criticism is the latest to
be leveled at the Federal Emergency Management Agency, which has been
widely blamed for mishandling the Hurricane Katrina relief effort.

“As
part of its management of the National Flood Insurance Program, FEMA
maintains more than 90,000 maps to show areas where flood insurance is
advisable and where construction would be risky. However, new
developments in flood zones have generally rendered the maps inaccurate
and obsolete. Faulty maps have a major impact on people and property
owners. Local communities rely on these maps to help them limit
construction within flood zones and to determine who can buy federal
flood insurance.

“The inspector general's report raises serious
questions about federal funding for the modernization effort, a $1.5
billion, six-year project that is intended to post accurate and easily
updated digital maps on the Internet by 2010. The program already is
behind schedule, and many state governments said that federal funding
is far short of what they need to provide correct mapping information.”

Source: http://www.chron.com





Newspaper hand-rolls some Google Maps for hurricane coverage
Oct 24th, 2005 by Tom Johnson

Maurice Tamman, of the Sarasota (Fla.) Herald-Tribune, posts to the NICAR (National Institute for Computer-Assisted Reporting) listserv:

“In recent months we’ve been experimenting with Google Maps APIs to bring dynamic maps to our coverage. (Last month we used it to illustrate how Florida’s property tax system creates crazy inequities: www.heraldtribune.com/saveourhomes/)

“Late last week, we slammed together a hurricane damage entry and reporting system for the six Southwest Florida counties, from the Keys to Manatee County. Users can zoom to a neighborhood and either view reported damage or report damage. (www.heraldtribune.com/damages/)

“I’m not sure how much use it’ll get because the storm stayed so far south of us. Still, I think it illustrates the flexibility of the Google system over more expensive GIS server solutions, especially for smaller papers.”


Good job in seeking out a creative application of existing tools.
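For readers curious what the Herald-Tribune's kind of map takes in code, here is a minimal sketch: a page centered on Southwest Florida that drops one marker per damage report. It uses the current Google Maps JavaScript API rather than the 2005-era version the paper worked with, and the report data, coordinates, and map element below are invented for illustration; the paper's actual entry-and-reporting back end is not shown.

```typescript
// Minimal sketch, not the Herald-Tribune's code. Assumes the Google Maps
// JavaScript API <script> tag is already on the page and that the page
// contains <div id="map"></div>. The reports below are hypothetical.
declare const google: any; // supplied at runtime by the Maps API script

interface DamageReport {
  lat: number;
  lng: number;
  note: string;
}

// In a real system these would come from the paper's reporting database.
const reports: DamageReport[] = [
  { lat: 27.336, lng: -82.531, note: "Roof damage reported, Sarasota" },
  { lat: 26.142, lng: -81.795, note: "Street flooding reported, Naples" },
];

function showDamageMap(): void {
  // Center on Southwest Florida at a regional zoom level.
  const map = new google.maps.Map(document.getElementById("map"), {
    center: { lat: 26.9, lng: -82.0 },
    zoom: 8,
  });

  // One marker per report; clicking a marker opens the report text.
  for (const report of reports) {
    const marker = new google.maps.Marker({
      position: { lat: report.lat, lng: report.lng },
      map,
      title: report.note,
    });
    const info = new google.maps.InfoWindow({ content: report.note });
    marker.addListener("click", () => info.open(map, marker));
  }
}

showDamageMap();
```

The point Tamman makes holds up in the sketch: the mapping layer is a few dozen lines of client-side code and a marker per report, with no GIS server in the loop.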



3-D views of Hurricane Wilma
Oct 22nd, 2005 by Tom Johnson

NASA's hurricane site has posted some novel maps and graphics of Wilma, including some 3-D and animated illustrations of the “hot towers.”



The arrival of Hurricane Wilma on October 15, 2005, tied the record for
most named storms in a single Atlantic hurricane season. Within just
days Wilma went from tropical storm to Category 5 hurricane status and
broke the record for lowest pressure ever recorded inside a hurricane.
New satellite observations show towering thunderclouds, sometimes
called hot towers, that signaled the onset of intensification in this
remarkable storm.





Why is the U.S.A. a "Developing Nation" when it comes to broadband?
Oct 19th, 2005 by Tom Johnson

“October 18, 2005
Broadband's Crawl


Posted by Rekha Murthy at October 18, 2005 10:07 AM in Telecom/Internet.

An excellent article in today's Salon
magazine provides a thorough assessment of the state of broadband
access in the United States. The U.S. continues to fall behind other
countries in broadband penetration. The problem, according to the
article, stems from federal mismanagement of telecom policy and
misrepresentation of the current levels of broadband access and quality.

The digital divide seems to widen with each advance in technology —
even when a technology emerges that could make providing access cheaper
and easier. The divide runs along familiar lines of class and geography
(rural vs. urban), and the line between regions that can attract new
businesses and residents and those that can't. It can also be seen as a
divide between those with better access to news and information and
those without access.

The article also puts in fresh perspective efforts by municipalities — San Francisco
being the most recent and prominent — to provide broadband Internet
services directly to their citizens. Telecom companies claim that this
stifles the competition that can lead to lower prices and better
quality. And yet most Americans have neither.”



Helluva deal on ArcView for IRE members
Oct 19th, 2005 by Tom Johnson

GIS software discount for IRE members




Members of Investigative Reporters and Editors, Inc., qualify for
discounts on geographic information system (GIS) software from ESRI,
the publisher of ArcView.
ESRI is offering ArcView GIS single-use licenses at no charge to IRE members who agree to attend a GIS training event conducted by IRE and NICAR or ESRI. Purchasers must sign a three-year maintenance agreement with ESRI at a cost of $400 a year, with the first year's fee waived.
ArcView, the GIS program most widely used by journalists, lists for
$1,500. During the maintenance agreement period, purchasers will
receive software upgrades and technical support.

IRE members must attend a qualifying training session within one year of entering the agreement with ESRI, which is based in Redlands, Calif., and has been a regular exhibitor at the annual IRE and CAR conferences. Qualifying sessions are IRE and NICAR's Mapping Data for News Stories mini-boot camp, offered twice a year with the next scheduled for Jan. 6-8, 2006; an online ESRI Virtual Campus course; and ESRI classroom training.

For more information about IRE and NICAR training, visit IRE Training. For more information about ESRI training, see www.esri.com/training_events.html.
IRE members can also purchase discounted extension programs, which
expand the analytical capabilities of ArcView. The single-license cost
for Spatial Analyst, 3D Analyst and Geostatistical Analyst is $1,500
each. That is a 40 percent discount off the list price of $2,500 each.
To obtain an order form, please contact John Green, membership services
coordinator for IRE, at jgreen@ire.org or 573-882-2772.



So how do we measure sprawl? With precision?
Oct 18th, 2005 by Tom Johnson

Kenneth Chang, of the NYTimes, had an interesting “Ideas & Trends” piece yesterday. He was writing about the evolution of standards and precision in measurement. Just how long is a meter — a REAL meter? There can be measurement of physical things, of course, like distance or weight. But perhaps journalists should be talking about the standard definition of concepts such as “urban sprawl” or a “landslide victory.”



October 16, 2005


Measuring the World: From Material to Ethereal



LOCKED in a vault in Paris is a cylinder about the size of a plum. Its mass is exactly one kilogram. It is the kilogram.

For 116 years, this cylinder made of platinum and iridium has been
the world's defining unit of mass. It's an easy concept to understand.

Scientists at the National Institute of Standards and Technology in
Gaithersburg, Md., announced last month significant progress toward
supplanting this cylinder. Their concept is not so easy to understand.

It's a two-story-tall contraption that looks one part Star Trek, one
part Wallace and Gromit. Briefly put, it measures the power needed to
generate an electromagnetic force that balances the gravitational pull
on a kilogram of mass.

“It's such a very complicated thing that's hard to explain,” said
Richard Steiner, the physicist in charge of the project. He has been
working on this “electronic kilogram” machine for more than a decade.

“That's what everybody kind of laughs at,” Dr. Steiner said. “They're all impressed it's such a complicated thing and then they ask, 'What do you need it for?'”

The general answer is that humans have always needed to quantify and
standardize, to make their world more certain. Without a standard
kilogram – roughly 2.2 pounds – how would scientists know their
measurements of mass were accurate? Without a standard meter, how would
a manufacturer make a ruler and know that it is precise?

More specifically, the high-tech kilogram is needed because
scientists prefer a definition based on the universal constants of
physics – something they could in principle calibrate in their own
laboratories – rather than on an artifact sitting in a distant vault.

Another problem with the kilogram cylinder is that it is not
necessarily unchanging. Over time, contamination might add smidgeons of
mass, or cleaning might scrub away some atoms, leaving a lesser
kilogram. Better, scientists say, not to have to worry about dust, dirt
or disaster striking the Paris vault.

The kilogram, in fact, is decades behind the meter, which used to be
defined as the distance between two scratches on a metal bar. In 1960,
scientists defined the meter in terms of the wavelength of a specific
orange light emitted by krypton atoms. In 1983, they redefined the
speed of light to be exactly 299,792,458 meters per second, so a meter
is now just the distance that light travels in a vacuum in
1/299,792,458th of a second.

The newer definitions hark back to the original metric definitions,
which were based on features of the natural world, not human artifacts.
A kilogram was the mass of water filling a cube that is one-tenth of a
meter on each side, or one liter of volume, and a meter was one
ten-millionth of the distance from the North Pole to the Equator, along
the path passing through Paris (since it was the French Academy of
Sciences that defined the meter).

Neither definition proved practical, and the French scientists
botched their calculation of how much the Earth is squashed by the
centrifugal force of its rotation, so the metal bar they made to
represent a meter was off by a fraction of a millimeter.

It is also not easy to measure precisely a liter of pure water,
which is complicated by impurities and gases dissolved in the water and
by how water density changes with temperature and pressure. Instead,
that platinum-iridium cylinder was established as the official
definition, in 1889.

The search for standards began with the rise of civilization.
Measures were needed, especially for commerce. At first, people simply
used parts of the body. A cubit, for example, was the distance from the
elbow to the tip of the middle finger – which differed from person to
person, until an Egyptian pharaoh declared a cubit to be the distance
from his elbow to the tip of his middle finger (and possibly the width
of his palm).

It was hardly convenient to borrow the pharaoh's arm to measure a
bolt of cloth, so a piece of granite was carved and declared the
official cubit. Other people would make their own cubit rulers, usually
out of wood, based on the granite standard.

The same idea underlay the standards for the kilogram and the meter
– a cylinder and a bar, respectively. “Those were not bad standards at
the time,” said John L. Hall, a scientist at the National Institute of Standards
and Technology and a winner of this year's Nobel Prize in Physics, who
helped refine the definition of the meter two decades ago. “But they're
kind of hard to duplicate and disseminate.”

Dr. Steiner's team with its two-story contraption has now fixed the
mass of a kilogram to 99.999995 percent accuracy. To satisfy the
international body that sets measurement standards, they probably need
to raise that last “5” to an “8.”

As science measures ever tinier bits of the universe, measurement
must become more precise. If scientists can define units in terms of
constants like the speed of light and the charge of the electron, then
they can better study whether constants really are constant. “It's a much more serious question than it appears to be,” Dr. Hall said.
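For anyone who wants to check the arithmetic behind the article's two key numbers, here is a tiny sketch. It is not from NIST or the Times; it simply restates the definitions quoted above.

```typescript
// Since 1983 the speed of light has been fixed by definition, so a meter
// is whatever distance light covers in 1/299,792,458 of a second.
const SPEED_OF_LIGHT_M_PER_S = 299_792_458;          // exact, by definition
const secondsPerMeter = 1 / SPEED_OF_LIGHT_M_PER_S;  // about 3.34 nanoseconds
const oneMeter = SPEED_OF_LIGHT_M_PER_S * secondsPerMeter;
console.log(oneMeter); // 1 (up to floating-point rounding)

// "99.999995 percent accuracy" for the electronic kilogram corresponds to a
// relative uncertainty of roughly 5 parts in 10^8; raising that last "5" to
// an "8" (99.999998 percent) would mean roughly 2 parts in 10^8.
const currentAccuracy = 0.99999995;
console.log((1 - currentAccuracy).toExponential(1)); // "5.0e-8"
```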



War and Power Laws and Journalism
Oct 15th, 2005 by Tom Johnson

The concept of Power Law distributions is attracting growing interest, especially among folks in the Complexity and Complex Adaptive Systems communities. For journalists, some of the math involved is more complex than the elementary descriptive statistics we usually deal with, but it's not that tough to grasp the implications of research probing Power Laws as they apply to various phenomena.

Here's a perspective on global warfare that might prompt some deep contemplation for journalists. (A short code sketch of how such power-law exponents are estimated follows the excerpt.)

Original source:

http://globalguerrillas.typepad.com/globalguerrillas/2005/09/wars_new_equili.html



WAR'S NEW EQUILIBRIUM

“In technology, particularly in information-based systems, advances can occur almost overnight. This likely applies to warfare as it becomes more information-based. As in technology, patterns and methods of warfare tend to stay within bounded equilibria depending on the type of war being fought. When an improvement arrives, the equilibrium point changes and warfare undergoes a rapid shift.

One of the ways to measure an equilibrium point was first demonstrated by Lewis Richardson over 50 years ago. He calculated that the distribution of casualties in conventional wars follows a power law. Updates to his work show that this pattern of distribution continues to hold.

In a new paper by Johnson, Spagat, and others called “From Old Wars to New Wars and Global Terrorism” (PDF: http://xxx.lanl.gov/pdf/physics/0506213/), the authors demonstrate that a new pattern of war is emerging. To do this, they analyzed the frequency-intensity distributions of wars (including terrorism) and examined their power law curves. They found that conventional wars had a power law exponent of 1.8. An analysis of terrorism since 1968 found that the exponents were 1.71 (for G7 countries) and 2.5 (for non-G7 countries). This makes sense: conventional wars and G7 terrorism are both characterized by periods of relative non-activity followed by high-casualty events (highly orchestrated battles). Non-G7 terrorism is a more decentralized and ad hoc type of warfare characterized by numerous small engagements and fewer large-casualty events.


[Chart from the original post: power-law frequency-intensity distributions]

Here's where the analysis gets interesting. When the authors examined the data from Colombia and Iraq, they found that both wars evolved towards the coefficient for non-G7 terrorism (although from different directions). This finding doesn't fit the prevailing theories of warfare. A conventional understanding of fourth generation warfare, such as the one posited by Thomas Hammes in The Sling and the Stone, holds that 4th generation warfare began in earnest with Mao. However, within Mao's formulation (and Ho Chi Minh's variant), guerrilla wars are but a prelude to conventional war to seize control of the state. The power law for these wars should, based on this theory, tend towards the coefficient we see for conventional wars. In fact, we see the opposite. Guerrilla wars in both Colombia and Iraq have stabilized at a coefficient far from conventional warfare.

This has broad implications for 4th
generation warfare theory — which clearly dominated the types of wars
we saw in the latter half of the twentieth century. The patterns of
conflict we see today in Colombia and Iraq are a break from the
previous framework (which may be an example of punctuated equilibrium).
Unlike the previous models of guerrilla wars which sought to replace
the state, these new wars have moved to a level of decentralization
that makes them both unable to replace the state and extremely hard to
eliminate. Is this new evolutionary equilibrium a fifth generation of
warfare? It is extremely likely. This new form of warfare, or what I
call open source warfare, is what this site (and my book) is dedicated
to understanding.”
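As noted before the excerpt, here is a small sketch of how a power-law exponent like the 1.8 and 2.5 figures above can be estimated from a list of event sizes (for example, casualties per engagement). The data are synthetic, and the estimator is the standard maximum-likelihood ("Hill") formula for a continuous power law; nothing below reproduces the calculations in the Johnson-Spagat paper.

```typescript
// Toy example: generate power-law distributed "event sizes" and recover
// the exponent. Real studies fit recorded casualty counts instead.

// Draw n samples from p(x) proportional to x^(-alpha) for x >= xMin,
// via inverse-transform sampling.
function samplePowerLaw(n: number, alpha: number, xMin: number): number[] {
  const xs: number[] = [];
  for (let i = 0; i < n; i++) {
    const u = Math.random();
    xs.push(xMin * Math.pow(1 - u, -1 / (alpha - 1)));
  }
  return xs;
}

// Maximum-likelihood estimate of the exponent for a continuous power law:
// alphaHat = 1 + n / sum(ln(x_i / xMin)).
function estimateExponent(xs: number[], xMin: number): number {
  const sumLogs = xs.reduce((acc, x) => acc + Math.log(x / xMin), 0);
  return 1 + xs.length / sumLogs;
}

const data = samplePowerLaw(10_000, 2.5, 1);
console.log("estimated exponent:", estimateExponent(data, 1).toFixed(2)); // ~2.50
```

With enough samples the estimate recovers the exponent used to generate the data, which is the same kind of fit the paper applies to real casualty records; the contrast between an exponent near 1.8 and one near 2.5 is exactly the contrast the excerpt draws between conventional war and non-G7 terrorism.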



Searching podcasts? Yes, the tools are coming along.
Oct 12th, 2005 by Tom Johnson

Print journalists often ignore audio (and video) content when researching a story. Partly this is the “medium bias” at play (i.e., “Hey, I work in print, so that must be the most important source.”), but the bias also has something to do with the lack of search tools and the difficulty of getting those spoken words into a transcript that can flow into text. Still, there is gold in those sight-and-sound files for a reporter who can find them and take the time to extract the ore.




The always-helpful blog “PI News Link,” run by Tamara Thompson, posts the following:


“A new form of audio file called the podcast, so named because it can be downloaded from the Internet to a portable digital listening device (such as an iPod), is searchable through many search engines. Yahoo has just rolled out their podcast search. A keyword search of “legal” returned Involuntary Manslaughter: A Double Standard?, a broadcast with the editor of Massachusetts Lawyers Weekly. The Podcast Search Service catalogs a more extensive collection of websites with podcasts, searching terms within the site title or description. Pod Spider includes international audio files. Individual podcasts are beginning to be tagged, which will enable the searcher to uncover specific relevant audio files.”




Finally, somebody is starting to get it. Sorry, Yanks, it's in the UK
Oct 11th, 2005 by Tom Johnson

A posting today announcing an academic chair at the University of Central Lancashire Department of Journalism seems to indicate that someone in the industry there is starting to ask the right questions and to leverage the strengths of the profession and its academic counterpart.




In a time when the U.S. journalism establishment is just contributing to academic redundancies (see “Columbia and CUNY Get Grants in Journalism”), UK publisher Johnston Press is asking whether there might not be a better way to think about, understand and deliver journalism.




From a press release:



SPONSORED CHAIR IN DIGITAL JOURNALISM DEVELOPMENT



“The University of Central Lancashire Department of Journalism is to join forces with major UK publisher Johnston Press in an exciting new initiative that aims to exploit the benefits of new and emerging digital technology.



The three-year collaboration, worth around £200,000, includes the future appointment of the Johnston Press Chair in Digital Journalism Development at the University.



Tim Bowdler, Chief Executive of Johnston Press, said: “The rapid evolution of digital technology presents huge opportunities and challenges to traditional media companies.



“Through the newly established Chair in Digital Journalism, Johnston Press is delighted to partner with the Department of Journalism and to give added impetus to its already well-recognised commitment to exploring new forms of factual content creation, production and dissemination.



“Johnston Press is determined to take maximum advantage of the new opportunities which digital developments present and our partnership with the University will undoubtedly further this aim.”



In post by January 2006 and funded by Johnston Press, the Chair will form the cornerstone of the partnership between the two organisations. UCLan will also fund a research assistant to assist the Chair in drawing up a research strategy that defines new approaches/methods to:



• the exploration of digital applications for content acquisition (e.g. multi skilled reporters and reporting technologies)



• the exploration of digital applications for content production (copy flow, editorial management and logistics in the multi-media newsroom of the future)



• the exploration of digital applications for content dissemination, including multi-media content converged onto one dissemination platform (e.g. the Web), but also the simultaneous dissemination of content on multiple platforms (e.g. hard copy, the Web and mobile)



Major implications



Head of the Department of Journalism Mike Ward said: “By the end of this decade, it’s forecast that there will be up to 1.5 billion computers connected via high-speed broadband and another 2.5 billion phones with more processing power than today’s PCs. This will undoubtedly have major implications for journalists and publishers alike.



“UCLan’s partnership with Johnston Press, which combines the expertise of one of the top journalism departments in the country with one of the UK’s major regional newspaper groups, presents us with a unique opportunity to investigate, challenge and inform development and debate in digital applications.



“The fruits of the partnership will be relevant, accessible and forward-looking analysis. Together we will produce materials for teaching, knowledge transfer and further research.”

________________

Alan Rawlinson
Course leader, MA in Online Journalism
University of Central Lancashire
www.ukjournalism.org

agrawlinson@uclan.ac.uk
alan@rawlinson.co.uk
01772 894757”





