GPS accuracy could start to drop in 2010

May 17, 2009


A new US GAO report has found that problems in the US Air Force's contracting and budget-management processes may result in decreased accuracy, or even outages, of the GPS system starting in 2010.

From the report:

The Global Positioning System (GPS), which provides positioning, navigation, and timing data to users worldwide, has become essential to U.S. national security and a key tool in an expanding array of public service and commercial applications at home and abroad. The United States provides GPS data free of charge. The Air Force, which is responsible for GPS acquisition, is in the process of modernizing GPS. In light of the importance of GPS, the modernization effort, and international efforts to develop new systems, GAO was asked to undertake a broad review of GPS.

The report reviewed the Air Force's replacement programme for the ageing GPS satellites and found that,

“If the Air Force does not meet its schedule goals for development of GPS IIIA satellites, there will be an increased likelihood that in 2010, as old satellites begin to fail, the overall GPS constellation will fall below the number of satellites required to provide the level of GPS service that the U.S. government commits to. Such a gap in capability could have wide-ranging impacts on all GPS users, though there are measures the Air Force and others can take to plan for and minimize these impacts.”

It concludes, “it is uncertain whether the Air Force will be able to acquire new satellites in time to maintain current GPS service without interruption. If not, some military operations and some civilian users could be adversely affected.”

Commentary

We have become deeply dependent on GPS over the last 5 to 10 years.  Crowdsourced crisis mapping, rapid disaster response, and large-scale force co-ordination all depend on GPS and location-awareness capabilities.

I would love to see a scenario play out whereby aid, development, and military organisations invest increasing resources in such advanced location-aware technologies, only to have them fail or decay.  What would such a scenario look like?

Obviously the US military won’t let the system fail.  A commentary on TidBITS notes that, “even if the satellite constellation drops below 24 satellites, that doesn’t mean that GPS service will fail altogether. It does mean that the level of accuracy that both military and civilian users have become accustomed to – which is actually higher than promised – may degrade significantly.”

Alternative systems may also come online in the coming years.  The EU is developing a civilian satellite-navigation system called Galileo, scheduled to come online in 2013, and the Russian GLONASS system may be restored as well (GLONASS reached a full constellation in 1995 but fell into disrepair for lack of funds; it has been promised back online by 2010, though there are doubts about this).

It is likely that the US Air Force will fix the system before disruptions become critical.  Given the Pentagon’s history of bureaucracy and budgetary inflation, however (see the F-111, B-1, or F-15 debacles for case studies), it is also likely that the fix will come neither on time nor efficiently, but only at great expense and with great fanfare after the fact.


Mapping disasters in 3D

April 5, 2009

 

Robin Murphy’s team at Texas A&M University (TAMU) has created software to reconstruct 3D scenes of disasters from 2D photographs taken by flying unmanned aerial vehicles (UAVs).

Picture this: an earthquake devastates a major Chinese city.  Rubble is everywhere, and no one knows where the survivors are.

A team of researchers suggests that a new system may help first responders gain a better understanding of their environment through the use of flying robots and 3D reconstruction software.

[The system] deploys several small unmanned air vehicles (SUAVs), such as AirRobot quadrotors, to take snapshots of the rubble. The pictures are then uploaded to a software program called RubbleViewer, which quickly builds a three-dimensional map of the area that users can intuitively navigate. More efficient than drawing by hand, this system is also cheaper and more portable than the alternative–using helicopter-mounted lasers to map the rubble.

Last time I checked, “using helicopter-mounted lasers to map the rubble” was still a tad beyond most humanitarian budgets.  But who knows what wonders the G20 stimulus package might provide?  In any case, it’s an interesting proof of concept that could be scaled to market over time, lowering the price and potentially becoming useful to first responders working in urban environments in the future.
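The article doesn’t describe RubbleViewer’s internals, but the geometric kernel of any photo-to-3D pipeline is triangulation: recovering a 3D point from its pixel coordinates in two overlapping snapshots.  A minimal sketch, assuming the camera projection matrices are already known (say, from the UAV’s GPS/IMU pose estimates); the function name and toy cameras are illustrative only:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (assumed known here).
    x1, x2 : (u, v) pixel coordinates of the same point in each image.
    Returns the 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenise

# Two toy cameras: identity pose, and a one-unit translation along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.25, 4.0])      # a point on the "rubble"
h1 = P1 @ np.append(X_true, 1.0)         # project into each view
h2 = P2 @ np.append(X_true, 1.0)
x1, x2 = h1[:2] / h1[2], h2[:2] / h2[2]

print(triangulate(P1, P2, x1, x2))       # recovers ~[0.5, 0.25, 4.0]
```

A real system must also estimate the camera poses and match thousands of features across hundreds of photographs; this sketch shows only the core geometry that makes 2D snapshots yield a 3D map.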


G20 update; crowdsourced crisis information live

April 1, 2009


More cameras than protesters?

It appears that the sheer volume of media presence, Internet and otherwise, is having a large magnifying effect on the perception of the protests themselves.  A Google blog search at 16:36 found over 997,000 blog hits for “G20 protesters“, over 3 million web pages, and nearly 15,000 news items.  That is probably several orders of magnitude more than the number of protesters actually at the site.

Traffic on Twitter seems mixed, with much discussion of the role and impact of the media.  Notes such as:

4.28pm:
On Twitter, Snufkin21 says Stop the War protesters booed the media present “for hyping up the G20 violence”. The huge media presence has been criticised by a number of people on Twitter who believe it’s encouraged extreme elements to “play to the gallery”.

are intermixed with live accounts such as:

4.07pm: 
Police on horses have carried out two charges down Threadneedle Street in a bid to disperse protesters, says Alok Jha, who is at the scene. Alok said everything had been calm beforehand and demonstrators have not been impressed by the police response.

Mixed messages

There are a variety of conflicting claims about the magnitude of the violence, the presence of the media, and the reaction of the police.  Reading the Twitter feed directly (Twitter search, #G20) yields a confusing array of opinion, advice, updates, and news items, including the following excerpts:

DanSpring: Well done the Police today at #G20. Felt safe in Romford disaster recovery office!

pgb63: Stephen Harper tasered at G20!

Acostaf: Getting far away from #G20, Cannon st and London Bridge working as normal

Protests + Internet = Force Multiplier

This is an interesting mix of content, coming fast and furious from all angles.  It is reminiscent of many aspects of the ongoing discussion between Paul Currion and Patrick Meier about the issues around crowdsourced crisis data.  Paul writes that “crowdsourcing is unfamiliar, it’s untested in the field and it makes fairly large claims that are not well backed by substantial evidence.”

Among its dangers, we argue, is the rapid magnification of false or intentionally deceptive data.  Patrick calls this “crisis magnification“.  We discussed it in depth here.  What we are seeing with the G20 protests is an excellent example.  No matter what is actually happening on the ground, there is a massive, unfiltered, and confusing amount of information being generated.

Here comes everysource

While we can argue that this massive generation of fresh live content is a good thing (as Patrick does here), it is clearly a new thing, with uncertain outcomes and value.  Clay Shirky, author of “Here Comes Everybody”, writes about the problem of filtering this massive amount of information.

The old ways of filtering were neither universal nor ideal [referring to TV and print]…  Mass amateurization has created a filtering problem vastly larger than we had with traditional media; so much larger, in fact, that many of the old ways are simply broken.

What is interesting, he notes, is that although it is possible for everyone to read anyone’s blog, Twitter feed, or website, it is not possible for anyone to read everyone’s information.

In fact, this isn’t the way it works.  The vast majority of this content is generated by people close to or within your social network (personal or professional) and is intended for people close to or within that network.  Web 2.0 and social media have taught us that although there may be over 250 million blogs on the Internet, the vast majority of this information is local, read by practically no one, and intended only for the author’s friends and colleagues.

Crowdsourced crisis information FOR YOUR FRIENDS

Could the same principle apply to crowdsourced crisis information?  It won’t be broadcast in the traditional sense, with millions of people listening to a single source.  Instead, communities of trust and relationships will build over time, with individual aid workers, responders, and agencies building networks of trusted information sources just as we build networks of trusted friendships.

It isn’t as if field workers and disaster responders will trust any old Twitter feed they find or any bit of information texted to them.  As in all social media, they will go first to the sources they trust.  Crowdsourcing doesn’t change this; it just lowers the bar of entry for participating in the conversation.

Paradoxically, this new flood of information might end up having the opposite effect: crisis responders who pay attention only to their trusted sources and ignore the “weak signals” flooding in from all sides at once.

Or maybe the reverse will happen.  Maybe the promise of crowdsourced crisis information has nothing to do with aid responders.  Maybe it will work just like Facebook, MySpace, and other forms of social media.  A bomb goes off in your neighbourhood; you text your friends, who Twitter their work colleagues, who call their parents, who write a blog post read only by their cousins.  Maybe the AP will pick it up, and maybe there will be value in figuring out how many dead and wounded need treatment.

But at the end of the day, perhaps crowdsourced crisis information isn’t all that different from any kind of shared information; it all depends on who you talk to.


Paul Currion on the “crisis” of crowdsourcing in a crisis

March 31, 2009

 

Paul Currion (humanitarian.info) has started an excellent critique of crowdsourced information in crisis, responding to two excellent posts by Patrick Philippe Meier (iRevolution).

Instead of incestuously summarising here, I refer readers to Patrick’s original posts:

And then to Paul’s critique here:

As well as an HFP blog related plug here:

We hope Patrick replies. Updates to follow as they emerge.


Free guide to GIS mapping for aid organisations

March 27, 2009


MapAction just published a free guide to GIS mapping for aid organisations.

The guide covers a comprehensive range of topics, from basic mapping concepts, through data entry, up to issues of data sharing and representation.

From the introduction:

The guide was written to meet the need for practical, step-by-step advice for aid workers who wish to use free and open-source resources to produce maps both at field and headquarters levels. The first edition contains an introduction to the topic of GIS, followed by chapters focused on the use of two recommended free software tools: Google Earth, and MapWindow. However much of the guidance is also relevant for users of other software. In addition there is a chapter on using GPS to collect data during humanitarian emergencies.

The full report can be found here.  This is the first time we’ve come across MapAction as an organisation.  Has anyone worked with them before?  Does anyone know how this relates to the Crisis Mappers group?


The promise and peril of crowd sourcing crisis information

January 11, 2009

Here are several excellent examples of how mobile phones are being linked to the web to create new crisis-reporting (and response) systems, as well as several examples from recent conflicts of how such tools can be used as another weapon of war.

In this video from Pop!Tech, Ken Banks explains how his software is being used by various humanitarian NGOs for quicker reporting, monitoring, and mobilisation.  Ken is the founder of kiwanja.net, which helps local, national and international non-profits get their jobs done through mobile phone services.

Kiwanja.net makes software called FrontlineSMS, which is used by another brilliant software application called Ushahidi (Ushahidi is also a finalist in the USAID Development 2.0 Challenge).  Ushahidi allows people to:

•    send and receive SMS alerts;
•    set up a local or international alert number at short notice;
•    work on different smartphones;
•    send MMS messages (images and video);
•    send GPS coordinates.
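Neither post shows how Ushahidi processes these messages internally; purely as an illustration, the core step behind the capabilities above is turning a raw inbound SMS into a structured, geolocated report.  A minimal sketch in Python, where the `loc:<lat>,<lon>` message convention, the function names, and the phone number are all invented for the example:

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Report:
    sender: str
    message: str
    lat: Optional[float]
    lon: Optional[float]

# Invented convention for this sketch: free text optionally ending
# in "loc:<lat>,<lon>" supplied by the sender.
LOC_RE = re.compile(r"loc:\s*(-?\d+(?:\.\d+)?)\s*,\s*(-?\d+(?:\.\d+)?)\s*$")

def parse_sms(sender: str, text: str) -> Report:
    """Turn one raw inbound SMS into a structured, geolocated report."""
    m = LOC_RE.search(text)
    if m:
        lat, lon = float(m.group(1)), float(m.group(2))
        return Report(sender, text[:m.start()].strip(), lat, lon)
    # No coordinates attached: keep the report, but unlocated.
    return Report(sender, text, None, None)

r = parse_sms("+243991234567", "Bridge down near Goma loc: -1.658, 29.220")
print(r)
```

Once reports carry coordinates, plotting them on a map and sending SMS alerts back out are comparatively simple; the hard part, as the rest of this post discusses, is deciding which reports to believe.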

Forbes.com has an excellent article on how this is being used to cover emerging humanitarian crises. Al Jazeera is already using it to cover the crisis in Gaza, the software is in use in the DRC right now (also covered on the BBC website here), and it supports AIDS-relief projects in Malawi.

These kinds of real-time disaster discovery and reporting technologies are likely to play a larger and larger role in the humanitarian sector over the coming decade. But there are dangers to these developments as well as opportunities. Jeremiah Owyang, a senior strategist in social computing at Forrester Research, reports on how Twitter is being used to report disasters. He observes several risks when relying on these technologies:

1) Sources may panic, and overstate or understate the situation.
2) Determining who is a credible source is a challenge.
3) Echoes from the online network may amplify or misstate very important facts that could impact people’s safety.

He argues that lessons from a recent explosion in Toronto offer several key takeaway thoughts:

  • The new News Wire is now Twitter, the “Twire”?
  • News continues to break from firsthand sources; in the past, the press would break the stories.
  • The jobs of the press are both easier and harder: they’ve improved access to sources in real time, but the level of noise has increased.
  • Press and media must monitor Twitter: we’ve never seen information break as fast as this.
  • Press still have a very important role: vetting what’s true and false to the best of their ability.
  • The community must be mindful of what’s real and what’s not; over-hyping or spreading false information could impact lives.
  • Emergency response teams and local municipalities should monitor the online chatter, just as they do emergency shortwave channels.

All of this is tremendously important for humanitarian researchers, field workers and strategists to consider when integrating these technologies into their work.  Crowdsourced news sources can cut both ways.  We already know how politically biased official reporting of disaster impacts can be; governments are prone to over- or under-report numbers as per their political preference.  If anyone can report anything now, and an eager news media is prone to catch the scoop and broadcast it loudly, how might local political vendettas play into the disaster response process?  One imagines that natural vetting mechanisms, including reputation ranking, will likely arise to counter-balance what is otherwise a total free-for-all.  But we’re still very much in the Wild, Wild West when it comes to these frontiers.
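What might such reputation ranking look like?  Purely as a sketch, one simple scheme weights each claim by the reliability of the sources repeating it, treating sources as independent; the handles, reputation scores, and claims below are all invented for the example:

```python
from collections import defaultdict

# Hypothetical reputation scores (0..1), e.g. learned from past accuracy.
reputation = {"@field_worker": 0.9, "@local_news": 0.7,
              "@anon1": 0.1, "@anon2": 0.1}

def credibility(claims, default_rep=0.05):
    """Aggregate reputation-weighted support for each claim.

    claims: list of (source, claim) pairs.  A claim's score is the
    probability that at least one supporting source is reliable,
    treating sources as independent: 1 - prod(1 - rep_i).
    """
    unreliability = defaultdict(lambda: 1.0)
    for source, claim in claims:
        unreliability[claim] *= 1.0 - reputation.get(source, default_rep)
    return {claim: 1.0 - u for claim, u in unreliability.items()}

reports = [
    ("@field_worker", "police charge on Threadneedle St"),
    ("@local_news",   "police charge on Threadneedle St"),
    ("@anon1",        "Stephen Harper tasered"),
    ("@anon2",        "Stephen Harper tasered"),
]
for claim, score in credibility(reports).items():
    print(f"{score:.2f}  {claim}")
```

Under this toy model, the police-charge claim scores 0.97 while the taser rumour scores 0.19, even though both have two supporters: who reports matters more than how many.  Real vetting would of course also need to handle collusion, sock-puppets, and the cold-start problem of unknown sources.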


