Monday, January 29, 2007
Sunday, January 28, 2007
Incident: Homicide, Kenya
1/29/07 Update - CARE announced regional director Geoffrey Chege, a 25-year veteran of the organization, was killed during an attempted carjacking. Condolences to family, friends and colleagues.
Saturday, January 27, 2007
Reuters has a good summary article on the current levels of insecurity in Sudan entitled, Attacks on aid groups cripple Darfur relief.
Darfur is an interesting case study from a high-level, security management standpoint. I know of at least one large NGO that was advised to suspend operations in Darfur due to insecurity, but continues its programs mostly because senior management believes the public relations benefits outweigh the staff exposure to risk.
In my opinion, most humanitarian organizations don't do a very good job of clearly establishing and articulating acceptable levels of risk to staff. It would be useful for senior management to brush up on basic cost/benefit analysis, especially as it applies to staff members.
What are we willing to risk to achieve our mission? Is the death or rape of a staff member simply the cost of doing business? Is one death OK, but will two cause us to suspend operations? Is the humanitarian work we are performing really making a difference compared to the level of personal risk staff are experiencing? These are very hard questions, and they are asked all too infrequently.
Incident: Suspension, Guinea
Tuesday, January 23, 2007
InterAction Haiti Chatroom
Incident: Arrests/assaults, Darfur
Friday, January 19, 2007
Chad: Aid agencies still on war footing as insecurity continues
Wednesday, January 17, 2007
UN warns Darfur's aid operation may collapse
Incident: Ambush (fatalities), Iraq
Tuesday, January 16, 2007
NGO Security PowerPoints and Training Thoughts
I've successfully used these files in different classes over the years, but I've recently changed my way of thinking about security training and am moving away from overwhelming students with too much information (some irrelevant for their context, a lot that they end up forgetting). I now believe it is important to teach people the more generalized skill of making good decisions under stress and then teach them how to perform a small core of key, job-related tasks very, very well. To me, this is a more effective training approach in preparing people to deal with the realities of the field (or headquarters).
Monday, January 15, 2007
More Bird Flu
UN meet on humanitarian security
Saturday, January 13, 2007
Incident: Avian flu (fatalities), Indonesia
Thursday, January 11, 2007
Choosing Good Passwords
Incident: Ambush (fatality), Sudan
The Janjaweed's new clothes
Wednesday, January 10, 2007
Many people don't get the power of the Internet when it comes to information and achieving "real" transparency - which is much more than a nice-sounding buzzword used for marketing and public relations purposes. There are other popular disclosure sites such as Cryptome and The Memory Hole, but WikiLeaks is the first to take a very active and collaborative approach. It will be interesting to watch its debut in the coming months.
Tuesday, January 09, 2007
Incident: Shooting (fatality), Iraq
Friday, January 05, 2007
Aid workers beaten, raped in Darfur
Full Le Monde article in French here, also a rudimentary English translation (thanks to Google).
Readers who have additional information on the Gereida incident are encouraged to comment (anonymously if appropriate).
US PMC contractors and the law
Thursday, January 04, 2007
10 Steps to Creating Your Own IT Security Audit
While information security does tend to have more technical elements than most aspects of NGO security (which can be a little intimidating to some people), the basics are grounded in common sense.
Here's a good article that lists 10 steps for performing your own IT security audit. It's fairly easy to read and provides the fundamental questions to ask if you ever need to perform an IT-related security assessment. Having done computer security work for a number of years before becoming involved with the humanitarian community, I give the authors two thumbs up for covering most of the basics in a concise and easy-to-understand way.
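An audit like this boils down to a checklist of questions and a record of which ones fail. As a minimal sketch in Python (the five question areas below are generic examples of my own, not the article's exact list):

```python
# Illustrative IT security audit checklist runner.
# The questions below are generic examples, not the article's exact ten steps.

AUDIT_CHECKLIST = {
    "Passwords": "Are strong passwords required and changed regularly?",
    "Backups": "Are backups taken, stored off-site, and test-restored?",
    "Patching": "Are operating systems and applications kept up to date?",
    "Antivirus": "Is anti-malware software installed and current?",
    "Access": "Do staff have only the access their jobs require?",
}

def run_audit(answers):
    """Given {area: True/False} answers, return the areas needing follow-up.

    An area missing from `answers` counts as a failure - if nobody could
    answer the question, that is itself a finding.
    """
    return [area for area in AUDIT_CHECKLIST if not answers.get(area, False)]

# Example: a small field office that has passwords and patching under
# control, knows its backups are broken, and never checked the rest.
answers = {"Passwords": True, "Backups": False, "Patching": True}
print("Follow up on:", ", ".join(run_audit(answers)))
# Follow up on: Backups, Antivirus, Access
```

The point is less the code than the discipline: write the questions down, answer every one explicitly, and treat "nobody knows" as a failure rather than a pass.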
Wednesday, January 03, 2007
Journal of International Peace Operations
Camera memory card risks
UN fatalities in 2006
Foreigners advised to leave Gaza
Security officials said they have advised American and European nationals to leave because of a threat of further abductions.
Another security source said that the advice was in part directed at expatriate employees of UNRWA, the UN relief agency for Palestinian refugees. But UNRWA said it had no plans to withdraw its foreign nationals. More here.
Oil firm risked hostages' lives
Tuesday, January 02, 2007
Peter Sandman and Risk
One of Sandman's basic premises is Risk = Hazard + Outrage. I'll take the liberty of quoting him on how this differs from the more traditional Risk = Impact x Probability.
Risk assessors in various fields — from health and safety to insurance to business continuity — always end up with some variation of the “impact times probability” formula. Instead of “impact,” other formulations say “magnitude” or “consequence”; instead of “probability,” they sometimes say “frequency.” But everyone agrees that the technical and financial calculation of risk (as opposed to the cultural or psychological calculation) means multiplying two factors: how bad it is and how likely it is.
As you know, this is easier said than done. Measuring probability is a methodological can of worms, especially for awful things that have never happened. (How do you even list all the awful things that have never happened, much less calculate their probability?) Even for chronic risks, there is often very high uncertainty surrounding a probability estimate — consider, for example, the probability that low-level emissions of dimethylmeatloaf contribute to the incidence of birth defects. But at least we know exactly what we’re trying to measure. Measuring risk magnitude (“impact” in your terms) is a conceptual and moral can of worms. What is the relative magnitude of the death of a human versus the extinction of an animal species? The death of a healthy child versus the death of a virtuoso violinist versus the death of a terrorist versus the death of an Alzheimer’s patient?
Nor is there universal agreement that risk magnitude and risk probability deserve equal weighting, as the formula suggests. Most people intuitively think that really horrific outcomes (the end of life on earth, say) are unacceptable even if they have commensurately low probabilities. We prefer a lower-magnitude higher-probability risk, even though the magnitude-times-probability product is the same. But at the other end of the distribution, we are also profoundly uncomfortable with sacrificial lambs: If we just let them kill this one person, the odds of a good outcome will improve markedly for the rest of us.
Even when both magnitude and probability are well-established, people have very different responses to mathematically equivalent risks. Some of this is attributable to perceptual heuristics — some risks are more vivid and emotionally resonant than others, for example. But some of it is real judgment that risks deserve to be assessed by more standards than just their magnitude and probability. Most people know that they are likelier to die in a fatal car accident than in a terrorist attack. Most people know that a million dollars spent on highway safety will reduce mortality more than a million dollars spent on homeland security. But compared to highway safety, terrorism isn’t just more vivid, more emotionally resonant, less familiar, more dreaded, and the rest. It is also more important — morally, politically, culturally. We are willing to spend more on it per life saved.
All these complexities, discontinuities, and inconsistencies are the venue of what I usually call outrage. In my “Risk = Hazard + Outrage” formula, my “hazard” is what you mean by “risk” — that is, magnitude times probability. “Outrage” is all the other considerations that make us assess some risks differently than others, including the considerations that make us right to do so. When I first coined the formula, I was working mostly on environmental controversies, and “outrage” nicely captured the mix of anger, righteous indignation, and worry that people felt about industrial pollution. It isn’t really the right word for some other sorts of risks. The same “outrage factors” explain why people are more attentive to West Nile Virus than to flu, notwithstanding the fact that flu is by far the bigger risk technically. But people are more fearful than angry about West Nile Virus. It feels off to call it “outrage.” Years ago, Sheldon Krimsky and Alonzo Plough wrote about “technical rationality” versus “cultural rationality.” That captures the distinction in a more universal way than “hazard” versus “outrage.” I tend to stick to my own formulation; it’s shorter and it’s mine.
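Sandman's distinction can be made concrete with a toy calculation. The Python sketch below (every number is invented purely for illustration) compares two risks whose technical "hazard" is identical but whose perceived risk diverges once an outrage term is added:

```python
# Toy comparison of two risks with equal "hazard" (magnitude x probability)
# but different "outrage", per Sandman's Risk = Hazard + Outrage.
# All magnitudes, probabilities, and outrage scores are invented for illustration.

def hazard(magnitude, probability):
    """The traditional technical risk: magnitude times probability."""
    return magnitude * probability

def perceived_risk(magnitude, probability, outrage):
    """Sandman's formulation: hazard plus an outrage term."""
    return hazard(magnitude, probability) + outrage

# Risk A: high-magnitude, low-probability (think terrorism).
# Risk B: low-magnitude, high-probability (think road accidents).
risk_a = hazard(magnitude=1000, probability=0.001)  # 1.0
risk_b = hazard(magnitude=10, probability=0.1)      # 1.0

assert risk_a == risk_b  # technically equivalent risks...

# ...yet vivid, dreaded risks carry far more outrage, so the
# perceived risks - and the responses they provoke - differ sharply:
print(perceived_risk(1000, 0.001, outrage=5.0))  # 6.0 - drives spending
print(perceived_risk(10, 0.1, outrage=0.5))      # 1.5 - largely ignored
```

This is why, as Sandman notes, a million dollars on homeland security can be preferred over a million dollars on highway safety even when the multiplication says otherwise: the outrage term is doing the work the product cannot capture.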