SECURITY ISSUES are the Cinderella of field-based research, with organizations tending to gravitate toward either severe neglect or over-the-top fanfare—with advisors, trainings, forms, processes and procedures creating the appearance of full control, when no such thing is possible. Security policies often verge on the absurd: embassies write off whole areas as “red zones” when there is much grey and green within them; organizations commission expensive drills on deadly violence while ignoring equally or more important first-aid skills; and expats develop a bunker mentality that makes them even more vulnerable, by disconnecting them from their environment.
The difficulty in achieving the right balance on security issues is understandable, given the complexity of the problem: threats are multiple, dynamic, often ambiguous, and generally specific to each individual in a given context, making it difficult to design a standardized response. Moreover, addressing security issues is a matter of aligning multiple, very different logics. Shifting dynamics on the ground; informal personal routines; and formal organizational frameworks all play a role, and often make for awkward bedfellows. The danger comes from the fact that the first item may change faster than the second, which itself may quickly overtake the last.
Given the impossibility of a one-size-fits-all template, the emphasis here is on expounding an overall methodology that can be adapted to a particular context. Two essential notions are awareness—which hinges on the researcher in the field—and responsibility, which ultimately rests with whomever assumes a supervisory role. These, in turn, will require flexibility and effective communication in the moment, but also a measure of forethought and intellectualization: the more a researcher and manager have analyzed the risks faced in the field, the better their chances of keeping steady when the going gets rough.
Field research in precarious circumstances will generally rest upon a “system” that researchers themselves develop—consciously or otherwise. A researcher’s behavior is informed by a set of assumptions about the environment; various resources—including friends and colleagues—used to assess potential threats; and past experiences. Based on these, a researcher will define rules on where to go and not go; whom to rely upon or avoid; how to communicate with various networks; in what form and place to keep notes, etc. The effectiveness of the “system” that ties together all these decisions will improve considerably through self-awareness, on the part of researcher and manager alike. Recurring features are therefore important to consider:
- There are as many systems as there are people and situations;
- Systems are very personal and often instinctive—they may not be strictly rational, but rather draw on beliefs, fears and comfort zones that have little logical basis;
- They develop over time, through incremental trial and error;
- They tend to remain informal—even when formal policies are enforced—because the latter are subject to being ignored, re-interpreted, or overtaken by spur-of-the-moment decisions and unforeseeable events;
- Last but not least, systems are intrinsically hard to discuss openly, for a variety of reasons—ranging from distrust, superstition and paranoia (which are part-and-parcel of working under threat) to conflicts of interest with top-down policies, through to the embarrassment felt at their improvised or pretentious character.
Indeed, researchers in challenging environments sometimes conclude that “no one understands” the exact nature of what they must handle, and therefore keep mostly silent about their modus operandi. That is, precisely, the greatest liability: systems are best intellectualized and enhanced through constructive discussion with others. General lessons-learned, from a variety of contexts, include the following:
- The biggest threat may well be… yourself. Your profile and/or the work you do introduce an uncertain variable in an unsettled environment, where people typically play by tenuous rules of the game. Many hazards stem from mistakes you make, more than from events beyond your control. Not only may you hurt yourself, but you may also harm others. The upshot is that a large part of the risk relates to something we do govern: our own course of action.
- No amount of experience guarantees results. A frequent, hazardous misconception is that we get better and better at dealing with insecurity. There is no way of knowing in advance how we will respond to a novel threat, and only limited cause to think that we will react appropriately to a familiar one. A repeat may, for instance, bring up an element of trauma that will radically change the equation. As such, threatening environments demand (and impose) humility.
- Knowing your weak-points is your strength. Paradoxically, the hardest part of working under dangerous conditions can be throwing in the towel. You’re good at what you do—and excited by it. You can’t leave as things are heating up, getting more meaningful professionally. How could you justify giving up on a hunch… before something actually happens to you? Well, that’s precisely the point. The most durable researchers are those that have the courage to turn back when they feel they have reached their personal limits. Exhaustion, overreaction to minor events, a confused sense of dread, are sufficient reasons to pull out. You set the boundaries at least as much as the environment does.
- Movement is a core component of any system. Your awareness is a function of the interface you nurture with your environment, which itself depends on your ability to physically interact with people. Locking yourself into your first circle of friends—or, worse, your home—is a sure sign that you should be seeking ways of pulling out entirely instead.
- Your system works until it doesn’t. The instincts, routines and networks you hone in a certain time and space may be perfectly efficient, and still become a liability as soon as you change locations, roles or timeframes. In particular, deteriorating situations have a certain rhythm demanding sensitivity. You are working in an authoritarian regime, say. Under normal circumstances, only occasional reviews of your assumptions—to reflect the mood of the security services or the domestic balance of forces—are required. If popular discontent rises seriously, you may have to reappraise weekly. When an uprising breaks out, daily updates become mandatory. If you are caught up in the early stages of a war, time can further speed up; you may live hour to hour, if not minute to minute. And then, usually, things slow down again, into a “new normal” that can be both violent and relatively predictable.
- Know where you are and where you’re heading. Most dangers do not occur out of the blue: they ebb and flow according to changes, big or small, occurring within your environment or in your relationship to it. The appropriate response to a dynamic situation resides in understanding the moment you are in by contrast with the phase you are moving into: what is the transition about? This logic also applies to physical movements. Fieldwork, obviously, is safest in familiar contexts. It makes complete sense to explore new areas, but on condition that you are guided and endorsed by people who know them.
- Always analyze the unusual. Having grasped the essentials of your environment, it is crucial to stop and appraise whatever may seem out of the ordinary: bizarre calls, unjustified aggressions, a sudden silence in a usually noisy place, and so on. The first goal is to calmly assess whether there is any reason to pay further attention, conscientiously avoiding the two extremes of denial and paranoia. Most often, it is impossible to come up with a quick, satisfactory interpretation; you can then default to a slightly higher level of alertness, until further developments either confirm or dispel your concern.
- Go high-end or low-profile. Staying safe in a precarious environment requires one of two things: insulating yourself with bodyguards, armored vehicles and fortified dwellings; or, on the contrary, shunning all of the above in favor of integration. For a researcher—limited in resources and reliant on face-to-face engagement—the choice is usually clear: invest in friends, neighbors, and a broader network of contacts, who will help you navigate the risks that arise, and provide hints when you get out of your depth. You don’t need to blend in seamlessly if you’re not from the area, and you certainly don’t gain from pretending to be something that you’re not; what saves you is trust-based, consistent, genuine and diversified relationships. Small gestures of respect—e.g. modulating your language, appearance and behavior—will help you be accepted for your differences.
- De-romanticize “intelligence.” In authoritarian settings, researchers—especially if they are working on sensitive topics—are often tempted to play spies. They may encrypt documents and emails, use ploys to shake any “tails,” and generally assume that they are under surveillance. Yet efforts to hide from security services will do more harm than good, raising red flags and drawing more attention than would otherwise be the case. A safer bet, then, is to be as transparent as possible on innocuous subject matter while concealing only what is compromising. That often isn’t much, and may boil down to the names of interviewees in transcripts, rare pieces of truly damning information, and occasional contact details that may be more problematic than most.
- Tap the savvy that surrounds you. Although each situation is specific, there is no need to constantly reinvent the wheel. The diversity of systems mentioned above is a treasure-trove of ideas you can emulate, tweak, or reject. Many organizations possess enormous collective know-how, which remains underused for lack of discussion and excessive focus on procedures. You will benefit from defining, thanks to the experience of others, the true nature of a particular threat. War, for example, poses above all issues of cash and communications, to inform and fund the constant and costly adjustments it entails. Kidnapping, by contrast, is almost entirely about psychological resilience and external support systems, because there is very little hostages know about their own situation and even less they can do about it.
- Proactive communication is essential. Among the worst things you can do is stand alone in the face of your concerns, fears or actual threats. You’ll probably need help when you find yourself in a fix, and those who can lend a hand will be most effective the more and sooner they know. Your location, travel plans, meeting schedules, contact details, and personal risk assessments can be critical. Whatever it is you must share, and whomever you decide to inform, do so consistently. This also, critically, means building trust between researcher and manager.
From the manager’s perspective, it would be naïve to assume that the person conducting dangerous fieldwork is entirely responsible for his or her own fate; if a researcher is arrested, wounded, kidnapped, killed, or causes harm to people around them, the manager is answerable, in both practical and moral terms. At the same time, the solution does not reside in creating formalistic policies and procedures that tend to focus more on evading responsibility than assuming it.
As a conscientious manager, bear four duties in mind. First, you have an obligation to start with listening. As a rule, the person on the ground understands the environment best, senses how it changes, draws on available resources to manage such inflexions, and knows their own limits—or, in other words, has a “system.” Your surest way of getting a grasp of what is at stake is to hear your staff out and learn from them. This is also essential to gaining their trust, in return for deferring to their experience and judgment.
A second imperative is for you to adjust. If procedures and policies are already enforced within your organization, they must be discussed, explained, and tailored to each and every specific case. Managing security is an iterative process, demanding solutions that don’t generally preexist.
The manager’s third duty is to step in when necessary. Crises experienced in the field sometimes create the need for a steady point of reference—a turning point the manager must prepare for. There can be several reasons to intervene. Any breach of trust on either side of the staff-manager relationship must be overcome before any further research can be conducted. Wavering on the part of the person in the field—conveying a sense of unease, confusion and indecisiveness—is another sign that a manager should put his or her foot down, propose options or make decisions from outside.
Fourth, anticipation is a must. Although the researcher may fail to imagine all possible threats and precautions, the manager has no excuse. A responsible supervisor must lay out, explicitly, the limits regarding what liabilities the organization is willing to bear, at what cost, and to what consequences. He or she must likewise articulate a deliberate policy regarding what insurances—health, life, travel, kidnapping—will or will not be taken out. In other words, the manager should have answers to every possible version of the question “what if?” In the absence of good answers, he or she must be comfortable taking the risk of heading into the unknown.
Illustration credit: John Bauer, The boy who could not be scared, via Wikipedia / licensed under CC.
© Copyright 2017 Synaps. All rights reserved. All use subject to Terms and Conditions of Use set forth here.