How do we know when conservation crimes are occurring? The answer to this question very much depends on which detection methods we employ. To get a better handle on this, I offer you my own typology of detection methods (bottom of the page), which is itself an adaptation and refinement of existing typologies.
The Evolution of a Typology
Over the past decade, a number of typologies have been created to help us think about methods to detect the illegal use of natural resources. Notably, they all build on themes first presented by Gavin et al. (2010) after conducting a review of published literature. In this original typology, eight different detection methods are discussed in terms of their possible data outputs (what, where, who, why), relative labor demand, relative technology and training requirements, data bias, and possible controls of data bias. The specified methods in the Gavin et al. typology are: law-enforcement records, indirect observation, self-reporting, direct observation, direct questioning, randomized response technique, forensics, and modeling.
Since this contribution, the Gavin et al. typology has been further refined by two additional publications (Arias 2015; Bergseth et al. 2015), but the core concepts have largely remained the same. Bergseth et al. (2015) modified the typology for application to marine protected area management, while Arias (2015) sought to expand it to consider additional data outputs, including the full five Ws (who, what, where, when, why), other data types (e.g., impacts), and common measurement units.
In spite of these updates, I still find Gavin et al. (2010) to be the best published typology of detection methods. But I honestly have yet to find a typology whose design incorporates one of the most important design principles for typologies: Mutually Exclusive, Collectively Exhaustive (MECE). That is, a good typology should help criminologists and conservationists discuss detection methods with both terminological specificity and confidence that most methods have been considered.
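To make the MECE principle concrete, here is a minimal sketch of how one might audit a candidate typology. The method-to-technique mapping below is purely illustrative (assembled from methods mentioned in this post, with hypothetical technique names); it is not my typology, just a demonstration of the two checks.

```python
# Illustrative sketch: auditing a typology of detection methods against
# the MECE principle. The mapping below is hypothetical, not the final
# typology offered in this post.
from itertools import combinations

# Candidate typology: each method "owns" a set of concrete techniques.
typology = {
    "direct questioning": {"structured survey", "randomized response technique"},
    "direct observation": {"ranger patrol", "camera trap"},
    "law-enforcement records": {"arrest record review", "seizure database query"},
}

# All techniques we expect the typology to account for
# (used to test collective exhaustiveness).
universe = set().union(*typology.values()) | {"satellite imagery"}

def mece_violations(typology, universe):
    """Return (overlaps, uncovered): method pairs whose technique sets
    intersect, and techniques in the universe no method accounts for."""
    overlaps = [
        (a, b) for a, b in combinations(typology, 2)
        if typology[a] & typology[b]
    ]
    uncovered = universe - set().union(*typology.values())
    return overlaps, uncovered

overlaps, uncovered = mece_violations(typology, universe)
print(overlaps)    # [] -> the methods are mutually exclusive
print(uncovered)   # {'satellite imagery'} -> not collectively exhaustive
```

Here the candidate typology passes the mutual-exclusivity check but fails exhaustiveness: remote sensing (represented by the hypothetical "satellite imagery" technique) has no home, which echoes the argument below for adding it as a method.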
Comparing Methodological Perspectives
The challenges of developing a good MECE typology are well illustrated through comparative analysis of the three published typologies in terms of their overlaps and divergences. In particular, three themes stand out after careful review:
- An Expanding Set of Methods. In their updates to Gavin et al. (2010)'s typology, both Arias (2015) and Bergseth et al. (2015) implicitly argue that two important detection methods - expert opinion and remote sensing - should also be included. For my own typology, I considered these two methods to be broadly relevant and to have limited overlap with other detection methods, although expert opinion can strongly support modeling (e.g., Pitcher et al. 2002).
- Continued Conflation of Methods with Techniques. The typology provided by Arias (2015) implicitly argues that Gavin et al. (2010) and Bergseth et al. (2015) both incorrectly consider techniques - the nuts and bolts of how a method is applied - as themselves specific types of methods. I agree with Arias here. Gavin et al. (2010) distinguishes the "randomized response technique (RRT)" from "direct questioning", yet RRT is only really used within direct questioning (i.e., large-scale surveys). Bergseth et al. (2015), meanwhile, identifies "the false consensus effect" and "flight initiation distance" as methods, yet these concepts seem far more relevant to the analysis of compliance data than to its collection.
- Reduced Comparative Description. Where Gavin et al. (2010) felt comfortable providing detailed comparative descriptions of methods, including the biases of their data outputs and their labor and technological requirements, Bergseth et al. (2015) and Arias (2015) both offer relatively scaled-down comparative descriptions. This implies they find it difficult to compare methods outside of context, and I would agree with this perspective. However, I would suggest that neither update is sufficiently critical of our current comparative knowledge. For instance, Arias (2015, p. 138) indicates that many methods - including law enforcement records and direct observation - cannot be used to understand the "why" of conservation crime. Yet this claim is very much dependent upon the researcher's theoretical perspective: spatial analysis of law enforcement records combined with direct observation of crime hot spots may well reveal important environmental factors that enable conservation crimes.
A "BAGG" Typology of Detection Methods
To help move the conservation field forward, I offer the following typology, which builds upon the typologies of Bergseth et al. (2015), Arias (2015), and Gavin et al. (2010), and incorporates a few ideas and design preferences of this intrepid author (Gibson). My perspective is that this core typology is more comprehensive, has less internal conceptual overlap, and better avoids making unfounded comparative claims than past typologies.
What do you think? Feel free to leave your comments below.
Looking to the future, my expectation is that conservation crime methodologists may further develop our knowledge of detection methods through design of sub-typologies of techniques. For instance, one sub-typology I find useful in my teaching is a list of Direct Questioning techniques (e.g., large-scale structured surveys, in-depth interviewing, informal questioning).
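One way such sub-typologies might be kept organized is as techniques nested under their parent method. In this sketch, only the Direct Questioning techniques come from the text above; the structure and lookup helper are a hypothetical organizing convention, not an established scheme.

```python
# Illustrative sketch: technique sub-typologies nested under methods.
# Only the "direct questioning" entry is drawn from this post; the
# overall structure is a hypothetical convention.
sub_typologies = {
    "direct questioning": [
        "large-scale structured surveys",
        "in-depth interviewing",
        "informal questioning",
    ],
}

def techniques_for(method):
    """Return the technique sub-typology for a detection method,
    or an empty list if no sub-typology has been defined yet."""
    return sub_typologies.get(method.lower(), [])

print(techniques_for("Direct Questioning"))  # the three techniques above
print(techniques_for("Forensics"))           # [] -> sub-typology still needed
```

The empty-list fallback makes the gaps visible: any method returning no techniques is a candidate for future sub-typology work.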
And looking even further to the future, it is my hope that additional guidance may be developed on how to best combine detection methods to "triangulate" what is truly occurring.
- Arias, A. (2015). Understanding and managing compliance in the nature conservation context. Journal of Environmental Management, 153, 134-143.
- Bergseth, B. J., Russ, G. R., & Cinner, J. E. (2015). Measuring and monitoring compliance in no-take marine reserves. Fish and Fisheries, 16(2), 240-258.
- Gavin, M. C., Solomon, J. N., & Blank, S. G. (2010). Measuring and monitoring illegal use of natural resources. Conservation Biology, 24(1), 89-100.
Header Photo Credit: Ranger in Tarangire National Park, DepositPhotos.com