Find the Right Open Source Research Tools With Bellingcat’s New Online Investigations Toolkit
Have you ever struggled to find a tool that does exactly what you need? Do you know the feeling of spending hours figuring out how to use a tool, only to realise that the key features you were interested in no longer work, or that a previously free product has turned into a paid one that costs more than you can afford?
You are not alone. More than 80 percent of open source researchers who participated in two Bellingcat surveys indicated that finding the right tools can be challenging.
This is where our new Online Investigations Toolkit comes in: it not only helps you discover tools in categories like satellite imagery and maps, social media, transportation or archiving, but is also designed to help researchers learn how to use each tool by providing in-depth descriptions, common use cases and information on requirements and limitations for each toolkit entry.
Most of the tools included can be used for free.
This is the first time in Bellingcat’s 10-year history that we are opening our toolkit to contributions from the wider open source researcher community. A dedicated group of volunteers, our Toolkit Maintainers and Guardians, help us keep this toolkit up to date and are involved in further improving it. Since it is still a work-in-progress, we expect the number of tools in the toolkit to grow over time.
Where Open Source Researchers Search For Tools
Open source researchers frequently use tools, from satellite imagery and flight tracking websites to online business registries and social media scraping services, in their work. Some of these are created by big tech companies, while others are built and shared by volunteers. They might come in the form of desktop tools, command line tools on code sharing platforms like GitHub, or browser extensions.
This fragmented tool environment can be confusing, making it difficult or time-consuming for researchers to figure out which tools to use. In a survey conducted by Bellingcat in 2023, only 15 percent of 153 participants indicated that it was easy for them to find the right tools for their research. A previous Bellingcat survey from 2022 with more than 500 participants yielded very similar results. This is despite the fact that there was a long list of toolkits for open source researchers out there at that point in time, including the previous version of our toolkit, which was available on Google Sheets.
According to Bellingcat’s surveys, most open source researchers use search engines to find tools. Other top methods include going through websites and blogs that present tools and reading the publications of other researchers to see what tools they use.
To get a deeper understanding of how researchers find and use tools, we interviewed 40 open source researchers from various countries and with different levels of experience and backgrounds. What we learned from those interviews is that the methods open source researchers use most frequently are not necessarily the ones that work best for them.
For instance, despite search engines being the top destination for researchers in need of tools, this method does not always lead to the desired results. Our interviewees indicated that it could be difficult to come up with the right keywords: “You sort of know intuitively that certain tools must exist, but you don’t quite know what words to go by. So there’s a lot of feeling around in the dark for search terms to try and find tools that work,” one person explained.
Even if researchers manage to define the right keywords, they struggle to get a sense of whether a tool showing up in the search results might meet their needs. Since tool providers tend to present their tools in an overly positive light, it is often difficult for researchers to understand a tool’s limitations based on the descriptions.
Our interviews and surveys showed that direct tool recommendations from others in the industry are an important way for researchers to make sense of the diverse tools available online. “In my network, I know a lot of journalists who do this kind of OSINT (open source intelligence) work,” one interviewee explained. “Just through word of mouth, I’m able to get recommendations for what the most up-to-date tools are.”
Other interviewees indicated that they turned to websites, blogs and newsletters to get recommendations for tools. “If someone has used a tool and there’s a blog out there that tells me how to use it, then I will try it,” one open source researcher said.
No matter where they find tools for their research, many researchers said they struggled with keeping track of tools that might come in handy later on. “Sometimes I feel like ‘Oh, that might be an interesting one’, but I don’t use it now and when I need to find it, I just really don’t remember,” one interviewee said.
Toolkits are meant to be a solution to this. They are usually organised according to different tool categories and curated by a person or organisation hoping to provide some structure within the open source research tool environment.
Googling keywords like “OSINT” and “toolkit” brings up a long list of toolkits that are all available for free. However, barely any of the open source researchers that we interviewed said toolkits were their preferred way of (re)discovering tools while doing research.
A Toolkit Wishlist
The number one reason our interviewees gave for not using toolkits often was that they felt most toolkits are not regularly kept up to date.
Eight out of our 40 interviewees suggested adopting a collaborative approach to keep toolkits relevant, and therefore useful, for researchers. One open source researcher said that in her view, if any one individual was in charge of keeping a toolkit regularly updated they would “probably go insane” doing it, so “having the courage to let people own different bits of it” would be the only feasible way, she said.
Another popular request by open source researchers was to include or link to guides that explain how to use each tool. “If you are developing a toolkit, then I would personally expect to have some kind of explanation on how to use it,” one of our interviewees said. Many toolkits, however, do not focus on providing or listing guides and are therefore only of limited use for open source researchers.
In addition, some researchers stressed the importance of receiving clear information on each tool’s limitations and costs, if any. “I prefer accessibility and open source tools, but for sure, if a paid example exists, I like to know that it exists. I just like that to be really clearly marked,” one interviewee explained. Another person considered it important to see right away whether a tool has “a thousand dollar license that’s unsustainable for small organisations”.
There are open source researchers all around the world who speak a wide range of languages, so several of our interviewees pointed out that toolkits should not only take the needs of researchers in English-speaking and Western countries into account. “Expansion to cover more non-Western social media platforms is always appreciated,” one interviewee said. Some of our interviewees suggested offering sub-categories with tools for specific regions or countries.
Finally, our interviewees said it would be good to receive guidance on choosing the right tools from within a toolkit. Thirteen out of our 40 interviewees were enthusiastic about the idea of an AI assistant that would either ask them guiding questions or allow them to type in questions or pieces of information they already have for it to provide specific tool suggestions based on this input.
Using Bellingcat’s Collaborative Toolkit
Based on these findings, we designed a completely new version of Bellingcat’s Online Investigations Toolkit. It is still a work in progress and we expect to expand it over time with the help of the wider open source researcher community.
If you click on a specific category, for instance “Maps & Satellites” → “Maps”, you will see all available tools listed in alphabetical order. You will also see a short description of each tool and information on whether it is paid or can be used for free. Tools with some free and some paid features are marked as “partially free”.
On the right-hand side, you will see a column called “Details”. Click on “Details” next to your tool of interest (if available) and you will be taken to the section into which we have invested most of our effort: an in-depth description of the respective tool, with tips and tricks on how to use it.
The tool descriptions are written by our volunteer community, Bellingcat staff and members of the wider open source researcher community. They each vary in style and length, but all follow the same structure:
- URL: The URL to the tool
- Description: A full description of the tool, including the answer to the question: what problem does it solve?
- Cost: Is the tool free, partially free or paid?
- Level of difficulty: How difficult is it to learn how to use the tool?
- Requirements: Are there any requirements for using the tool?
- Limitations: What limitations does the tool have?
- Ethical Considerations: What ethical considerations might be relevant when using the tool?
- Guides and Articles: Links to guides on how to use the tool, or links to research that was done with it.
We developed this structure based on the priorities that were expressed during our interviews with open source researchers. It aims to cover the aspects most relevant for researchers when making a decision on whether they want to use a tool for a specific research task.
Our toolkit also includes a natural language search interface powered by OpenAI. To use it, just type a question into the search box in the upper right corner of the screen and see what you get. For example, you can ask it for the best tools for beginners.
The natural language search can also suggest tools for very specific tasks, and you can even use it to get step-by-step instructions for specific research tasks.
Please be aware that the answers are based on the tool descriptions in our toolkit, so their quality heavily depends on whether the information you are asking about is represented there. We expect the number of tool descriptions, and therefore the information available in the toolkit, to grow over time.
How You Can Contribute
Bellingcat’s new Online Investigations Toolkit is collaborative. We aim to create a resource that brings together the joint wisdom of the open source researcher community to make the task of finding tools less daunting for everyone.
The backbone of our toolkit is a select group of Bellingcat volunteers, who fall into two major groups. Our Toolkit Maintainers write and maintain tool descriptions; if one of “their” tools stops working or adds a new feature, they are responsible for reflecting these updates. Our Toolkit Guardians take on even more responsibility, keeping an eye on a whole category of tools and supporting us in shaping the future development of the toolkit to make sure it meets the needs of open source researchers.
We welcome contributions from the wider open source researcher community as well. If you would like to contribute, there are several ways:
- Provide feedback on our toolkit. This helps us gain new ideas on how we can make this toolkit even better.
- If you feel that a specific tool is missing, you can submit a description of it via this form. We cannot guarantee that we will include it in the toolkit, but we promise to seriously consider every suggestion.
- If you are representing a newsroom, a university or an independent research organisation and would like to contribute to this toolkit, feel free to get in touch via toolkit@bellingcat.com.
- You can also apply to join Bellingcat’s Volunteer Community. If selected, you will be able to contribute to this and many other projects as part of an active group of open source research enthusiasts.
Bellingcat’s new Online Investigations Toolkit was developed by Johanna Wild during her 2024 Nieman-Berkman Klein Fellowship in Journalism Innovation at Harvard University. Cooper-Morgan Bryant, a student research assistant (Harvard), contributed to the user research. Viktorija Ignataviciute and Galen Reich contributed to defining the volunteer involvement, with Viktorija Ignataviciute also supporting the toolkit volunteer community on an ongoing basis.
Bellingcat is a non-profit and the ability to carry out our work is dependent on the kind support of individual donors. If you would like to support our work, you can do so here. You can also subscribe to our Patreon channel here. Subscribe to our Newsletter and follow us on Twitter here and Mastodon here.