This article is broken into two sections. The first details the past decade of protest communications and outlines how a cyber protest network operates in 2020. The second details OPSEC and OSINT techniques for the following apps: Telegram, Signal, WhatsApp, Facebook Messenger, SMS, Snapchat, and Twitter.
History of Protest Communication
In January 2011 the Arab Spring, as it would later be termed, saw its first success in Tunisia with President Zine El Abidine Ben Ali's 23-year rule coming to a spectacular end. Protestors who had taken to the streets during December 2010 forced Ben Ali to flee to Saudi Arabia, where he would watch his autocracy crumble. While the country still struggles – particularly with human rights issues – it remains the sole democratic nation in North Africa, with over 100 political parties. Aside from the sweeping demands for democracy driving people into the streets, there was another factor that had not been present in past protests and revolutions: social media.
The Tunisian Revolution's use of social media helped disseminate information post facto, although the role of social media in organising the Arab Spring is often debated. PONARS Eurasia Policy Memo No. 159 gave credence to the use of social media communication, but downplayed the idea that there was any direct lesson or trend. Sources inside the United Kingdom's Ministry of Defence indicate that initial reports downplayed the role of social media, and in retrospect, still do. The majority of social media use was not cohesive planning between activists. Rather, activists used messaging to amplify their message, attract broader interest, and activate people outside of Tunisia.
The momentum poured into Egypt, where social media did turn online activism into street activism. A call on Facebook turned one event into several more. Once people were on the streets, traditional grassroots organisation spread among those who would never have arrived without the calls on Facebook. The momentum spread with urgency, and people stayed out on the streets.
Still, it was passion, and not social media, that kept people on the streets. Several more years would pass for online organisation to outweigh traditional grassroots organisation.
In the West, Occupy Wall Street in 2011 was the first movement that saw heavy influence from social media. And just as in Tunisia and Egypt, it served chiefly as a recruitment tool for generic activism (meet here and march), an ideological torch, and a social connector. Social media was, in essence, acting as the modern equivalent of the Salons de Paris and political clubs that fueled the French Revolution, or the Abriksovok Candy Factory that was an ideological symbol of the Bolshevik Revolution.
The early days of social media influencing protests were marred by the fact that only a few understood how to use these tools for effective and secure communication, much less how to reach an intended audience. 2011-2014 were the years protestors practiced marketing their actions and messages anonymously and securely. Most who dabbled were ultimately monitored by federal authorities and let information slip, as their only goal was to spread it like wildfire.
Over the next five years internet anonymity guides were tested and perfected, OSINT became mainstream across multiple disciplines, and in the wake of the Snowden leaks, secure messaging came to be seen as a human rights issue. All of these factors contributed to a situation where an app named Telegram rose in popularity for everyday use, and became a feature of the Hong Kong anti-ELAB protests (more broadly, the 2019 Hong Kong pro-democracy movement).
In Hong Kong, Telegram was a central communications channel for protestors: an app easily loaded onto anyone's phone that would, theoretically, allow people to remain anonymous. At first the app was fairly impenetrable by police and authorities, although this would quickly change. Telegram has said it is refusing all data requests from the Hong Kong Police, but that has not stopped the Hong Kong Police from identifying alleged Telegram administrators, arresting and charging them, then shutting down their channels for violating the National Security Law.
Inside of Telegram, channels would host information about calls to action, police movement before and during the protests, activist tips such as how to prepare for tear gas and what to do/not to do during a protest, and a fast-paced newswire to match the city’s ever-changing landscape. As more and more of Hong Kong’s population began to have an inherent distrust of the police, observation channels monitoring police activity became some of the most popular and cohesive channels. Channel administrators would receive and verify pictures and locations of police, which in turn would be used to either signal people away from an area, especially as arrests became increasingly randomized, or to an area if a smaller protest was being disrupted.
The other major benefit of the Hong Kong Telegram network was the protesting guide channels and the Yellow Economic channels. The protesting guides provided details on how to secure Telegram, what apps to be wary of, and how to maintain clean OPSEC in a mobile environment. These channels also made people aware of what to prepare in case of injury, where medics could be found, and, more importantly, what not to do to make an injury worse. Channels would prepare people with knowledge of proper gas masks, goggles, and uniforms. Methods of safe protesting, distinguishing tear gas, and building protective barricades were generated and rapidly shared, meaning the protests could evolve not over a week but within a single day. Due to the popularity of these channels, they unintentionally initiated a grassroots movement for proper mask wearing and hand sanitizing during the early stages of the COVID-19 pandemic; however, channels would occasionally fall for the conspiracy theories circulating at the time.
The Yellow Economic channels referred to the “yellow ribbon” shops that supported the pro-democracy movement, as opposed to the “blue ribbon” side that supported the Hong Kong government and police. Although the passing of the National Security Law forced this “yellow” shop network underground, organising an economy around businesses that supported the pro-democracy movement was essential as a method of ‘everyday’ protesting. Patronizing these yellow restaurants also allowed them to continue providing free meals and cover for protestors during weekends.
The Telegram network model has since been adopted around the world: it has become popular in the United States of America for organising the ongoing “Black Lives Matter” protests, less popular but important in Lebanon, gained some traction in a number of Latin American countries, and become an important messaging tool for the Belarus protests. Across the world the channels are very much the same – giving different people different ways to protest, showing how to protect oneself, and passing along protest propaganda.
The remainder of this article will focus on different types of communication and working through OSINT scenarios.
Apps with Inherently Strong OPSEC
Telegram and Signal are the two apps with inherently strong OPSEC. While two is a very small number, most other apps are designed to profit off user information or to put UI, experience, and entertainment above security. While those apps can be ‘secured’, it is far safer to trust environments that protect the user by default and require human error to break OPSEC. Other apps must be carefully maintained to establish even half-decent OPSEC environments.
However, any environment is only as secure as the person allows it to be.
Warning – there is a lot to learn about Telegram. I cannot explain every bit of it here, and sometimes the best way to learn is through experience and asking questions.
Operating on the principle that an application is only as secure as the environment it is in, start using Telegram with a burner phone. This is particularly important in the United States, where phone numbers are often tied to convoluted, multi-year plans and digital SIMs (eSIMs), which tie a phone number to a specific phone, are in vogue. Still, plenty of phone companies offer SIM cards that can be purchased for a relatively reasonable price – and that price is much better than putting oneself, and everyone else in the group, at risk. For reasons that will become evident in the OSINT investigation section below, a Telegram group is only as secure as its weakest member.
Complete the phone setup via a café's wireless network, not at work or home, which could leave data vulnerable. Phones broadcast a MAC address when looking for a wi-fi connection, which means that after connecting to a network once, the phone will keep signalling for that network until it has been deleted from the phone.
One last step: download a VPN. Proton VPN is a strong recommendation, as they do not keep logs on their customers and their code is open source, which allows quick bug detection. There is a free option as well, but the premium subscription is worth the money if you are doing OSINT investigations regularly.
No, this is not an ad. Recommending Proton VPN is based on my own experience and if you have a personal favourite, please let me know!
After securing the phone's network with a VPN, start using Telegram on the burner phone and keep the phone as clean as possible. Telegram has a web browser version and a desktop version; however, web browsers can easily become tainted with malware, and desktop environments should be scrubbed before utilising Telegram or beginning any OSINT investigation. For those extra worried, learn how to use a virtual machine and install a simple version of Ubuntu for an extra layer of security if using desktop Telegram is a must.
The first step to take is something not to do – DO NOT sync contacts on Telegram unless the contact folder is being used for an investigation. Upon syncing contacts, anyone in the contact list who is on Telegram will be notified that you too are on Telegram, and vice versa. This is extremely convenient for general use of Telegram, but not so useful when attempting to remain anonymous. Conversely, it is very useful when investigating a particular person and finding their networks on Telegram is the goal: load the contact list with the phone numbers of persons of interest and sync that list on Telegram. An efficient way to pivot to the dark web is through Telegram.
Once on Telegram, the search bar provides the easiest and most direct way to become active on the platform. There are three kinds of groups – channels, basic groups, and supergroups. Channels are run by administrators and reply bots, which allow only a few people to post. Several governmental health administrations, such as Ukraine's, have taken advantage of channel features, establishing informational channels at the beginning of the pandemic. Basic groups have a maximum of 200 members and are not public by default. Supergroups have a maximum of 100,000 users, and administrators can choose to make them either public (findable via search) or private (invite only).
Groups are where people talk and where bots become very helpful. I will not go into depth about how to make bots – there are already plenty of guides on this and I do not need another 3,000 words here – but bots can conduct polls, help verify people, take payment, mute people, kick trolls from a public group, update maps, provide social service information, create stickers, create a wiki, organise music, and so on. The limitations of bots are only the limitations of creativity. Bots are a key feature that makes Telegram so powerful and limitless.
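For a sense of how simple the underlying mechanics are, Telegram's official Bot API is plain HTTPS: every call is a request to https://api.telegram.org/bot&lt;token&gt;/&lt;method&gt;. The sketch below (Python, standard library only; the token and chat ID are placeholders you would get from @BotFather and your own group) shows a minimal sendMessage call:

```python
import json
import urllib.request

API_BASE = "https://api.telegram.org"

def method_url(token: str, method: str) -> str:
    # Each Bot API call is an HTTPS request to /bot<token>/<method>.
    return f"{API_BASE}/bot{token}/{method}"

def send_message(token: str, chat_id: int, text: str) -> dict:
    # POST JSON to the sendMessage method; the API answers with a JSON envelope.
    payload = json.dumps({"chat_id": chat_id, "text": text}).encode()
    req = urllib.request.Request(
        method_url(token, "sendMessage"),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

Polls, payments, and moderation all follow the same request pattern with different method names, which is why bot-building guides are so plentiful.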
Bots, just like every user and group, have an ID and title, with an optional username and photo. Photos on Telegram can be used to reverse engineer a public identity (i.e. take a unique profile picture – even of an innocuous item – and there are strong odds it will trace back to someone's Twitter, Instagram, or Facebook). Usernames are also occasionally indexed by Google, especially if the user is a power user who administers multiple channels, shares plenty of invite links, and creates bots. Invite links can be searched on Google with the query:
https://t.me/joinchat/<hash value> -site:telegram.org
This search method allows an OSINT investigator to trace where links are being shared and by whom, allowing a larger network to be traced. This is especially helpful in spotting connections between malicious troll groups and the more prevalent darknet groups that are using Telegram to expand their services.
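The dork above generalises easily. As a small illustration, the sketch below (Python; the query patterns are illustrative starting points, not an exhaustive list) builds Google queries that surface Telegram usernames and invite hashes shared outside telegram.org:

```python
def telegram_dorks(term: str) -> list:
    # Google queries that surface Telegram links shared outside telegram.org.
    return [
        f'site:t.me "{term}"',                                 # indexed t.me pages
        f'"https://t.me/joinchat/{term}" -site:telegram.org',  # treat term as an invite hash
        f'"t.me/{term}" -site:telegram.org',                   # treat term as a username/channel
    ]
```

Feeding each query into a search engine and noting which forums, pastebins, or social accounts repost a given link is what lets the network around it be traced.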
Another, often simpler, way to trace a network is to join a larger channel and work into smaller networks. There are plenty of ways to access a larger channel for activism or, again, crime by using search. Over time people will forward messages from an active group into public channels, at which point you can click on the channel or group the message was forwarded from and join it. Repeat the same process and slowly work from a larger channel or group into smaller groups that are not as accessible. Criminals make mistakes, and eventually private Telegram groups will show their face in public groups, giving OSINT investigators a chance to get to the source of an entire Telegram network.
This may require some thorough conversations and convincing, however, which will require preparation and work to spot the why and the what of the target on Telegram. In this instance there should also be an important stress on a clean, burner environment: professional trolls and darknet criminals will not appreciate that an OSINT investigator has pivoted into their network. Again, this is not a task for those inexperienced with Telegram or OSINT.
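The forward-pivot described above can be sketched in code. Assuming messages have already been pulled (for example with a scraping library such as Telethon), the minimal dataclasses below are hypothetical stand-ins for scraped records; the function simply collects the channels that messages were forwarded from, each of which is a candidate pivot:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Forward:
    # Username of the channel a message was forwarded from, if public.
    channel_username: Optional[str]

@dataclass
class Message:
    # None when the message is original rather than forwarded.
    forward: Optional[Forward]

def forward_sources(messages: List[Message]) -> List[str]:
    # Deduplicated, sorted list of channels seen as forward origins.
    seen = set()
    for msg in messages:
        if msg.forward and msg.forward.channel_username:
            seen.add(msg.forward.channel_username)
    return sorted(seen)
```

Running this over a large public channel's history yields the smaller groups feeding it, which is exactly the larger-to-smaller walk described above.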
Once a target channel or group has been identified, there are two further investigation steps: source the data and preserve the data for analysis. Intelx.io and Tlgram.eu/channels are incredibly powerful tools for finding channels and groups as a starting point for an investigation. For specific messages on Telegram, search.buzz.im might be a more focused path, but it requires prior knowledge of the topic or group being investigated.
The following topics are all covered in the Leveraging Telegram webinar from the SANS Institute, which is being linked to twice because it is that good and important a presentation.
One of the simplest tools for preserving Telegram history is PigPagNet, which can scrape an entire conversation history. The concept of pivoting is particularly important here, as identifying a power user can be a massive break in an investigation, allowing an entire portion of a network to be revealed.
GitHub user FableDowl has created two other tools that should be used – another Telegram information scraper and a user scraper. Recall that each user has a unique identifier – these tools reveal those IDs and give access to usernames, which, again, are important for pivoting and building a concept map of a larger network. Groups and channels also have IDs that are important to record and can only be accessed with tools like these.
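Because usernames can be changed at will while numeric IDs cannot, scraped users should be keyed by ID. A minimal sketch (Python; the record layout is an assumption about what a user scraper might emit, not any specific tool's format):

```python
def build_user_index(records: list) -> dict:
    # Key scraped users by their immutable numeric ID, and collect every
    # username ever observed for that ID, since usernames can change.
    index = {}
    for rec in records:
        entry = index.setdefault(rec["id"], {"usernames": set()})
        if rec.get("username"):
            entry["usernames"].add(rec["username"])
    return index
```

Merging scrapes from several groups through an index like this is what turns raw member lists into a concept map: the same ID appearing in two groups is a link between them, even if the account renamed itself in between.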
In principle, is Telegram encrypted and a safe environment to network in? Yes. Bots, channels, and knowledgeable administrators add to the Telegram experience and make it a one-of-a-kind app. However, there are still risks to using Telegram. Knowing how to conduct an OSINT investigation on Telegram can greatly increase security knowledge for administrators and end users. While Telegram seems impenetrable to an OSINT investigation, there are multiple ways to navigate Telegram that will enhance an investigation. After all, if people feel more secure on Telegram, odds are the information being shared there will be crucial for an OSINT investigation.
When WhatsApp co-founder Brian Acton provided 50 million USD for the production of the messaging app Signal, encrypted communications went from niche to a feature users could easily access – as they should be able to. Open-source code, allowing vulnerabilities to be quickly found by eager white-hat hackers, and end-to-end encryption, which prevents the company or anyone else from reading messages on the app, make Signal the best peer-to-peer communications app. Although, it demands repeating: an app is only as safe as the environment it operates in. While Signal does not offer large-scale networks and bots like Telegram does, its ease of access and use make it a perfect messaging app for smaller groups and peer-to-peer chats with friends. Signal is a case where simple, minimalistic design shines.
Signal has three other security features that add onto an already strong end-to-end encryption – disappearing messages, media management, and encrypted calls.
Disappearing messages are exactly as they sound – set a timer for messages and they will disappear after the time expires. Be aware this feature only works fully with other Signal users; the end-to-end encryption allure of the app is negated if the message goes through Signal but the recipient does not use Signal.
Media management has two fantastic features, blur and destruction. Signal can take pictures that are not saved within the camera roll and therefore do not backup to the phone’s default cloud. Pictures can also be set to be destroyed after a certain amount of time, as with messages. The new blurring option will auto-detect faces and blur them in pictures with manual adjustments if only one face needs to be blurred or the AI detection fails.
Encrypted calls and video calls can be routed through Signal's servers – encrypted – which hides callers' IP addresses from each other.
Again, Signal is only as secure as the environment and the people it is connected to. Even though Edward Snowden uses Signal, do not simply connect with someone claiming to be Edward Snowden and offering you some bitcoin.
Apps that Bleed Secure Information
As the title suggests, these are apps which bleed user information for the fun of it. They are very hard to secure, and users should swiftly move away from them for communication – yes, even grandma and grandpa should use Telegram instead of Facebook Messenger. Less ad spam and fewer actual scams move through a closed system than through a system designed with features that thrill scammers.
WhatsApp promotes end-to-end encryption. WhatsApp also has a new vulnerability every other week. While that is slightly hyperbolic, finding backdoors and vulnerabilities in WhatsApp is a regular practice for hackers. In February 2020 a particularly egregious flaw was found when ‘private’ groups could be discovered via a Google search. Although WhatsApp groups are no longer indexed, some can still be found when browsing through archives. Furthermore, WhatsApp is managed through the same network as Instagram messaging and Facebook Messenger. The three apps have not merged as planned, but on the ‘backend’ they might as well be the same.
Assume that what is sent through WhatsApp can be read.
Furthermore, WhatsApp is more vulnerable to malware and spoofing than other messaging services. This occurs most often when a package is downloaded onto a desktop environment that appears to be the WhatsApp desktop application and functions as advertised, but steals credentials on sign-in. This vulnerability was recently used to target and monitor Kurdish populations in Turkey and Syria.
There is also a particularly troubling backup setting that appears to be on by default: any pictures or videos shared on WhatsApp will be archived locally, unencrypted. If you are sharing anything potentially confidential, it will sit in your local folders until you delete it and turn off automatic sync.
WhatsApp is only slightly more secure than Facebook Messenger. In addition to the above concerns, Facebook Messenger opens the door to more malware, spam, location monitoring, and intrusive ads.
Facebook Messenger ads were first introduced in 2017, becoming a very intrusive and lucrative part of Facebook. Reading guides on how marketers and businesses utilise the power of Messenger ads is even more revealing of how much the backend pries into personal information. There is always an unknown of how much data these apps are taking, but assuming they use location data with a not-so-friendly AI reading messages to pick out what ads are shown is not a stretch.
The other threat is malware and credential theft. While a May 2020 threat applies to the Facebook app, the same model of attack can be used to target the messaging app. An attacker steals a stored ‘cookie’ with login credentials and then inserts that ‘cookie’ into a proxy server, making Facebook assume the sign-in is occurring on the same device.
Malware for Messenger can be the same type that affects WhatsApp, with one case this year being labelled a serious threat. WolfRAT malware infects phones by mirroring legitimate services – including Google Play – and then steals credentials, messages, photos, and contacts through Messenger and WhatsApp. WolfRAT is also under continual development to combat any potential patches. When using a third-party app store to download messaging apps, the risk of packaged malware increases exponentially.
SMS texting has flaws in the fundamental way messages are transferred. I could rehash everything that has been written, but just don't use SMS. This article explains the security flaws that SMS texting exposes users to.
Assuming Snapchats simply disappear, or that stories don't leak, or that location data is not being tracked is incredibly foolish. Snapchat might be one of the leakiest applications, with security features that are only protective if employees decide not to abuse them – and allegedly, abuse them they did. Snapchat employees allegedly used a backend tool reserved for law enforcement investigations to access user pictures and information in May 2019. The very existence of that backend tool is a glaring vulnerability, just waiting for a hacker to get in and leak thousands of pictures. With recent news about Twitter being hacked via a simple phishing attack to gain administrative tools, Snapchat very easily could be next. A third-party app was hacked in 2014, releasing pictures from 200,000 Snapchat accounts; a breach occurring only months after a brute-force attack exposed the usernames, passwords, and phone numbers of 4.6 million users.
While there are some general rules about enabling two-factor authentication with a phone number, doing so also hands Snapchat the user's phone number. Never mind that OSINT tools abound that take advantage of Snapchat's default and guided settings, which lead users to expose their location and add their Snapchat stories to a world map.
Snapchat Map is an incredibly useful OSINT tool for obtaining direct footage of a location halfway around the world. While it is difficult to identify specific users via Snapchat Map, obtaining video of an event can advance an investigation. Snapchat story search is efficient for finding specific users and can be particularly helpful in missing person cases or in obtaining more information about a suspect. OSINT Curious has a helpful step-by-step guide on how to properly use these tools. Usernames can also be identified using the open-source tool Sherlock.
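Sherlock's core idea is simple: build each site's profile URL for a given username and check whether the page exists. A hedged sketch of that idea (Python, standard library; the three site templates are illustrative examples, not Sherlock's actual site list, which covers hundreds of services):

```python
import urllib.error
import urllib.request

# Illustrative profile URL patterns; Sherlock maintains a much larger list.
PROFILE_URL_TEMPLATES = {
    "twitter": "https://twitter.com/{}",
    "instagram": "https://www.instagram.com/{}/",
    "github": "https://github.com/{}",
}

def candidate_profiles(username: str) -> dict:
    # Build the profile URLs a Sherlock-style check would probe.
    return {site: tmpl.format(username) for site, tmpl in PROFILE_URL_TEMPLATES.items()}

def profile_exists(url: str) -> bool:
    # A 200 response suggests the profile page exists; an HTTP error
    # (typically 404) suggests the username is unclaimed on that site.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

In practice the real tool handles per-site quirks (some sites return 200 with an error page), which is why using Sherlock itself is preferable to rolling your own probe.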
Twitter's recent history of data leaks and hacking should make it evident that messaging on Twitter, much less the accounts themselves, is vulnerable. Having information exposed from a major social media app is a matter of when, not if. In November 2019 two Saudi Arabian Twitter employees were charged with spying for the Saudi government, making it evident that Twitter messages and account information are only as secure as Twitter's employees are honourable. Prosecutors, however, are mulling dropping these charges.
The next major breach came on 16 July, when the aforementioned phishing scheme was unleashed, providing hackers with administrative controls. Popular accounts were then used to post tweets claiming they would give Bitcoin to their followers – but only after receiving Bitcoin first. Although charges are being leveled against a group of teenagers, the access the teenagers gained could have compromised a dizzying amount of information.
A week later Twitter once again proved it had major human and technological security gaps when 36 political accounts were breached and had information stolen. The politicians involved have gone unnamed, aside from Dutch politician Geert Wilders, whose profile picture was replaced – the only lead revealing the accounts had been breached.
Twitter is not secure. While proper OPSEC should be followed when creating an account (not uploading personal pictures that reveal your exact location, house, friends, activities, vacations, and other material that can be used by the more nefarious-minded), it might be advisable to simply follow OSINT sock puppet account rules when creating an account to post about your cats or latest meal.
To begin OSINT research on Twitter, start by muting the algorithm – this guide from OSINT Curious explains how. As with other social media, using Twitter as a single source for an investigation will deliver only part of the picture. Tracking users across multiple sites will provide a better picture, and while there are multiple ways to accomplish this, Sherlock is a personal favourite. While Twitter may not reveal the information needed right off the bat, explore a user's pictures for additional information and check the category under replies. The first accounts followed may also hint whether the account is a puppet for someone whose real identity is on Twitter – even if their real name is not on Twitter, their family and close network may lead to an even bigger find. Simply put, never stop pivoting and always keep working down the OSINT flowchart.
So What? The Khashoggi Case
Using insecure communication opens a user up to a plethora of malware, ransomware, and spyware. The bottom line is this: once a device is compromised, all apps on that device are compromised. Encrypted communication becomes worthless if spyware is watching the user type.
The brutality of spyware was revealed in the recent assassination of Saudi Arabian dissident and Washington Post journalist Jamal Khashoggi. Autocratic and authoritarian governments commonly use spyware to combat any act they deem treacherous or terroristic – even if that act is merely being a bit too vocal. The Pegasus spyware used to monitor Khashoggi was developed by NSO Group and allegedly used in the kingdom. While NSO has never confirmed it sold its spyware to the kingdom, it did state that it freely sells its wares to “combat terrorism”. An Israeli judge also rejected an attempt to have a lawsuit against NSO Group dismissed.
So, what happened to Khashoggi? His phone was infected by Pegasus; likely an invite to an event or a seemingly innocuous message crossed his screen and he clicked it, curious and unaware of the powerful tool that would jailbreak his iPhone moments later. Once the iPhone was jailbroken, the operators of Pegasus could read his text messages and gather information from all other apps, even those that were encrypted. Not only was Khashoggi breached, but all of his contacts were vulnerable too. An entire human rights group was compromised because of one link.
Spyware is that powerful. Security is that important. That is why all of the words above are important.
One last thing – if you are still reading, please take a moment to ensure your apps and computer are up to date. Out-of-date systems remain exposed to vulnerabilities that have already been patched upstream. It takes two minutes and could prevent a life-changing nightmare.