I use YubiKeys to store my GPG and SSH keys.
Before starting, do a little reading to familiarize yourself with the setup procedure. I have added a list of links at the end; these are mainly the resources I used.
Generate a new gpg key
I generated my keys on a Qubes VM without an internet connection.
$ gpg --gen-key
I selected option 0 and moved on to create the ID associated with the key.
In this step I mostly followed the guide on the Yubico developers website. The guide walks through generating Sign (S), Authentication (A), and Encryption (E) keys.
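If you want to experiment before touching your real keyring, gpg can also generate a key non-interactively from a parameter file. A sketch in a throwaway GNUPGHOME (the name, email, and key size are illustrative; the interactive --gen-key flow asks for the same values one at a time):

```shell
# Use a throwaway GNUPGHOME so the real keyring is untouched.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Batch-generate an RSA signing key (all parameters are examples).
gpg --batch --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Key-Usage: sign
Name-Real: Example User
Name-Email: user@example.org
Expire-Date: 1y
%commit
EOF

# Confirm the key exists.
gpg --list-secret-keys --keyid-format long
```

For a key that will live on a YubiKey you would of course run this on an offline machine and protect it with a passphrase instead of `%no-protection`.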
Add an authentication key
$ gpg --expert --edit-key 123ABC45
At this step we add another RSA subkey and give it the Authentication capability. In the gpg --expert selection menu this corresponds to option 8 (RSA, set your own capabilities).
Here is where you should back up your keys and revocation certificates. Please do: I have personally lost YubiKeys, and having backups really helps.
Also set up a PIN and an admin PIN for your YubiKey, with:
$ gpg --card-edit
gpg/card> admin
Import the key to the yubikey
Finally, we edit our key and move it to the card.
$ gpg --expert --edit-key 123ABC45
gpg> keytocard
Now your key is exported to your card and ready to be used.
Setup key to be used with ssh
$ gpg2 -K --with-keygrip
This will list all your available keys along with their keygrips.
Add the keygrip of your authentication key to ~/.gnupg/sshcontrol:
echo 1234567AB8CDFFF90G9H1I23JJ4K5L67M89N012O > ~/.gnupg/sshcontrol
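The two steps above can also be scripted. A self-contained sketch in a throwaway GNUPGHOME (the user ID is illustrative) that extracts the keygrip with awk and writes it to sshcontrol:

```shell
# Throwaway GNUPGHOME so nothing touches your real keyring.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate an authentication-capable key (illustrative user ID).
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Demo <demo@example.org>' ed25519 auth never

# Extract the keygrip and register it with gpg-agent's ssh support.
keygrip="$(gpg -K --with-keygrip | awk '/Keygrip/ {print $3; exit}')"
echo "$keygrip" > "$GNUPGHOME/sshcontrol"
cat "$GNUPGHOME/sshcontrol"
```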
I have also added the following to my ~/.gnupg/gpg-agent.conf:
default-cache-ttl 600
max-cache-ttl 7200
enable-ssh-support
write-env-file ~/.gpg-agent-info
And edited my ~/.bashrc to start the agent and point SSH at it:
gpg-connect-agent /bye
export SSH_AUTH_SOCK=$(gpgconf --list-dirs agent-ssh-socket)
You can now:
$ source ~/.bashrc
$ ssh-add -l
This should list your new key.
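To double-check which public key SSH peers will see, recent gpg versions (2.1.11+) can print an authentication key in OpenSSH format with --export-ssh-key. A self-contained sketch using a throwaway keyring and an illustrative user ID:

```shell
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Create a certify-only primary key, then add an authentication subkey,
# mirroring the S/A/E layout described above (user ID is illustrative).
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Demo <demo@example.org>' ed25519 cert never
fpr="$(gpg --list-keys --with-colons demo@example.org \
      | awk -F: '/^fpr/ {print $10; exit}')"
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-add-key "$fpr" ed25519 auth never

# Print the authentication key in OpenSSH authorized_keys format.
gpg --export-ssh-key demo@example.org
```

The printed `ssh-ed25519 AAAA…` line is what you would add to a server's authorized_keys.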
A lot of information is given away by certain properties of communications beyond just their content. These properties are usually referred to as communication metadata. Metadata includes information such as the length of the conversation, who was involved, where the parties are located, and so on.
Questions that can be answered by looking at metadata are:
- how long you talk, how often ...
- the people you talk to, or a group of people that communicate with each other frequently ...
- network addresses, traffic patterns
- location services, geo-coding ...
As you can see, metadata gives a lot away about you even if the actual content of your communication is encrypted. One could even say that metadata gives away more information than reading the actual content would, and in a form that is easily processed by machines.
Tor is an important tool providing privacy and anonymity online. The property of anonymity itself is more than just providing an encrypted connection between the source and the destination of a given conversation. Encryption only prevents the content of the communication between two parties from becoming known, but there is a lot of information that can still be learned by just observing the traffic.
When you use the Tor network, you do not give away your communication metadata: not only are your communications encrypted, but you can also stay anonymous. Anonymity is a broad concept, and it can mean different things to different groups.
The main advertised property of the Tor network is that it provides strong anonymity given a variety of people using the network. For the Tor network to function properly and to satisfy users' needs, we need a certain degree of diversity. We need diversity in the relays comprising the network and in the user population sending traffic through it. We want Tor to be able to reach and serve a diverse population of users and use cases. We believe everyone should be able to browse the web and enjoy privacy, independently of where they live and who they are.
But hiding network traffic is not all it takes to prevent websites and services from tracking you online. A lot of information is given away when you use your Google or Facebook account to log in to a third-party service.
The risk in this case doesn't only involve you transferring your profile information to a third-party service, but also Facebook and Google learning which third-party services you use and how.
Tor provides a technology called onion services that allows users to offer various kinds of services, such as web publishing or an instant messaging server, and make them available on the Tor network. Using Tor "rendezvous points," other Tor users can connect to these .onion services (formerly known as hidden services) without either party knowing the other's network identity.
An .onion service needs to advertise its existence in the Tor network before clients are able to contact it. The service therefore randomly picks some relays, builds circuits to them, and asks them to act as introduction points by telling them its public key. Because the service uses a full Tor circuit, it is hard for anyone to associate an introduction point with the onion server's IP address. While the introduction points and others are told the onion service's identity (its public key), they never learn the onion server's location (its IP address).
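From the operator's side, opting in is a small configuration change. A minimal, illustrative torrc fragment that exposes a local web server as an onion service (the directory path and ports are examples):

```
# Map port 80 of the onion address to a local web server.
HiddenServiceDir /var/lib/tor/my_onion_service/
HiddenServicePort 80 127.0.0.1:8080
```

After restarting Tor, the generated address can be read from the `hostname` file inside HiddenServiceDir.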
In a not-so-imaginary, more privacy-friendly web, applications could be made available through onion services, and we could use identity providers that wouldn't necessarily need to access and share all our data, so we wouldn't lose control over our online footprint.
Recently I learned about Solid. Solid can be described as a set of protocols that leverage the idea of the semantic web to let online applications access user data. Built on top of hypermedia protocols, Solid is fully compatible with current web technologies. With Solid, all your data is stored in an online Solid pod. The data is owned by the user, who is always free to move it as they wish.
Users can give other users or services permission to read from or write to their pod. This approach is great for service developers while also protecting user privacy. Ultimately the user retains control over their pod.
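Permissions on a pod resource are typically expressed with Web Access Control (WAC) documents written in Turtle. A sketch granting one agent read access (all URIs are illustrative):

```
@prefix acl: <http://www.w3.org/ns/auth/acl#>.

<#read-notes>
    a acl:Authorization;
    acl:agent <https://pod.example/profile/card#me>;    # who gets access
    acl:accessTo <https://pod.example/data/notes.ttl>;  # which resource
    acl:mode acl:Read.                                  # what they may do
```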
I have created a test application running as an onion service and connecting to a solid.community pod. This example is particularly interesting because, by using the .onion protocol, the Solid pod learns nothing about the app retrieving data; it only sees an .onion address.
The code is available via the myonion wrapper, an experiment in making different apps available through Docker containers via .onion services.
In this post I have shown how a web application made available as an .onion service on the Tor network can exchange information with a Solid pod, allowing a user to log in and share some data. Because .onion services live on the Tor network, you do not need hosting or a public IP address to offer a service via an .onion address. This means .onion services are a gateway to a decentralised, peer-to-peer internet, where you regain control over the content you create and who you share it with. The .onion service can even be hosted on your own computer for as long as you want, allowing both the people using your service and you to remain anonymous.
I believe anonymity to be very important since it can free people, allowing them to decide how to expose themselves or to make themselves visible on their own terms.
I have been studying Twitter's political propaganda machine for a while now, trying to identify patterns and spot interesting behaviors. Here is what I found.
First of all, the raw data can be accessed here.
The spreadsheet above contains a number of keywords, most of them about controversial topics. By controversial I mean topics that are sure to spark debate or polarize opinions.
The list of topics (or keywords) isn't in any way complete and is mainly centered around propaganda for the Spanish elections happening this coming weekend.
My intention was to show that the political right wing "clamor" was just marketing-generated content crafted by agencies. To that end, I collected tweets from five specific locations around the world: Caracas, Venezuela; New York, US; Miami, US; Guadalajara, Mexico; and Haifa, Israel.
There isn't a strong rationale behind the choice of these cities, and I plan to add more as I continue this work.
My objective is eventually to expose the money trail behind political marketing.
If you are a journalist interested in data, please get in touch, I'd like to work more on this.