Category: English

Lifting a blind roller in a DIY approach — just a planning post

The Problem: I have two massive blind rollers that are operated by hand. I would like to explore the possibility of lifting them using a combination of electric motors and gears.

The main idea is to figure out how to achieve that with very small motors, and to end up with a system that is not so slow as to be completely useless.

Following the information provided in this post, the problem can be decomposed into two parts:

  1. The torque of the motor. With the right gearbox even a small motor could lift a heavy weight (slowly), so that is something that needs to be explored.
  2. The rate at which the motor can lift the weight.

While the first one is difficult to calculate, the second can be estimated using the following formula:

\[ W = \frac{m g h}{t} \]

Where W is the power required to lift a mass m by a height h in a time t.
It’s also possible to rewrite it in terms of the velocity v at which the mass gets lifted (because v = h/t):

\[ W = mgv \]

So if we use a 10 kg weight, lifting it at a speed of 1 m/s would require a power of:

\[ W = 10 \times 9.81 \times 1 = 98.1W \]
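This back-of-the-envelope arithmetic can be checked with a few lines of Python (just a throwaway sketch of the formula above, not part of any project code):

```python
# Power (watts) needed to lift a mass at constant speed: W = m * g * v
G = 9.81  # gravitational acceleration, m/s^2

def lifting_power(mass_kg, speed_m_s):
    """Ideal mechanical power to lift mass_kg at speed_m_s (no losses)."""
    return mass_kg * G * speed_m_s

print(lifting_power(10, 1.0))   # about 98.1 W, as above
print(lifting_power(10, 0.05))  # about 4.9 W at a gentler 5 cm/s
```

Slowing the lift right down is what makes a small geared motor plausible: at 5 cm/s the same 10 kg needs only about 5 W of mechanical power.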

This is just the beginning of the rabbit hole. I may post updates in the future if I continue with this. We will see.

Fun with Pyscript

Update: you can have only one py-script block for now. I removed the example one and just left the matplotlib example.

PyScript has just been released and I felt I had to give it a go straight away from my blog.

The usage seems to be pretty straightforward:

  • you drop the link to pyscript.js in the HTML, and then you have the new <py-script> tag where you can write Python. Standard Python.
    For example, the code below prints, among other things, the string “Now you can use Python within the browser natively!”
<link rel="stylesheet" href="" />
<script defer src=""></script>

<div id="target"></div>
<py-script output="target">
from datetime import datetime
now = datetime.now()
print("What time is it?")
print("Computed directly from Python: " + now.strftime("%m/%d/%Y, %H:%M:%S"))
print("Now you can use Python within the browser natively!")
</py-script>


which produces the result rendered live in the page.

Interestingly, you have the full Python arsenal at your disposal.

This code creates a matplotlib plot:

<link rel="stylesheet" href="">
<script defer src=""></script>
<py-env>
packages = ["matplotlib", "numpy"]
</py-env>
<div id="mpl"></div>
<py-script output="mpl">
import matplotlib.pyplot as plt
import matplotlib.tri as tri
import numpy as np

# First create the x and y coordinates of the points.
n_angles = 36
n_radii = 8
min_radius = 0.25
radii = np.linspace(min_radius, 0.95, n_radii)

angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
angles = np.repeat(angles[..., np.newaxis], n_radii, axis=1)
angles[:, 1::2] += np.pi / n_angles

x = (radii * np.cos(angles)).flatten()
y = (radii * np.sin(angles)).flatten()
z = (np.cos(radii) * np.cos(3 * angles)).flatten()

# Create the Triangulation; no triangles so Delaunay triangulation created.
triang = tri.Triangulation(x, y)

# Mask off unwanted triangles.
triang.set_mask(np.hypot(x[triang.triangles].mean(axis=1),
                         y[triang.triangles].mean(axis=1))
                < min_radius)

fig1, ax1 = plt.subplots()
ax1.set_aspect('equal')
tpc = ax1.tripcolor(triang, z, shading='flat')
fig1.colorbar(tpc)
ax1.set_title('tripcolor of Delaunay triangulation, flat shading')
fig1
</py-script>


Note: if you are trying to use <py-script></py-script> on WordPress, using a Custom HTML block, you need to wrap the code with <pre></pre>, otherwise WordPress will texturize the text, changing the characters to ones more pleasing to the eye, which PyScript will then fail to understand.
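A minimal wrapped block might look like this (the printed text is just a placeholder):

```html
<pre>
<py-script>
print("Hello from WordPress")
</py-script>
</pre>
```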

How to connect externally from WSL (Ubuntu) running on Windows 10

I had to move to a Windows 10 computer for work, and even if I’m not working a lot on code and so forth, I sometimes need to create an IPython notebook or launch some code I’ve got from GitHub.

Having the ability to install the Ubuntu distribution via WSL2, directly from the Microsoft Store, is a great help. The ability to use Visual Studio, which connects to this “magic” area of the filesystem, also makes editing nice and easy.

Usually I like to go with the console, and I’m using the Windows Terminal, which is able to launch several environments in a native way.

However, there is a little snag: from WSL you cannot reach the outside world, because by default the Windows firewall says no.

It’s easy to fix, though: open PowerShell as Administrator and launch:

`New-NetFirewallRule -DisplayName "WSL" -Direction Inbound -InterfaceAlias "vEthernet (WSL)" -Action Allow`

as shown below

PowerShell running as Admin

After that, you are golden. Unfortunately, for me, I have to re-run that every time I reboot or come back from a suspend. It would be very interesting to know if someone figures out how to do it in a permanent way.
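I have not verified this, but one candidate for making it permanent would be to register the same command as a scheduled task that re-runs at every logon (the task name is arbitrary; run once from an elevated PowerShell):

```powershell
# Assumption: re-creating the rule at each logon works around the reset.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
  -Argument '-NoProfile -Command "New-NetFirewallRule -DisplayName ''WSL'' -Direction Inbound -InterfaceAlias ''vEthernet (WSL)'' -Action Allow"'
$trigger = New-ScheduledTaskTrigger -AtLogOn
Register-ScheduledTask -TaskName "WSL firewall rule" -Action $action -Trigger $trigger -RunLevel Highest
```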

Get the OEM key from your Windows 10

So I’ve got this new computer with Windows 10 because Baldur’s Gate 3 is coming out, and I did not have a tower for quite a long time. Windows was installed by the people I got it from, and there is a second partition where I will install Ubuntu. Unfortunately, on the first go it did not work due to the NVIDIA card inside (just a GTX 1060, not the new 30 series that everyone is going crazy about), but the net already has a solution for it, which I will give a try as soon as I’ve got time.

At the same time I’ve used it to play Heroes of the Storm and I’ve also tried my first stream on at . I stream HOTS in Italian, and it is just a quick foray into that world because I’m a curious person.

I’ve used OBS to stream, and it seems everything was working quite well.

Back to the Ubuntu install: given the first try did not really work, I wanted to take a more cautious approach and save the Windows key, for safe-keeping, before I had to reinstall everything again.

To my surprise, new rigs do not come with a sticker with the associated Windows key; it’s written directly in the UEFI BIOS. When I asked the company’s support where to get it, they told me it was tied to that motherboard, and if I ever changed the motherboard I would have to get a new one.

This makes no sense to me at all: given that I bought a personal license for Windows, it was not clear why I should not know the key of the license I bought.

A quick search (via Ecosia) and I found a video on YouTube which provides the solution in no time:

Open a PowerShell and type:

`(Get-WmiObject -query 'select * from SoftwareLicensingService').OA3xOriginalProductKey`

Once I got it, I saved it in my Clipperz.

HTH out there.

Have fun.


Opensourcing gardenio & friends

Winter sunsets are the best (a post shared by Michele Mattioni, @mattions, on Instagram)

Some time ago I described the initial work done on the gardenio system for smart plant monitoring in the house.

Today I’ve open-sourced the code. You can find gardenio on GitLab, and also the code for the Django website connected to it. At the moment the Arduino bits have been disbanded, but it’s handy to have them out there as a possible blueprint for projects of this type.

The whole system ran using ROS1, and it’s handy to have it open as an example for the next work, which will go towards the Dimitra project.

BTW, if you are interested in collaborating on agritech solutions, which involve automatic robotic systems applied to urban and small-scale agriculture, just give me a shout.

If you were curious, that’s what the Dimitra project is all about.

Some little changes

Dawn of a new day (a post shared by Michele Mattioni, @mattions, on Instagram)

Since my old post regarding the smart plants with Arduino and the Raspberry Pi experiment some things have changed, and I think it would be good to write about them.

First of all, we have moved country! We left London and the UK, being part of the secret Brexodus tribe which does not get talked about too much by the media, and we moved to Ancona, in Italy. If you are wondering, the Brexodus tribe is all the people who decided to leave Brexit land for safer/better places; given the fair number of people, usually with diverse jobs and across the board, this has been internally renamed the Brexodus, the Exodus from Brexit. Of course we do not know each other (except in a very small number of groups), but this is a clear phenomenon which will become more apparent going forward.

Several reasons contributed to this decision. We literally moved just before the “original” Brexit deadline (the 29th of March) was supposed to arrive; we actually took off for good on the 21st of March, landing in Italy the same day (of course :)).

We could have gone for settled status in the UK, maybe, but you never know what the UK Home Office and the UK Government will do. To be honest, we never really investigated too much what we had to do. We didn’t feel like London was the best place to live anymore. Moreover, our rights were stripped away in one night via a referendum in which we couldn’t even vote, and which basically changed everything. Therefore, why waste time on that?

Honestly, given the latest news, I have no idea what the UK will do. However we know what Earth will do 😀


When I started this post, I wanted to write about three things not really related:

  1. the move, which I wrote above
  2. some changes on the ads shown here, which made me think about the move
  3. some robotics news I’ve read today that I found interesting (related to the last blog post I wrote, that’s why they pop up)

So here we go with the last two.

Regarding the ads: I used to have an AdSense ad in the sidebar. I set it up a long time ago, and if you are interested in why, I wrote about it in this very old post. Changing country, I had to close it down, and I do not feel it would make sense to bring it back; therefore, from today, the sidebar does not show ads anymore. Given that I run the blog myself on a dedicated server, no more ads will be shown on these pages for the time being.

On the robotics side, as I have written before in the plants post, I’m looking into building automatic wheeled robots for agriculture, and I’m interested in ROS2 development for that part. Today I discovered that Acutronics has closed down. I was really interested in the work they were doing, and the arm they developed was ROS2 compatible. It seems they did not manage to get enough funding to go ahead. A pity; they were doing a really great job.

That’s all for now.
Have fun!

Smart Plants via Arduino and Raspberry Pi

Smart plant #arduino #iot (a post shared by Michele Mattioni, @mattions, on Instagram)

I always looked at electronics in wonder, and I always liked circuits. I remember during high school I created a project with four light bulbs powered by a battery: two in series and two in parallel, to demonstrate how the electricity splits between the two. I found it fascinating, but I never managed to get back into it.

The itch to scratch

I have four plants on the window of my kitchen at the moment: a (now dead :'( ) rosemary, a parsley, an orchid and a basil. I never knew when to water them. I never knew if I gave too much water, or too little. On top of that, different plants have different needs, so you have to keep track of them and can’t just water them all equally.

In the past I discovered that some of the plants were drowning in water collected at the bottom of the pot, making the roots mouldy and in the end causing the plant to die.

On top of understanding the water needs, I wanted to create an automatic water plant system, which would keep the plant well hydrated automatically, while I could follow what was happening from a web interface.

At the same time, I wanted to explore the robotics world, and to have a project where I could use the Robot Operating System (ROS).

Components to create Gardu

After a bit of thinking, I created Gardu, which is an automatic way to keep track of the soil irrigation, with the possibility to follow it from afar.

These are the different pieces of the puzzle:

  • ROS-based Arduino/Raspberry Pi (Gardu):
    • 4 soil sensors
    • a 12 volt pump
    • a servo
  • Django powered website (Gardenio) to track the readings and associate the plant

This is the flow:

  1. Gardu acquires an analogue reading of soil irrigation from a sensor
  2. Gardu makes a POST request to the Gardenio website with a unique identifier (which is unique to each Gardu, and is embedded in the firmware code)
  3. Gardenio stores the value for that sensor, finds the associated plant, and responds to the Gardu with the minimum soil irrigation threshold for that plant.
  4. Gardu processes the value: if the reading is above the threshold, Gardu sends the command to water that plant
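The loop above can be sketched roughly like this (all names here are hypothetical: the real firmware runs as ROS nodes, and the endpoint and payload shapes are illustrative only):

```python
import json
from urllib import request as urlrequest

GARDENIO_URL = "https://example.org/api/readings/"  # hypothetical endpoint
GARDU_ID = "gardu-0001"  # unique id embedded in the firmware

def should_water(reading, threshold):
    # step 4: water when the reading is above the minimum
    # irrigation threshold returned by Gardenio
    return reading > threshold

def report_and_decide(sensor_id, reading):
    # steps 1-2: post the reading together with the Gardu identifier
    payload = json.dumps({"gardu": GARDU_ID, "sensor": sensor_id,
                          "value": reading}).encode()
    req = urlrequest.Request(GARDENIO_URL, data=payload,
                             headers={"Content-Type": "application/json"})
    # step 3: Gardenio answers with the per-plant threshold
    with urlrequest.urlopen(req) as resp:
        threshold = json.load(resp)["threshold"]
    return should_water(reading, threshold)
```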

So far I’ve managed to finish the soil sensor acquisition, while I have an early prototype for the automatic watering.

This summer, while I was away, I could track my plants getting thirsty from far away:

That’s when the rosemary died, eheh.

I’m still sorting out the water pump, due to a problem with the wiring of the circuit. This is the current circuit (the servo is not shown).

Circuit for the water pump system (missing servo)

The soil sensors part works like a charm and gave me no real hard time; the servo/pump combination instead is a little more complicated to handle.

They need external power, which is provided by the battery, but because they are connected on the same circuit, they do not work reliably at the moment. It seems that a possible solution would be to use a UBEC.

Once that is done, the servo gets hooked up to a custom designed 3D printed valve that looks like this:

The inner red part rotates, so the top hole in the green part lines up with the one in the red part, which in turn is in axis with the blue tube connected to each plant. The idea is that only a minimal amount of water will go towards the other plants, and most of it will go to the target plant. You can see it on OnShape here.

After some quick testing, I have a strong feeling that this is not gonna work, and the design may have to change.

So, given that the automatic watering system is still a work in progress, this time, too, I will be able to track the thirst of the plants from far away.

For now we are looking ok:

Current status of the plants. Live data at

The race is on: will I be back before the plants go completely dry? How many will I lose this time? The future will tell.

Running WordPress on HTTPS with dokku and Let’s Encrypt

A nice (unrelated) pic to start 🙂

After-dusk pics are pretty (a post shared by Michele Mattioni, @mattions, on Instagram)

So the big question… Is your site running HTTPS? If not, you should; if yes, well done!


This personal blog was moved from into personal hosting powered by dokku a long time ago. While this has proved to be pretty nice, I honestly think that WordPress has reached a maturity level where you can just get away with running the software on a PHP-powered website (like Apache with the PHP module or whatever), switch on the automatic updates, and be happy with that.

Basically, install once, and then forget about it.

So, while doing the upgrade manually via git is not a big issue (here is the little README I wrote to remind myself of the procedure), you still have to do the upgrade once in a while.

All this always felt like a bit of wasted effort, until I decided that I should move the site to HTTPS.

Getting these pesky HTTPS certificates

In a few words: HTTPS encrypts the traffic between your web browser and the server that handles the request; to perform the encryption, a certificate provided by the server is used. The legitimacy of the certificate is established by a root certificate authority. While you could issue a certificate yourself, which would be perfectly valid from a technical standpoint, your site would still be marked as not secure. The catch is that Firefox/Chrome and the other web browsers come with a list of “root authorities” that they recognise as legit. A self-issued certificate is not connected to any of these “root” authorities, and is therefore not recognised by the web browsers.

For quite a while, the only way to get an HTTPS certificate was to buy one from a seller, who would issue one connected with their root authority. This was usually either for a certain domain, or a wildcard for all the sub-domains, and it cost around 20-30£ per year, depending on the seller. Note that the validity was usually one year, therefore you had to manually get a new certificate and reinstall it, which usually meant running some commands and then adding it to nginx or Apache to be able to serve over HTTPS.

So this process was pretty labour intensive and costly; most importantly, when the certificate expired, the website was marked “untrusted” and the red bar with the broken lock showed up in the address bar.

Let’s Encrypt to the rescue

Let’s Encrypt is a certificate authority that provides HTTPS certificates. Their goal is to make the internet safer and more secure, therefore they provide the certificates for free. Moreover, they also provide a way to programmatically obtain and renew a certificate, in a very easy and straightforward way.

What’s super nice is that dokku has a very nice plugin that makes the whole process automatic for the user.

In my case, given that I already have WordPress deployed via dokku, I just had to run the following command:

$ dokku config:set --no-restart myapp DOKKU_LETSENCRYPT_EMAIL=your@email.tld

This sets the email that will be used when creating the certificate.

Then you have to encrypt your app. It’s just one more command:

$ dokku letsencrypt myapp

This one sets up the nginx configuration to redirect requests to the HTTPS site for myapp.

It basically worked like a charm on the first go.

There is also the nifty command

$ dokku letsencrypt:cron-job --add

which will create a cron job for the dokku user to re-fetch and renew the HTTPS certificate automatically.

Pretty neat, and now it is totally worth it to deploy WordPress via dokku.

Just a short trim…

Usually in this blog, especially lately, I stick to technology and related topics, however this episode was so funny for me that I’ve decided to write about it even if it is a page from my personal life.

Have fun!


I left the house this morning with the idea of getting a small haircut, more a trim than anything else.

Before I go into the details, let me tell you how complicated I am about cutting my hair. I have curly hair. I used to have short hair until I was basically 15, and then I started having very long hair. Because my hair is curly, the more it grows, the more it curls, masking the real length. Anyway, at some point it gets unruly, so you have to cut it back.

I used to go to the same barber for years. He is based in my parents’ village and I tend to go there for a haircut when I’m back in Italy. Half of my friends think it is totally insane to have a barber more than 4000 km away from where you currently live; the other half completely agree, and would never change their barber/hairdresser. Nobody else could even grasp how to do your hair.

Let’s dive in, then..

I was walking on Stroud Green, which has a hairdresser/barber every 3 shops, and I was toying with the idea of having a small trim, so I could have a decent situation until December, when, given I was going back to my parents’ place for Christmas, I could have the proper cut(tm).

I decided to enter a barber shop. My beloved reader, this is going to get interesting real fast :D.

I enter the barber shop and ask if there is space. One guy is already working on another customer but he tells me there is: the chair is free and another guy is basically getting ready to start working. (It’s very early in the morning.)

My barber is called Ozi (I think..). He asks me how I want my hair cut. I tell him I want just a trim, to have a more precise and ordered head.

He shows how much he understood he has to cut, using his hand to indicate the length. I tell him I would like it a bit longer. He gets it the other way around, and thinks I want the part he has to cut off to be longer!

After some other brisk exchanges, it seems an agreement is reached.

He starts using a water spray to wet my hair. I am a bit confused by that… I thought he was going to wash it, so I ask him:

Me: “No wash?”

Him: “No wash!”

Me: “No wash??”

Him: “No wash.”

Me: “Maybe wash?”

Him: “Wash?”

Me: “Yes, No?”

He puts the bottle down, opens the tap in the sink in front of me, and then basically drops me in the sink. I don’t even realize what is happening: basically I’m making bubbles in the sink trying not to drown, while this guy washes my hair in a very energetic way.

Close to my breath running out, the water stops, and I come back to breathe again. Ozi takes a towel and starts to dry my hair, using a decent amount of force, because of which I honestly worry about my neck muscles.

He starts to cut my hair, and goes on for a good bit. He asks me if I want a trim on the sides and I agree. I start to suspect that he is cutting quite a bit, so I try to work out how much he has cut (when my hair is wet, it’s quite difficult to assess the length).

I look at my hair.. and I say:

“This is very short! You cut a lot!”

He replies: “sorry boss!” and he continues to cut my hair!

Oh well, the situation went out of control, so I decide that I can’t really do anything about it, and accept my look change.

The beard business

He asks me if I’m interested in him cutting my beard, and I accept the offer. He asks me if I’m interested in letting it grow, and I say yes. So he asks me which setting he should use and I propose a 7…

He looks at me extremely puzzled. Moreover, one of his colleagues comes over as well; he tests the length of my beard with his hand and says, “I’m sorry boss, but this is 2, max 3. I think our machines are bigger”.

I look at them, take a quick peek at their machine and agree that could be the case, so we settle on 3.

The machine just gently trims a very tiny amount of the beard. When he is happy with the result, he picks a serious hard-core brush and brushes my beard. I have to confess that I have never done that before. I’m amused.

After that, he asks me where I want the neck line. I have never done one before, so I just tell him to go ahead, and he makes the line using a cutthroat razor. He cuts me a little, picks up some antiseptic and applies it to the wound. It’s a real close shave.

At this point I’m quite lost; I do not know what to expect next. Ozi takes some kind of mask and applies it to my face. I’m sporting a mask. First time. Ever…

However, the surprises are not finished. Ozi takes a small piece of metal with some cotton at the end, which looks like a little torch. He lights it with a lighter and we have an honest flame in the shop, just in front of me.

He blows it out and then uses the hot metal to burn the hairs on the outside of my ears! Again, the first time something like this has happened to me.

He takes some small scissors and also cuts the hairs in my nose; I’m still wearing the mask, and it is clear that I’m completely lost to the unfolding of events, watching the happenings with a sort of spectator detachment at this point, and a decent amount of bewilderment.

Before I can truly realize what is happening, I’m again in the sink, where he washes my hair (again) and also the mask cream from my face. This time I’m a bit better at catching breaths here and there, so the possibility of drowning is a bit less concrete.

The wash concludes and I am back sitting on the chair. At this point a hot towel shows up. Ozi wraps it around my face, leaving a tiny hole so I can breathe through my nose.

While I am there thinking that it is nice to relax like that, out of nowhere my right arm gets lifted and quickly rested upon a support integrated into the chair that just slid out of nowhere. Ozi is massaging my arm!

I’m like: what is happening here today? I do not really know. Ozi moves on to the other arm and I get that one massaged too.

The no-longer-hot towel gets removed, and Ozi takes some container. I think it may contain wax, and that he is gonna put it on my hair. I think he may just do it without asking me; however, I also recognize that this whole business has been out of my control for quite some time, so I have made peace with my destiny and I am waiting for the wax.

Of course it is not! It’s cream! Not a mask cream, but some moisturizing cream that he puts on my face.

He then asks me if I want some gel. I say no to the gel. It’s time to make some choices :D.

It’s the end. I stand up and look at myself. Basically all my hair has been cut, my beard is phenomenally shiny, and I look so different from when I entered!

I pay, and just at the door, the other guy tells me “goodbye sexy!”

So here it is. I hope you enjoyed the story, I truly had fun during it.

As usual, Merry Xmas and Happy New Year! I hope you enjoy the decorations :D 




Machine learning empowered blackkiwi


A blackkiwi which tells your mood.

Blackkiwi is a Django-powered website which uses machine learning to tell if you wrote happy or sad things in your latest Facebook statuses. And it tends to be wayyy positive. But we will get there…

The genesis of blackkiwi

There were a few main things that combined in the genesis of blackkiwi.

The first was this curiosity about natural text processing and classification techniques. In particular, I wanted to write some classifiers to see how well they performed, and I also wanted to try and test something new.

But I needed some kind of application. This is usually a good trick in programming in general: if you build towards something, it is always easier to stay motivated and actually get it done, instead of giving up the hobby and ending up playing World of Tanks on the play :).

The second ingredient was to test the release process via GitLab, using automatic push via CI to a server. As a stack I wanted to use a classic dokku stack, which I’m very happy to use, because it basically brings a nice and easy way to deploy, similar to the heroku/gondor style, to your own server.

Last but not least, I wanted to test natural language processing, because I wanted to do something about my Facebook feed. Lately, owing maybe to all the political happenings like Brexit, Trump and the migration crisis, I saw an increase in posts from people being extremely racist, hate-filled and extremely violent, together with total nonsense and antiscientific claims.

The classic way would be to try to have a conversation and explain that these positions are unacceptable and also dangerous for the whole community, but this usually ends up in a fight with the trolls, and TBH, I don’t think it is a winnable fight.

However, I thought it could be a good idea to get something going where you can take Facebook statuses and see where you basically land. Were your statements close to, for example, a racist individual, or were you closer to intelligent and inspiring characters?

Of course this is quite complicated to build, but I decided that I had to start somewhere, so I settled on an application able to tell if you were happy or sad on Facebook to start.

Blackkiwi: how does it work?

Conceptually, there are three main parts:

  1. the kiwi goes to Facebook to get the user’s moods after being authorized (it is a good kiwi)
  2. the kiwi works very hard to try to understand if you were happy or not, and writes it down
  3. the kiwi then draws these moods on a plot, to show your mood in a timeseries fashion.

It’s a pretty clever and hardworking kiwi, our own. I’m not sure what its name should be; feel free to propose one in the comments, if you like.

The computation stack: the classifiers

Two problems needed to be solved here:

  1. we needed a way to connect to Facebook and get the moods out in some form, so we could feed them to the classifiers
  2. we had to build, train and then load the classifiers

The first part of the job was quite a new adventure. I had never used the Facebook Graph API or created an app on that platform before, so there was a little bit of learning. At the end of several experimentations I settled on facebook-sdk. A nice piece of software which does most of the job.

For example, our collector class looks like this:

# -*- coding: utf-8 -*-
import logging
import argparse

import facebook
import requests

# create logger
logger = logging.getLogger(__name__)


class FBCollector(object):
    def __init__(self, access_token, user):
        self.graph = facebook.GraphAPI(access_token)
        self.profile = self.graph.get_object(user)
        logger.debug("Collector initialized")

    def collect_all_messages(self, required_length=50):
        """Collect the data from Facebook.

        Returns a list of dictionaries. Each item is of the form:
        {'message': '<message text here>',
         'created_time': '2016-11-12T22:59:25+0000',
         'id': '10153812625140426_10153855125395426'}
        The `id` is a Facebook `id` and it is always the same.

        :return: collected_data, a list of dictionaries with keys:
                 `message`, `created_time` and `id`
        """
        logger.debug("Message collection start.")
        collected_data = []
        request = self.graph.get_connections(self.profile['id'], 'posts')
        while len(collected_data) < required_length:
            try:
                collected_data.extend(request['data'])
                # going to the next page
                logger.debug("Collected so far: {0} messages. Going to next page...".format(len(collected_data)))
                request = requests.get(request['paging']['next']).json()
            except KeyError:
                # When there are no more pages (['paging']['next']), break from
                # the loop and end the collection.
                logger.debug("No more pages. Collection finished.")
                break

        return collected_data


if __name__ == "__main__":
    # create console handler and set level to debug
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    logger.setLevel(logging.DEBUG)
    # create formatter
    formatter = logging.Formatter('%(asctime)s|%(name)s:%(lineno)d|%(levelname)s - %(message)s')
    # add formatter to ch
    ch.setFormatter(formatter)
    # add ch to logger
    logger.addHandler(ch)
    parser = argparse.ArgumentParser(description='Collect Facebook messages.')
    parser.add_argument('access_token', help='You need a temporary access token. Get one from')
    parser.add_argument('--user', help="user with public messages you want to parse", default="BillGates")
    args = parser.parse_args()
    fb_collector = FBCollector(args.access_token, args.user)
    messages = fb_collector.collect_all_messages()
    logger.info("Collected corpus with {0} messages".format(len(messages)))

As you can see, you need a token to collect the messages. This token is obtained from the profile of the Facebook user, which will let you collect his/her statuses. Note that you need permissions to do this for real, and your app needs to be approved by Facebook; however, you can get the messages of a public user, like Bill Gates in the example, and then get them out in a nicely organized list of dictionaries.

So we have a way to connect to Facebook and, given we have the right token(tm), we can get the status updates out. We’ve got to classify them now…


The classifiers bit is quite complex. First we need to find a corpus, then we need to create the classifiers, then train them, then save them, so we can later load them up and use them.

We built the classifiers using the nice NLTK library, together with scikit-learn. All the classifiers perform pretty similarly, and I decided to go for a voted classifier, which decides if the text is positive or negative using majority consensus. Instead of using pickle to save them, we are using dill, ’cause it plays well with classes.

Once they have been trained, we can load them up and use them. This is the loading function:

def load_classifier(self):
    naive_bayes_classifier = dill.load(open(self.naive_classifier_filename, "rb"))
    MNB_classifier = dill.load(open(self.multinomialNB_filename, "rb"))
    BernoulliNB_classifier = dill.load(open(self.bernoulli_filename, "rb"))
    LogisticRegression_classifier = dill.load(open(self.logistic_regression_filename, "rb"))
    SGDClassifier_classifier = dill.load(open(self.sgd_filename, "rb"))
    LinearSVC_classifier = dill.load(open(self.linear_svc_filename, "rb"))
    NuSVC_classifier = dill.load(open(self.nu_svc_filename, "rb"))

    voted_classifier = VoteClassifier(naive_bayes_classifier,
                                      MNB_classifier,
                                      BernoulliNB_classifier,
                                      LogisticRegression_classifier,
                                      SGDClassifier_classifier,
                                      LinearSVC_classifier,
                                      NuSVC_classifier)
    self.voted_classifier = voted_classifier
    self.word_features = dill.load(open(self.word_features_filename, "rb"))
    logger.info("Classifiers loaded and ready to use.")

and the analyzer API looks like this:

analyzer = Analyzer()
classified, confidence = analyzer.analyze_text("today is a good day! :)")

The computation stack: the web


Yep. Django. Always. 🙂

These are the installed apps in the blackkiwi project:

    # our stuff
    'moody', # we are first so our templates get picked first instead of allauth

All the integration with Facebook is happily handled by django-allauth, which works pretty well, and I suggest you take a look.

For example, in this case I wanted to override the templates already provided by django-allauth, so I put our app moody before allauth, so that our own templates get found and picked up by the template loaders before the ones allauth provides.

That way, once the user authorizes us, we can pick the right(tm) token, collect his/her messages, and then score them with the classifiers.

Then we plot them on the site using D3.js, like you can see here.

The deploy is done using GitLab, with a testing/staging/production system, using GitLab CI. But we leave this for another post, ’cause this one is way too long anyway.

Have fun!