
How to transfer files locally

Transferring files is still hard in 2015. And slow.

When I decided to go ahead with my “new computer”, I faced the classic problem: transferring all my data from the old computer to the new one.

So I installed an SSH server on my old computer; both machines were connected to the same wireless network, so I launched rsync to recursively copy my whole home directory from the old laptop to the new one. Not so fast, cowboy!

Unfortunately, this did not work as expected, for a number of reasons:

  • the packets were continuously dropped by the router: the route to the host seemed to become unavailable at times, causing rsync to stall
  • re-launching the command overwrote the files in my home directory. My old computer was running 12.04 LTS while the new one is on 14.04, so every time a program upgraded one of its preference files, rsync overwrote it with the old version. And as soon as the program was launched again, the files changed once more.

So I needed a different approach.

I connected the two computers with a cable, created two wired connections, gave each machine a different IP, and used that link to do the transfer (once I had switched off the wireless). Win!!

The details of how to do it are in this Stackoverflow answer, so I won’t repeat them here.
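The gist of it, as a rough sketch (the interface names, addresses and usernames below are assumptions; adjust them to your machines):

# on the old laptop, give the wired interface a static IP
sudo ip addr add 10.0.0.1/24 dev eth0

# on the new laptop, do the same with a different address
sudo ip addr add 10.0.0.2/24 dev eth0

# then, from the new laptop, pull the whole home directory over the cable
rsync -avz --progress olduser@10.0.0.1:/home/olduser/ /home/newuser/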

Go ahead and transfer your files fast!

Running Ubuntu on an old MacBook Pro

Ubuntu on a Mac, because of reasons

Prologue

My computer has served me well up to now, however it has started to show its age. It is an old Dell XPS 13 from 2008. The… ehm, beast has one CPU, virtualized to two, and a whopping 4 GB of RAM (which is not bad). However, the right hinge cracked not long ago, and it was becoming too slow when playing with new toys (Docker, I’m looking at you).

This, together with the fact that the operating temperature was always around 80-90 degrees, had turned it into a very cumbersome machine to work on. An external monitor, keyboard and mouse could have improved the situation a little, however I tend to always work on the go, and I do not have a proper desk, mostly working in the kitchen.

Therefore I started to look around the market for a replacement. The problem was that I wanted a unicorn laptop, which does not exist due to the laws of physics.

Let me explain: I wanted the latest badass video card from NVIDIA and the craziest Intel CPU, and the laptop should also run both quiet and cool.

Now, these sound like pretty insane specs, and they are. There are laptops out there, especially gaming ones, that have super powerful CPUs and video cards, however they run pretty hot and the fans are always on, making them quite noisy and not exactly a pleasure to work with.

On top of this, the battery also takes a beating, due to the extra power drawn by these hungry components.

Therefore nothing was fitting the bill.

Ubuntu on the MacBook Pro

Given that I was not really falling in love with any computer out there, I started to look for other solutions. It turned out that my girlfriend had an old MacBook Pro (2010) that she was not using any more, because she had upgraded to a MacBook Air a few years ago.

This left a decently powerful computer (4 virtualized CPUs, 8 GB of RAM) available, so I decided to test drive it.

Using OSX was never an option because: a) I’m not a fan, b) I really like Linux and the freedom that comes with it, c) it’s impossible to get OSX to do everything I need it to do.

I guess that having used Linux for more than 15 years as my primary system, and loving it for development and data science, has also influenced my view on this.

A few constraints I had before attacking the problem:

  1. the OSX partition had to remain and still be usable. Not a must, but a nice-to-have
  2. the data attached to that OS had to be preserved, and having access to it from OSX would be the best solution

We are going for dual boot (oh yeah!!!), without blowing up the whole disk (oh double yeah!!!).

This laptop sports a 500 GB hard drive, and the total space used by OSX plus the data is around 150 GB, hence I was able to carve out more or less 350 GB to use for Ubuntu as I pleased.

How I did it:

  • I’ve installed rEFIt on it to manage the dual boot (yep, it’s been abandoned, but it does the job)
  • I’ve installed Ubuntu 14.04 from a live CD (you need to hold the left Alt key, also known as the Option key, to get the machine to boot from the CD)
  • I’ve tried to use the NVIDIA driver, failed badly, gave up, and stuck with the open source nouveau driver, which does a pretty decent job.

For the partitioning I followed the same strategy I’ve written about here.

Impressions so far

  • The computer is quieter than my previous one, and a lot snappier. The fans do work, once you have installed the macfanctld package.
  • The keyboard is nice, however I’m not used to the placement of the Command and Alt keys, so I may look into re-mapping them.
  • The Fn button and the function keys (F1-F12) are inverted: to get Alt-F4 you actually have to press Alt-Fn-F4. Basically, I’m better off clicking close with the mouse (a possible fix is sketched after this list).
  • Right click is obtained by touching the trackpad with two fingers at the same time, ’cause we’ve got only one button.
  • The layout I picked does not really match the keyboard anyway, but I do not care, ’cause I’m typing from muscle memory and doing just fine (some keys produce a different symbol than the one printed on the keycap, for example).
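About the inverted function keys: a common workaround on Linux, assuming the keyboard is driven by the hid_apple kernel module, is to flip its fnmode parameter so that F1-F12 become the primary action. A sketch:

# 2 = function keys act as F1-F12 by default, media keys require Fn
echo 2 | sudo tee /sys/module/hid_apple/parameters/fnmode

# make the setting survive a reboot
echo "options hid_apple fnmode=2" | sudo tee /etc/modprobe.d/hid_apple.conf
sudo update-initramfs -u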

So I’ve gained speed and got a slightly odd keyboard and right click, which I need to relearn (there is no other option for the right click), and maybe re-map the keys. I guess if I had used this layout all along, none of this would have been an issue.

I’ll keep it, for now.

Upgrade to git 2.5.0 on Ubuntu

Git, kickass distributed VCS

Running Ubuntu 12.04 on a 2008 laptop (which I should replace, but I haven’t found a decent replacement yet) means that sometimes I don’t get the latest software by default.
For example, I just figured out that my git version was 1.9.1, while the latest available, at the time of writing, is 2.5.0, so I decided to upgrade.

Conveniently, there is a PPA that provides the latest and greatest version of git for Ubuntu. Enable it and you will have the latest git running on your system:

sudo add-apt-repository ppa:git-core/ppa
sudo apt-get update
sudo apt-get install git

Check that you have the latest:

$ git --version
git version 2.5.0

Now the big question: is upgrading really necessary? I do not really know. However, as a person who creates software, I can tell you that running up-to-date software is, in most cases, a good idea.

Pandas’ best kept secret

By default, pandas limits the printing width to 80 characters. This is fine and suits most use-cases. However, sometimes you have either very long names for your dataframe columns, or lots of columns, which results in your dataframe being split across several lines while half of your big screen is not used at all. That’s quite a lot of real estate thrown away for no good reason, and it makes the output a tad more complicated to read.

You can live with it, and all will be fine, or you can change it!
I always knew there was an easy way to change this and avoid having the lines split. Today I researched and found it, and I am sharing it here so I remember it!

Just import the module and bump the width to 180 (the default is 80):

import pandas as pd
pd.options.display.width = 180


The result is pretty cool.

Consider this dataframe:

import pandas as pd
df = pd.DataFrame({
    "very_long_column_name_for_example_purposes1": [1, 2, 3],
    "very_long_column_name_for_example_purposes2": [4, 5, 6],
    "very_long_column_name_for_example_purposes3": [1, 2, 3],
})

You can go from this:

pandas dataframe 80 width

To this:

Pandas dataframe with 180 width

Upgrading dokku to 0.3.22: some gotchas

but then I write about it

I’ve upgraded dokku to the latest master release, to make sure I was running the most recent version.

The reason for the upgrade was that I wanted to install the supervisord plugin, so that when I have to reboot my server due to an upgrade, all my applications come back to life automatically.

After the dokku upgrade, all my containers were down, so I launched the command to rebuild all of them:

dokku ps:rebuildall

Unfortunately this didn’t work as expected.

My web containers (running three apps: django/python, flask/python, wordpress/php) got deployed as expected; my databases, instead, did not come back to life.

The two plugins I am using to run my databases are: dokku-pg-plugin and dokku-md-plugin.

Neither plugin offers a clear way to restart the database containers, but I found a workaround that worked for me. It’s different for each plugin.

For mariadb you have to pretend to re-create the database, which will reuse your old database container and just re-attach to it:

dokku mariadb:create <olddbname>

For postgresql, instead, you have to re-link the old database:

dokku postgresql:link <myapp> <mydb>

Each of these commands should trigger an instant redeploy, and your application should be back online.

One thing to know: if you stop a command execution with Control-C, you may leave your application in a blocked state. If you then run a rebuild or any other command, you may get an error saying “Error: your application is locked”. To get rid of that, go on your server and blow away the /home/dokku/app_name/.build.lock file.

Watch out: the name of the file and/or the error message could be different, I’m just recalling them from memory.
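If you do hit the lock, removing the file by hand should be enough. A sketch, with the path as recalled above (verify it on your server before deleting):

# remove the stale build lock left behind by the interrupted command
sudo rm /home/dokku/app_name/.build.lock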

Switch private repo from github to gitlab

Hello gitlab

Github: the good part

Github is awesome for opensource software. The collaboration, the audience, and the integrations offered right now (July 2015) are very good.

You want your opensource projects to be on github, because of SEO and the ability to have them found. The features offered, like documentation integration, Pull Requests and so forth, are just too good. I have several projects there, and you can browse them here.

Github: the expensive part

However, if you are looking to also host your private repos there, github may no longer be what you are looking for.

The major problem is their pricing structure. The micro plan is $7 for 5 repos, and then it’s $12 for 10. It gets expensive very quickly.

Until today I paid for a micro plan. However, yesterday I started another project, created a repo for it, and then wanted to push it online as private. But it was my sixth repo: either I open sourced it, or I upgraded my plan from $7 to $12.

All these repos are personal projects that I may not develop any further, which however I don’t want to open source and cannot archive either. The number of collaborators on these repos is 0, or 1 at most. If they offered unlimited private repositories with a small number of collaborators, I would have considered sticking with them for my private repos.

Not an option, so I had a look around.

Looking for alternatives: Bitbucket or Gitlab?

Github’s big competitor is of course bitbucket. Back in the day bitbucket supported only mercurial, but then they also integrated support for git, so you could put your project there and be happy. Their pricing structure just counts the number of collaborators on a project, so in my case I could have all my repos with the free account.

However, at work we have been using a self-hosted gitlab for a while; it has served me pretty well so far, and I love the slick integration with GitlabCI.

GitLab is very similar to github and offers similar features: once you know that Pull Requests are called Merge Requests, you’re golden.

The cool thing is that there is a hosted version, where you can have as many private and public repos as you want.

In the end I decided to go for gitlab, due to the integration with Gitlab CI, which gives me the ability to run tests for all my private repositories, provided I supply a runner.

Of course all my opensource repos will stay on github, and if I ever open source one of these projects I will just migrate it to github.

As I said, if there were an indie developer price point (unlimited private repos with a small number of collaborators for $7), I would have stayed on github and been happy with that; however, given the circumstances and the automatic CI integration, Gitlab is my choice for now.
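For anyone doing the same move, the migration itself boils down to mirroring the repo; a rough sketch, with placeholder user and repo names:

# clone a bare mirror of the github repo (branches and tags included)
git clone --mirror git@github.com:user/myrepo.git
cd myrepo.git

# push everything to the new gitlab remote
git push --mirror git@gitlab.com:user/myrepo.git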

Handpicked wordpress plugins for your self-hosted wordpress blog

A nice pic of a boat on a lake. Not really relevant to the post, but still pleasing

Intro

With the recent move from wordpress.com to a self-hosted wordpress.org blog, I had the possibility to pick some plugins that really helped me set up the blog and make it a tad more customized and closer to my needs.

I’m gonna list them here with a small description, so it may be handy for someone else researching the subject as well.

The handpicked Plugins

  • WordPress Importer: this plugin lets you import your old wordpress.com blog into the new wordpress installation. Make sure you have `import everything` selected when you do the import, so all the images and attachments are happily downloaded and imported as well. You also have to make sure you have increased the max upload file size on your server if you are importing a very big file. This is done by changing your PHP configuration and the max upload limit in either nginx or apache, depending on what you are using (see the sketch after this list). More info on how to do this here.
  • The next plugin you want to get is JetPack from wordpress. This plugin has a lot of features that you can activate as you see fit. My favourite ones are: Publicize (automatic sharing on G+, Facebook and Twitter), Monitor, which keeps an eye on whether your site goes offline, and Photon, to serve images more quickly from their CDN.
  • Spam is always a bad thing, and BruteProtect is a way to protect yourself from it. You just activate it, and it is going to do its job.
  • Once you move to a self-hosted blog, you also have to manage backups for your site.
    A very handy plugin is UpdraftPlus – Backup/Restore, which gives you the ability to:

    1. Make automatic backups of your blog, including the database, images, themes and plugins
    2. Upload your backups to a third-party service, for example DropBox
    3. Configure a backup schedule, plus the number of old backups you want to keep. My pick was 10 backups on a weekly schedule.
    4. Restore your old backups with a single click.

    It’s very well designed and it works like a charm. Totally recommended.

  • To make sure you write to the point and keep your posts interesting for search engines as well, WordPress SEO is a good candidate. Although the title parser looks for only one keyword, so there will always be some disagreement between the plugin and a sane title, it’s extremely handy for keeping the sitemap up to date and automatically signalling google when a new post pops up. Handy tool.
  • Due to the amount of code I tend to post, it’s useful to have a nice way to present it, with proper highlighting. For this I’ve picked Enlighter – Customizable Syntax Highlighter, which does a very good job and comes with themes that integrate nicely with the current palette of your site.
  • Last but not least, the Disqus Comment System is a nice and, in my opinion, superior way to enable comments on your posts. It offers an import function to transfer all your old wordpress.com comments to the disqus system, which is a pretty nice thing to have.
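As for the upload limit mentioned in the WordPress Importer entry above, here is a sketch of the changes involved; the paths and the 64M value are assumptions for a PHP-FPM plus nginx setup on Ubuntu, so adapt them to your stack:

# raise the PHP upload limits
sudo sed -i 's/^upload_max_filesize.*/upload_max_filesize = 64M/' /etc/php5/fpm/php.ini
sudo sed -i 's/^post_max_size.*/post_max_size = 64M/' /etc/php5/fpm/php.ini

# for nginx, also add to the server block: client_max_body_size 64M;

# reload the services to pick up the changes
sudo service php5-fpm restart
sudo service nginx reload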

So there it is, some of the plugins I’m using on this website, which you may, or may not, find useful for your own site.


Dokku Environment variable special characters

Note to self: with dokku 0.3.1.7, special bash characters in environment variables do not get parsed properly, and they break everything. Do not use them.

For example:

# this will fail
BLAH=!qwekcxzmpeqd('}

# this won't
BLAH=ewq989u0caad909ad

The amount of randomness involved is smaller, but at least it works. Most likely upgrading to 0.3.18 would fix this.
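In the meantime, a way to sidestep the problem is to only use values made of safe characters; for example, a hex-only secret (the app and variable names below are placeholders):

# openssl rand -hex emits only [0-9a-f], so nothing needs escaping
dokku config:set myapp SECRET_KEY=$(openssl rand -hex 32)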

It took me 2 hours to figure this out, while my pushes to my dokku server were being denied.

Clean up old kernels

Do you want to save 15 GB of space?

I have an old laptop from 2008 which is running ubuntu. Every time a new kernel is released it gets installed, however the older kernels and their images remain on disk and do not get automatically uninstalled. I guess this is a safety feature, however when the installation was done only once, and years and years of new kernels have stacked up, the space taken may become excessive, and big enough that reclaiming it is a good idea.

Starting point

As you can see below, my / partition was quite full:

mattions@triton:~$ df -h
Filesystem               Size  Used Avail Use% Mounted on
/dev/sda1                 46G   43G  1.3G  98% /
udev                     2.0G  4.0K  2.0G   1% /dev
tmpfs                    396M  1.1M  395M   1% /run
none                     5.0M  8.0K  5.0M   1% /run/lock
none                     2.0G  260K  2.0G   1% /run/shm
cgroup                   2.0G     0  2.0G   0% /sys/fs/cgroup
/dev/sda6                176G  166G  1.7G 100% /home
/home/mattions/.Private  176G  166G  1.7G 100% /home/mattions

The images stored in the boot partition were also taking quite a bit of space:

mattions@triton:~$ du /boot -sh
2.3G /boot

Purge the old kernels

To purge the old kernels, you can do it by hand via the ubuntu software center or synaptic, or use a script to do it for you. After a bit of googling, I discovered the following script, aptly named purge-old-kernels, which I have also uploaded as a gist, just so I don’t lose it. Feel free to download it and use it if you want.
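For reference, this is roughly what such a script boils down to; a minimal sketch of my own, which assumes standard Ubuntu kernel package naming and keeps the running kernel (review the list before confirming!):

# version of the running kernel, without the -generic suffix
CURRENT=$(uname -r | sed 's/-generic//')

# every installed kernel image/header except the running one
PACKAGES=$(dpkg -l 'linux-image-[0-9]*' 'linux-headers-[0-9]*' 2>/dev/null \
    | awk '/^ii/ {print $2}' | grep -v "$CURRENT")

echo "Will purge: $PACKAGES"
sudo apt-get purge $PACKAGES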

When I ran the script, this is the list of kernels that would be eliminated in my case (yours may differ):


mattions@triton:~$ sudo bash Desktop/purge-old-kernels.sh
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
  linux-headers-3.13.0-32 linux-headers-3.13.0-34 linux-headers-3.13.0-35
  linux-headers-3.13.0-43 linux-headers-3.13.0-39 linux-headers-3.13.0-46
  linux-headers-3.13.0-49 linux-headers-3.11.0-17 linux-headers-3.11.0-18
Use 'apt-get autoremove' to remove them.
The following packages will be REMOVED
  linux-generic-lts-saucy* linux-headers-3.11.0-17-generic* linux-headers-3.11.0-18-generic*
  linux-headers-3.11.0-19-generic* linux-headers-3.11.0-20-generic* linux-headers-3.11.0-22-generic*
  linux-headers-3.11.0-23-generic* linux-headers-3.11.0-24-generic* linux-headers-3.11.0-26-generic*
  linux-headers-3.13.0-32-generic* linux-headers-3.13.0-33-generic* linux-headers-3.13.0-34-generic*
  linux-headers-3.13.0-35-generic* linux-headers-3.13.0-36-generic* linux-headers-3.13.0-37-generic*
  linux-headers-3.13.0-39-generic* linux-headers-3.13.0-43-generic* linux-headers-3.13.0-44-generic*
  linux-headers-3.13.0-45-generic* linux-headers-3.13.0-46-generic* linux-headers-3.13.0-48-generic*
  linux-headers-3.13.0-49-generic* linux-headers-3.13.0-51-generic* linux-headers-3.13.0-52-generic*
  linux-headers-3.2.0-23-generic* linux-headers-3.2.0-24-generic* linux-headers-3.2.0-25-generic*
  linux-headers-3.2.0-26-generic* linux-headers-3.2.0-27-generic* linux-headers-3.2.0-29-generic*
  linux-headers-3.2.0-30-generic* linux-headers-3.2.0-31-generic* linux-headers-3.2.0-32-generic*
  linux-headers-3.2.0-33-generic* linux-headers-3.2.0-34-generic* linux-headers-3.2.0-35-generic*
  linux-headers-3.2.0-36-generic* linux-headers-3.2.0-37-generic* linux-headers-3.2.0-38-generic*
  linux-headers-3.2.0-39-generic* linux-headers-3.2.0-40-generic* linux-headers-3.2.0-41-generic*
  linux-headers-3.2.0-43-generic* linux-headers-3.2.0-44-generic* linux-headers-3.2.0-45-generic*
  linux-headers-3.2.0-48-generic* linux-headers-3.2.0-49-generic* linux-headers-3.2.0-51-generic*
  linux-headers-3.2.0-52-generic* linux-headers-3.2.0-53-generic* linux-headers-3.2.0-54-generic*
  linux-headers-3.2.0-55-generic* linux-headers-3.2.0-56-generic* linux-headers-3.2.0-57-generic*
  linux-headers-3.2.0-58-generic* linux-headers-3.2.0-59-generic* linux-headers-3.2.0-60-generic*
  linux-headers-3.2.0-61-generic* linux-headers-3.2.0-63-generic* linux-headers-3.2.0-64-generic*
  linux-headers-3.2.0-65-generic* linux-headers-3.2.0-67-generic* linux-headers-3.2.0-68-generic*
  linux-headers-3.2.0-69-generic* linux-headers-3.2.0-70-generic* linux-headers-3.2.0-74-generic*
  linux-headers-3.2.0-75-generic* linux-headers-3.2.0-76-generic* linux-headers-3.2.0-77-generic*
  linux-headers-3.2.0-79-generic* linux-headers-3.2.0-80-generic* linux-headers-3.2.0-82-generic*
  linux-headers-3.2.0-83-generic* linux-headers-3.8.0-36-generic* linux-headers-generic-lts-saucy*
  linux-image-3.11.0-17-generic* linux-image-3.11.0-18-generic* linux-image-3.11.0-19-generic*
  linux-image-3.11.0-20-generic* linux-image-3.11.0-22-generic* linux-image-3.11.0-23-generic*
  linux-image-3.11.0-24-generic* linux-image-3.11.0-26-generic* linux-image-3.13.0-32-generic*
  linux-image-3.13.0-33-generic* linux-image-3.13.0-34-generic* linux-image-3.13.0-35-generic*
  linux-image-3.13.0-36-generic* linux-image-3.13.0-37-generic* linux-image-3.13.0-39-generic*
  linux-image-3.13.0-43-generic* linux-image-3.13.0-44-generic* linux-image-3.13.0-45-generic*
  linux-image-3.13.0-46-generic* linux-image-3.13.0-48-generic* linux-image-3.13.0-49-generic*
  linux-image-3.13.0-51-generic* linux-image-3.13.0-52-generic* linux-image-3.2.0-23-generic*
  linux-image-3.2.0-24-generic* linux-image-3.2.0-25-generic* linux-image-3.2.0-26-generic*
  linux-image-3.2.0-27-generic* linux-image-3.2.0-29-generic* linux-image-3.2.0-30-generic*
  linux-image-3.2.0-31-generic* linux-image-3.2.0-32-generic* linux-image-3.2.0-33-generic*
  linux-image-3.2.0-34-generic* linux-image-3.2.0-35-generic* linux-image-3.2.0-36-generic*
  linux-image-3.2.0-37-generic* linux-image-3.2.0-38-generic* linux-image-3.2.0-39-generic*
  linux-image-3.2.0-40-generic* linux-image-3.2.0-41-generic* linux-image-3.2.0-43-generic*
  linux-image-3.2.0-44-generic* linux-image-3.2.0-45-generic* linux-image-3.2.0-48-generic*
  linux-image-3.2.0-49-generic* linux-image-3.2.0-51-generic* linux-image-3.2.0-52-generic*
  linux-image-3.2.0-53-generic* linux-image-3.2.0-54-generic* linux-image-3.2.0-55-generic*
  linux-image-3.2.0-56-generic* linux-image-3.2.0-57-generic* linux-image-3.2.0-58-generic*
  linux-image-3.2.0-59-generic* linux-image-3.2.0-60-generic* linux-image-3.2.0-61-generic*
  linux-image-3.2.0-63-generic* linux-image-3.2.0-64-generic* linux-image-3.2.0-65-generic*
  linux-image-3.2.0-67-generic* linux-image-3.2.0-68-generic* linux-image-3.2.0-69-generic*
  linux-image-3.2.0-70-generic* linux-image-3.2.0-74-generic* linux-image-3.2.0-75-generic*
  linux-image-3.2.0-76-generic* linux-image-3.2.0-77-generic* linux-image-3.2.0-79-generic*
  linux-image-3.2.0-80-generic* linux-image-3.2.0-82-generic* linux-image-3.2.0-83-generic*
  linux-image-3.8.0-36-generic* linux-image-generic-lts-saucy*
0 to upgrade, 0 to newly install, 149 to remove and 1 not to upgrade.
After this operation, 13.0 GB disk space will be freed.
Do you want to continue [Y/n]?

Results

After I chose yes, it took quite a while, but in the end it was worth it:

mattions@triton:~$ df -h
Filesystem               Size  Used Avail Use% Mounted on
/dev/sda1                 46G   28G   17G  63% /
udev                     2.0G  4.0K  2.0G   1% /dev
tmpfs                    396M  1.1M  395M   1% /run
none                     5.0M  8.0K  5.0M   1% /run/lock
none                     2.0G   37M  1.9G   2% /run/shm
cgroup                   2.0G     0  2.0G   0% /sys/fs/cgroup
/dev/sda6                176G  166G  1.7G 100% /home
/home/mattions/.Private  176G  166G  1.7G 100% /home/mattions

And the boot partition is down to just 70 MB:

mattions@triton:~$ du /boot -sh
70M /boot

So / usage went from 43 GB down to 28 GB, and the boot partition itself from 2.3 GB to 70 MB.

Pretty good I think.
