Recently I have changed my backup solution from SpiderOak to Tresorit. I had been very happy with SpiderOak since I started with them around 2009, but last year backups and sync started to fail: backups taking ages or not finishing at all, and so on. Support response times were not good enough either, and they never found a proper fix for my problems, so I finally decided to move my business elsewhere. The chosen one was Tresorit, a Swiss-based company offering the two things most important to me: de-duplication and client-side encryption.
Both solutions work on Linux, but Tresorit needs a GUI to run (SpiderOak supports a headless mode). This was a problem, as I wanted to run the Tresorit client on headless VPS servers. To add a kind of pseudo-headless support to the Tresorit client, I decided to use Xpra, a multi-platform (Microsoft Windows, Linux, Mac) screen and application forwarding system, or as they say on their web page, “screen for X11”.
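To illustrate the idea, here is a minimal sketch of starting an application inside an Xpra session on the server. The display number (:100) and the client binary name are assumptions, not my actual setup:

```shell
#!/bin/sh
# Start an app inside a virtual X display managed by xpra (a minimal sketch).
# Display number :100 and the binary name passed in are assumptions.
start_headless() {
    # --start-child runs the app inside the new display;
    # --exit-with-children tears the session down when the app exits
    xpra start :100 --start-child="$1" --exit-with-children
}

# On the VPS:      start_headless tresorit
# From a desktop:  xpra attach ssh:user@vps:100   # view the forwarded GUI
```

The session keeps running when you detach, which is what makes the client effectively headless between visits.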
During the last few weeks I have been interviewed for several DevOps positions. In two of them I had to fill in a skills checklist, and in the other one I was given an exercise to solve and send back by email. I think these checklist interviews are not a good fit for DevOps positions, especially if the checklists are not kept up to date. Let’s see why…
Sometimes you have to deal with servers that you don’t know anything about:
- You are a short-term IT consultant with no previous knowledge of the environment.
- The CMDB is out of order.
- You are in a DR situation.
- Or the main administrator is simply not there.
And you need:
- Run commands in parallel
- Get info from many servers at a time
- Troubleshoot DNS problems
- Check how many servers are up and running
On my systems I use two orchestrators, MCollective and SaltStack (both configured automatically using Puppet), and they fulfil my needs. But let’s look at a quick way to get an orchestrator up and running.
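As a stopgap before a real orchestrator is in place, plain ssh in a loop already covers the needs listed above. A minimal sketch (the host names are hypothetical placeholders):

```shell
#!/bin/sh
# Fan a command out to a list of hosts in parallel (minimal sketch;
# the host names below are hypothetical placeholders).
HOSTS="web01 web02 db01"

run_all() {
    # $1 = remote command; each ssh runs in the background, then we wait
    for host in $HOSTS; do
        ssh -o ConnectTimeout=5 "$host" "$1" &
    done
    wait
}

# Example: check which servers are up and their load
# run_all uptime
```

The short ConnectTimeout keeps one dead host from stalling the whole run, which matters when you are checking how many servers are actually up.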
I have been working with DigitalOcean for several months; on average, DigitalOcean deploys your VPS server in 55 seconds. After the server is deployed, the manual, error-prone, and boring configuration process still remains.
As I use Puppet to configure all my servers, I have created provisioningDO, a Rakefile script (based on John Arundel’s book Puppet 3 Cookbook) that deploys and configures my servers in 4 minutes and 15 seconds. That means that after 4 minutes and 15 seconds, my servers are ready for production.
provisioningDO uses Jack Pearkes’ tugboat CLI tool, so a fully installed and configured tugboat is required. It shouldn’t take you more than 5-10 minutes to get a working, ready-to-go tugboat installation 🙂
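The overall flow can be sketched roughly as follows. This is not the actual provisioningDO code, just the shape of a create-then-configure run; check `tugboat help` for the real options on your version:

```shell
#!/bin/sh
# Rough shape of a create-then-configure provisioning run (not the real
# provisioningDO script; subcommand usage is a sketch, verify with
# `tugboat help`).
provision() {
    name="$1"
    tugboat create "$name"   # create the droplet (image/size/region from tugboat config)
    tugboat wait "$name"     # block until the droplet reports an active state
    tugboat ssh "$name"      # log in; the puppet run happens over this connection
}
```

The `wait` step is what lets the whole thing run unattended: without it, the configuration step would race against the droplet still booting.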
Today I have released my first Puppet module to the public:
It installs and configures knockd, a port-knocking daemon.
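For context, port knocking means hitting a secret sequence of closed ports before the real service port is opened to you. A client-side sketch (the host name and port sequence below are made-up examples):

```shell
#!/bin/sh
# Client side of a port-knocking setup: hit each port in the secret
# sequence once, after which the firewall (driven by knockd) opens the
# real service port. Host and ports below are made-up examples.
knock() {
    host="$1"; shift
    for port in "$@"; do
        nc -z -w1 "$host" "$port"   # one short TCP "knock" per port
    done
}

# knock server.example.com 7000 8000 9000 && ssh server.example.com
```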
Several months ago I finally got the Wake-on-LAN (WOL) feature of my RTL8111/8168B NIC working. The problem was that a newer driver (other than the one provided by Debian) and a special PCI configuration were needed.
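Once the right driver is in place, the NIC still has to be told to listen for magic packets; with ethtool that is one line (the interface name eth0 is an assumption):

```shell
#!/bin/sh
# Enable magic-packet Wake-on-LAN on a NIC (interface name is an assumption).
enable_wol() {
    ethtool -s "$1" wol g        # g = wake on MagicPacket
    ethtool "$1" | grep Wake-on  # should now report "Wake-on: g"
}

# enable_wol eth0   # needs root
```

Note that many drivers reset this on reboot, so the command usually ends up in an interface hook or a systemd unit.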
The other problem I had to deal with was the configuration of the ADSL router (a Comtrend HG532c, the one provided by the Spanish ISP Jazztel):
- Open the required ports: this was an easy one, just opening ports 7 to 9 and forwarding them to the server we want to wake from the Internet.
- Make the router remember the server’s MAC/IP address tuple: that was easy too, but some manual work was needed, as the ARP table is flushed whenever the router is restarted. 🙁
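Once ports 7 to 9 reach the server, waking it is just a matter of sending a UDP “magic packet”: 6 bytes of 0xFF followed by the target MAC address repeated 16 times. A minimal sketch that builds one (the MAC address in the example is made up; `xxd` is assumed to be installed):

```shell
#!/bin/sh
# Build a Wake-on-LAN magic packet on stdout: 6 bytes of 0xFF followed by
# the target MAC repeated 16 times (102 bytes total). Requires xxd to turn
# the hex string into raw bytes. The example MAC is made up.
magic_packet() {
    mac=$(printf '%s' "$1" | tr -d ':')
    # repeat the 12-hex-digit MAC 16 times after the ffffffffffff header
    printf 'ffffffffffff%s' \
        "$(printf "${mac}%.0s" 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16)" \
        | xxd -r -p
}

# With bash, it can be sent to the broadcast address on port 9:
# magic_packet 'aa:bb:cc:dd:ee:ff' > /dev/udp/192.168.1.255/9
```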
In my current job I recently had to change some configuration and restart more than 600 IP phones. To perform such a titanic task I created a quick and dirty script using expect. It worked like a charm and made me think about automating the way I set the ARP table in my Comtrend HG532c ADSL router.
A year ago I couldn’t connect to my office’s network using my VPN client. The reason was that my p12 certificate had expired. AFAIK, IPsec cannot renew certificates automatically the way the Windows VPN client does. To make it work I needed to renew the certificate using the Windows client and then migrate the p12 certificate to a Linux/IPsec-friendly format. As I was in a little hurry I tried installing the Linux Citrix client
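The p12-to-PEM migration itself is a one-liner with openssl; a sketch (the file names and passphrase handling are illustrative, not my actual setup):

```shell
#!/bin/sh
# Convert a PKCS#12 bundle (as exported by the Windows client) into a PEM
# file holding both the certificate and the unencrypted private key, which
# Linux IPsec stacks can consume. File names here are illustrative.
p12_to_pem() {
    # $1 = input .p12, $2 = import passphrase, $3 = output .pem
    openssl pkcs12 -in "$1" -passin "pass:$2" -nodes -out "$3"
}

# p12_to_pem office-vpn.p12 'secret' office-vpn.pem
```

The `-nodes` flag leaves the private key unencrypted in the output, so the resulting PEM should be kept with restrictive permissions.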
Today I attended the Ubuntu Cloud webcast, presented by Mark Shuttleworth (Canonical founder) and Stephen O’Grady from RedMonk.
Since 2005 I have hosted this web page with Bluehost, a cPanel-based hosting company: first with Joomla, and recently migrated to WordPress.
Bluehost lets you download a daily, weekly, or monthly backup from your cPanel control panel, but manual intervention is needed:
- Log in to the control panel
- Navigate to the backup page
- Perform the backup
- Download it to your local computer.
This is a manual, time-consuming task, and of course you should not forget to do it!
In this post I am going to show my automated method for backing up files and databases, using:
- Crontab for automatic backups.
- Public/private keys for passwordless ssh connections.
- Rsync for synchronizing directories between the remote and local servers. This reduces bandwidth: if a file has already been copied to the local server, no data transfer is needed.
- Mysqldump for dumping the MySQL databases to a local file.
- SpiderOak for data deduplication and remote backup.
Some previous knowledge is needed to understand how it works; anyway, there are some useful links to help with that. 🙂
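Put together, the rsync and mysqldump pieces boil down to a short script run from cron; a minimal sketch (the host, paths, and database name are hypothetical placeholders):

```shell
#!/bin/sh
# Nightly pull-backup sketch: rsync the web root, dump MySQL over ssh.
# Host, paths, and database name are hypothetical placeholders.
REMOTE=user@example.com
SRC=/home/user/public_html
DEST=/backup/www
DBDUMP=/backup/db/wordpress.sql

backup() {
    # rsync only transfers changed files, so bandwidth stays low
    rsync -az --delete -e ssh "$REMOTE:$SRC/" "$DEST/"
    # dump the database over the same key-based ssh connection
    ssh "$REMOTE" 'mysqldump --single-transaction wordpress' > "$DBDUMP"
}

# crontab entry for a daily 03:30 run:
# 30 3 * * * /usr/local/bin/backup.sh
```

The key-based ssh setup is what lets this run unattended from cron, and SpiderOak then picks up the local `/backup` directory for de-duplicated off-site storage.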