On March 8th, my Ubuntu membership will expire. I’ve been getting email notifications for a few days, and I’ve decided not to renew my membership. Ubuntu introduced me to open source. Thank you for the great operating system and the sense of community that I’ve had for a few years. I’ve made a lot of friends and I’ve had a lot of mentors who’ve helped me become a better person.
I won’t disappear entirely – I will still be in a few IRC channels and help in whatever little way I can.
Today I learned something about Ansible debugging from benno on #ansible. Occasionally, a command can get stuck, especially if it’s waiting for input. You can’t fix this until you recognize what’s going on and see the prompt. In other words, you want to see the stdout and stderr on the target machine. Here’s what you do:
Run ansible with -vvv.
Log in to the remote host where the command is being executed.
Find the ansible process executing the command and kill it.
The stdout and stderr should be printed to the console where ansible was running.
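The steps above can be sketched as a small script to run on the target host. This is a hypothetical sketch: the process pattern is an assumption based on the temporary ".ansible/tmp/ansible-tmp-..." directory modules usually run out of, and you may need to adjust it (and the signal) for your setup.

```shell
#!/bin/sh
# Find candidate stuck Ansible module processes on the target host.
# The bracket in the pattern keeps pgrep from matching this script itself.
for pid in $(pgrep -f 'ansible[-]tmp' || true); do
    echo "terminating stuck ansible process $pid"
    # SIGTERM lets the module wrapper exit; ansible then prints the
    # command's captured stdout/stderr on the controller's console
    kill "$pid"
done
echo "done searching for stuck processes"
```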
Centre for Internet and Society celebrated their 5-year anniversary with an exhibition at their Bangalore and Delhi offices and a series of talks in Bangalore. I was there on Tuesday and managed to spend some time at the exhibition and attend the talks.
The exhibition showed off some of the work that CIS has been doing, along with work by several independent artists. The pieces that particularly stuck in my memory were Tara Kelton’s and Sharath’s.
Later in the day, Lawrence Liang talked about the Encyclopedia of Indian cinema. It was a very interesting talk, especially for me, since it encompasses open data, open source software, and copyright issues: a convergence of a lot of my interests 🙂 Lawrence talked about what they’ve built, the problems they’ve faced, and how the internet as a medium for a film encyclopedia is very powerful, but is limited by the legal issues surrounding copyright.
This is a series of posts on migration from Apache and MySQL to Nginx+uwsgi and PostgreSQL. In this post, I’ll be detailing the steps we took to migrate the database from MySQL to PostgreSQL, with as little downtime as possible. Please leave a comment if you have suggestions!
One-time Pre-migration Steps
UTF-8
The default encoding on PostgreSQL is SQL_ASCII, and you probably want UTF-8. If you don’t know what you want, you want UTF-8 (trust me). The easiest way is to blow away the default cluster and re-create it (thanks, jacobian!):
sudo pg_dropcluster --stop 9.1 main
sudo pg_createcluster --start -e UTF-8 9.1 main
Make PostgreSQL listen on all interfaces
Edit /etc/postgresql/9.1/main/postgresql.conf and ensure PostgreSQL is listening on all interfaces.
listen_addresses = '0.0.0.0'
Allow access to PostgreSQL from the old server
Edit /etc/postgresql/9.1/main/pg_hba.conf and add an entry for the old server (where 123.123.123.123 is the IP address of the old server).
host all all 123.123.123.123/32 md5
Install client libraries on the old server
We use SQLAlchemy for database access, so all I had to do was apt-get install python-psycopg2.
Creating Users and Databases
Our process is to create a user for each app and have that app’s database be owned by that user; we have a script that automates creating the user and the database.
Create the user and database on the new server with that script. Remember to set a password for this new user.
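A hypothetical sketch of what such a script might look like (the app name is made up, and the commands are echoed as a dry run; drop the "echo" prefixes to actually execute them):

```shell
#!/bin/sh
# Per-app setup: one role per app, and a database of the same name
# owned by that role. Dry run: commands are echoed, not executed.
APP="${1:-myapp}"
echo sudo -u postgres createuser --pwprompt --no-superuser --no-createdb --no-createrole "$APP"
echo sudo -u postgres createdb --owner="$APP" --encoding=UTF-8 "$APP"
```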
Exporting
The most worrisome bit about the whole migration was exporting the data from MySQL and importing it into PostgreSQL. We used mysql2psql, and it didn’t give us much trouble except that floats got a little messed up. My personal recommendation is to not use real but numeric(7,4), with the precision adjusted for what you need (this particular definition is what we use for our lat/long columns).
First, run mysql2psql on your command line; this will create the config file.
Now edit the mysql2psql.yml file and add the appropriate entries. Here’s what ours looked like:
mysql:
  hostname: localhost
  port: 3306
  socket: /var/run/mysqld/mysqld.sock
  username: mysuperuser
  password: mypassword
  database: mydb

destination:
  # if file is given, output goes to file, else postgres
  file: mydb.sql
  postgres:
    hostname: localhost
    port: 5432
    username: mysql2psql
    password:
    database: mysql2psql_test

# if tables is given, only the listed tables will be converted. leave empty to convert all tables.
#tables:
#- table1
#- table2

# if exclude_tables is given, exclude the listed tables from the conversion.
#exclude_tables:
#- table3
#- table4

# if supress_data is true, only the schema definition will be exported/migrated, and not the data
supress_data: false

# if supress_ddl is true, only the data will be exported/imported, and not the schema
supress_ddl: false

# if force_truncate is true, forces a table truncate before table loading
force_truncate: false
When you run mysql2psql again, it will export the database mydb into mydb.sql. Before we did that, we removed this particular site from /etc/apache2/sites-enabled and restarted Apache; we didn’t want the SQL file to go stale as soon as it was exported. This is where the downtime starts.
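The freeze-then-export sequence looked roughly like this. The site name is hypothetical, and everything is echoed as a dry run so nothing here actually touches Apache; drop the "echo" prefixes to run it for real.

```shell
#!/bin/sh
# Dry run of the start of the downtime window: take the site offline
# so the dump can't go stale, then export.
SITE="mysite"
echo sudo a2dissite "$SITE"          # removes the sites-enabled symlink
echo sudo service apache2 restart
echo mysql2psql                      # reads mysql2psql.yml, writes mydb.sql
```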
Importing
Copy the file over to the new server and import it into PostgreSQL with psql.
sudo -u mydb psql mydb < mydb.sql
In retrospect, I should have just imported it directly with mysql2psql. I was initially hesitant because it involved creating a user that could access that machine from outside, but I later realized I needed that anyway.
Go live!
Now change the settings on the old server to use the PostgreSQL database as the backend, re-enable the site in Apache, and you’re all set to serve this site from PostgreSQL!
It’s been about 10 months since I started working at HasGeek, and it’s been an amazing time. I’ve been part of 4 amazing conferences, a workshop, and a bunch of Geekups. Among other things, I’ve written code, organized content, and edited videos. It’s probably the most intense job I’ve ever had.
When I joined HasGeek last year, I’d committed to a minimum of 6 months. After 10 months at HasGeek, I’m moving on. I’m very excited to announce that starting Oct 2, I’ll be working for the Open Knowledge Foundation as a Data Wrangler and Web Developer! I’m looking forward to working with the amazing folks at OKFN. As Sunil pointed out, I’m now in the non-profit sector 🙂