Author: nigelb

  • Hello Kobo!

    I was in the UK for 3 weeks recently, and I “made the mistake” of walking into a WHSmith bookstore. I still haven’t built up the self-control needed to walk into a bookstore and walk out without buying a book. In this instance, though, I actually did walk out without buying a book. Instead, I bought a Kobo Touch. I really had no intention of buying one when I walked into the store; I was only going to look at it. Eventually, though, a combination of a really friendly salesperson, reasonable pricing, and common sense won out, and I walked out with the Kobo Touch, a case, and a light. Common sense, because I’ve almost run out of space for physical books. It’s been a little over 3 weeks since I bought the Kobo, and I’ve read for a total of about 60 hours and finished 9 books. I think I might have a problem 😛

    Kobo Touch

    I downloaded a few books from Project Gutenberg, mostly classics that I wanted to re-read. I hate to admit it, but I ended up pirating a few books too, because some of them were simply priced too high. I could buy them from Amazon for a cheaper price, but they would be DRM’d. While most music is available to buy DRM-free, ebooks aren’t there yet. How I wish I could compare prices across multiple providers and buy from the one that’s cheapest. I think I can buy books from any provider that sells epubs with Adobe DRM, but I’ve yet to risk it. I’m sure over the next few months, I’ll buy at least one book each month. Definitely more often than I used to before I owned the Kobo, since buying is so easy.

  • Mozcamp – Day 0

    Hello from Mozcamp Asia! I’ve just gotten back from the welcome event at Mozcamp. It’s been great to meet friends I’ve talked to on IRC or met at last year’s Mozcamp, and to make new friends at this year’s!

    I was at The Hub this morning after checking in at the V-Hotel. Top priority today was setting up B2G on my laptop. I messed around with it for hours only to learn, to great frustration and disappointment, that B2G on the Linux desktop doesn’t work on Ubuntu 10.04. I know I can upgrade, but I really don’t want to do that in the middle of a conference. Well, looks like I’m not going to accomplish that Mozcamp Mission until I get home 🙁

    Later, I was at the Scape, the venue for the keynotes, where a bunch of us had volunteered to help organize the swag bags. It was great to work with Mozillians and do small things to help with the event 🙂 I was going to head over to The Hub right afterward, but then it started raining quite heavily! Of course, it’s Singapore! We’re going to hit this problem quite a few times over the next few days; it should be absolutely fun! (Although I hope nobody falls sick.) I went right back in and had a long and interesting chat with Mike Connor and Harold about the Social API.

    At 7, we had the Mozcamp registration and welcome party – an absolutely fabulous time that involved meeting lots of people, eating good food, meeting my Mozcamp buddy (Amy Tsay!), and, of course, the country fair. The country fair was a great way to chat with everyone and start putting names to faces. Unforgettable moment of the day: watching Foxeh and Mozillians dance to Gangnam Style.

    PS: I’m trying to write a blog entry for every day of Mozcamp. I may or may not be able to pull this off, but I’m definitely going to be trying!

  • A Month of Being a Remotee

    Since October, I’ve been a remote employee, working for the Open Knowledge Foundation. I was nervous about being a remotee, so I talked to a lot of my friends who’re remotees at Mozilla. Shout out to ashish, fox2mike, glob, and Unfocused for helping me out. I also enjoyed reading posts from people who wrote about what their team does, particularly RelEng at Mozilla – shout out to you guys as well! Also, The Oatmeal was right! Although, ironically, I’ve started to wake up unnaturally early after becoming a remote employee 😛

    The biggest fear about working from home was the distractions. The most important distraction-killer is a time tracker. We use toggl for timesheeting anyway, and turning off the time tracker when I’m distracted helps. After a few times of doing that, I automatically stop myself when I’m getting distracted. I also keep two Firefox profiles, one for work and one for everything else. While I’m working, the non-work profile stays closed, so I can’t get distracted. I reward myself with time to look at it when I finish 2 hours of work and take a short break.

    Having good communication channels is essential since we’re distributed. Every day, our team gets on a stand-up call. It’s great to actually hear everyone talk about their work and ask the team for help if they’re stuck. We also have a Campfire chat room and an IRC channel (#okfn on irc.freenode.net); they keep me sane. Seriously. Speaking of sanity, on some days the Campfire room is just a world of gifs; we’re awesome like that. There are also the weekly notebook posts to keep track of what folks in other teams are doing.

    Time’s flown by so fast; 10 days ago, I finished a month here! It’s been a fun and busy time!

    PS: If you want to work with me at OKFN, we’re hiring for a bunch of positions!

  • The Migration – Part I: Database

    This is a series of posts on migration from Apache and MySQL to Nginx+uwsgi and PostgreSQL. In this post, I’ll be detailing the steps we took to migrate the database from MySQL to PostgreSQL, with as little downtime as possible. Please leave a comment if you have suggestions!

    One-time Pre-migration Steps

    UTF-8

    The default encoding on PostgreSQL is SQL_ASCII, and you probably want UTF-8. If you don’t know what you want, you want UTF-8 (trust me). The easiest way was to blow away the default cluster and re-create it (thanks, jacobian!):

    sudo pg_dropcluster --stop 9.1 main
    sudo pg_createcluster --start -e UTF-8 9.1 main

    Make PostgreSQL listen on all interfaces

    Edit /etc/postgresql/9.1/main/postgresql.conf and ensure PostgreSQL is listening on all interfaces (restart PostgreSQL afterwards for the change to take effect).

    listen_addresses = '0.0.0.0'

    Allow access to PostgreSQL from the old server

    Edit /etc/postgresql/9.1/main/pg_hba.conf and add an entry for the old server (where 123.123.123.123 is the IP address of the old server), then reload PostgreSQL so the change takes effect.

    host    all             all             123.123.123.123/32       md5

    Install client libraries on the old server

    We use sqlalchemy for db access, so on the old server I had to apt-get install python-psycopg2.
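    With SQLAlchemy, the app code itself shouldn’t need to change for the switch; only the connection URL does. A minimal sketch (the usernames, passwords, and hostname below are hypothetical placeholders, not our actual settings):

    ```python
    # Hypothetical connection URLs; user, password, and host are placeholders.
    OLD_DB_URL = "mysql://appuser:secret@localhost/mydb"
    # After the migration, point SQLAlchemy at the new PostgreSQL box via
    # psycopg2 (installed above with python-psycopg2):
    NEW_DB_URL = "postgresql+psycopg2://appuser:secret@new-server/mydb"

    # Everything else stays the same, e.g.:
    #   engine = sqlalchemy.create_engine(NEW_DB_URL)
    ```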

    Creating Users and Databases

    Our process is to create a user for each app and have that app’s database be owned by this user. Here’s a script that automates creating the user and database:

    #!/bin/bash
    sudo -u postgres createuser -d -R -S "$1"
    sudo -u postgres createdb "$1" -O "$1"

    The move

    Import Preparation

    Create the user and database on the new server with the script above. Remember to set a password for this new user.

    Exporting

    The most worrisome bit about the whole migration was exporting the data from MySQL and importing it into PostgreSQL. We used mysql2psql and it didn’t give us much trouble, except for the bit where floats got a little messed up. My personal recommendation is to not use real, but numeric(7,4), with the precision adjusted to what you need (this particular definition is the one we use for our lat/long columns).
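    To illustrate why real is risky for coordinates, here’s a quick sketch in Python, where a 32-bit round-trip stands in for PostgreSQL’s real type (the coordinate is just an example value):

    ```python
    import struct

    # PostgreSQL's real is a 32-bit binary float; most decimal fractions
    # (like the .9716 below) have no exact binary representation, so the
    # stored value drifts. Simulate the round-trip with a 32-bit pack.
    lat = 12.9716
    as_real = struct.unpack("f", struct.pack("f", lat))[0]

    print(as_real == lat)              # False: the value came back changed
    print(abs(as_real - lat) < 1e-4)   # True: the drift is in the trailing digits
    # numeric(7,4) instead stores the four decimal places exactly.
    ```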

    First, run mysql2psql on your command line; this will create the config file.

    Now edit the mysql2psql.yml file and add your appropriate entries. Here’s what ours looked like:

    mysql:
      hostname: localhost
      port: 3306
      socket: /var/run/mysqld/mysqld.sock
      username: mysuperuser
      password: mypassword
      database: mydb

    destination:
      # if file is given, output goes to file, else postgres
      file: mydb.sql
      postgres:
        hostname: localhost
        port: 5432
        username: mysql2psql
        password:
        database: mysql2psql_test

    # if tables is given, only the listed tables will be converted.  leave empty to convert all tables.
    #tables:
    #- table1
    #- table2

    # if exclude_tables is given, exclude the listed tables from the conversion.
    #exclude_tables:
    #- table3
    #- table4

    # if supress_data is true, only the schema definition will be exported/migrated, and not the data
    supress_data: false

    # if supress_ddl is true, only the data will be exported/imported, and not the schema
    supress_ddl: false

    # if force_truncate is true, forces a table truncate before table loading
    force_truncate: false

    When you run mysql2psql again, it will export the database mydb into mydb.sql. Before we did that, we removed this particular site from /etc/apache2/sites-enabled and restarted Apache; we didn’t want the SQL file to go stale as soon as it was exported. This is where the downtime starts.

    Importing

    Copy the file over to the new server and import it into PostgreSQL with psql.

    sudo -u mydb psql mydb < mydb.sql

    In retrospect, I should have just imported it directly with mysql2psql. I was initially hesitant because it involved creating a user that could access that machine from outside. But I later realized I needed it anyway.

    Go live!

    Now change the settings on the old server to use the PostgreSQL database as the backend, re-enable the site in Apache, and you’re all set to serve this site from PostgreSQL!

  • Moving On

    It’s been about 10 months since I started working at HasGeek, and it’s been an amazing few months. I’ve been part of 4 amazing conferences, a workshop, and a bunch of Geekups. Among other things, I’ve written code, organized content, and edited videos. It’s probably the most intense job I’ve ever had.

    When I joined HasGeek last year, I’d committed to a minimum of 6 months. After 10 months at HasGeek, I’m moving on. I’m very excited to announce that starting Oct 2, I’ll be working for the Open Knowledge Foundation as a Data Wrangler and Web Developer! I’m looking forward to working with the amazing folks at OKFN. As Sunil pointed out, I’m now in the non-profit sector 🙂