Category Archives: tutorial

Results of WordPress Ninja Forms entries as JSON

Ninja Forms is a rather nifty WordPress plugin for forms. The main problem I have with it at the moment is that it's rather a mess in terms of data structure and getting data out of it. You can grab a CSV file, but that doesn't really help you if you want to build a nice front-end.

So here's a snippet that dumps your current form results as JSON, in a way that lets you display them with AngularJS or similar.
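A minimal sketch of such a snippet, assuming Ninja Forms 3, which stores submissions as an nf_sub post type with field values in _field_{id} post meta (the ?nf_json query parameter is made up for this example – adjust everything to your version and setup):

```php
<?php
/*
 * Illustrative sketch only: visiting /?nf_json=<form_id> dumps that form's
 * submissions as JSON. Assumes Ninja Forms 3 storage (nf_sub post type,
 * _form_id and _field_{id} post meta); drop into a theme's functions.php.
 */
add_action( 'init', function () {
    if ( ! isset( $_GET['nf_json'] ) ) {
        return;
    }
    $subs = get_posts( array(
        'post_type'   => 'nf_sub',
        'post_status' => 'any',
        'numberposts' => -1,
        'meta_key'    => '_form_id',
        'meta_value'  => (int) $_GET['nf_json'],
    ) );
    $out = array();
    foreach ( $subs as $sub ) {
        $entry = array( 'id' => $sub->ID, 'date' => $sub->post_date );
        // Collect only the _field_* meta keys that hold submitted values.
        foreach ( get_post_meta( $sub->ID ) as $key => $values ) {
            if ( 0 === strpos( $key, '_field_' ) ) {
                $entry[ $key ] = $values[0];
            }
        }
        $out[] = $entry;
    }
    wp_send_json( $out ); // sets the JSON Content-Type header and exits
} );
```

From there an AngularJS app can just $http.get the URL and iterate over the entries.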

How to organise and synchronise production WordPress with local development environment

In the last year I've either deployed or inherited about 10 new WordPress installations, and managing them became a mess that quickly ate too much of my time. It seems quite a few of my friends have the same problem – so here's a quick overview of how to approach it.

Everything I describe here definitely works on OS X or Linux, and probably on Windows, as the tools are all either PHP or Python based.

Keeping up with updates

Clients don't update their plugins or WordPress itself, and when they do, they won't read the changelogs carefully enough to judge whether an upgrade would break something. I use InfiniteWP for this. It's a standalone PHP installation that connects to your WordPress sites via the InfiniteWP Client plugin. It's free, with some commercial add-ons. You can set it up to email you when there are new updates, and it supports remote backups of your sites, which will be useful in later stages.

From a security standpoint it's definitely not optimal, but at the moment not updating seems the greater risk.


Local development environment

For each client's site, I have a local copy running on my computer. Depending on your preferences, you might use something like MAMP or XAMPP, which package MySQL, PHP and the Apache server together. One thing to watch out for is running your local development under the same major version of PHP as the server, since a mismatch is often a source of bugs (my local PHP would support newer syntax than the one on the server).

For each site, I have a local alias – http://sitename.local/ – to ensure that I don't accidentally change things on production.
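On OS X or Linux, such an alias is typically just a line in /etc/hosts (a sketch – sitename.local is whatever alias you pick):

```
127.0.0.1   sitename.local
```

The MAMP/XAMPP virtual host config then maps that hostname to the site's document root.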

The things I develop myself – usually a theme and an extra plugin – I store in git to keep revision history and feature branches.

I have yet to find a good way to version plugins, so for now the tactic is to keep up with the latest versions, use as few plugins as possible, and use only ones from developers that have release blogs and sane release practices.

Synchronising production to local environment (manually)

Sometimes I don't have shell access to the server – in that case I use either InfiniteWP to generate a database dump (from the InfiniteWP dashboard) or UpdraftPlus from within the WordPress dashboard.

Locally, I would then use wp-cli to reset local database:
wp db reset
and import new database:
wp db import sitename_db.sql

wp-cli can search-and-replace local paths and URLs in the database, but it's usually not needed. What I do instead is modify my local wp-config.php to have:

define('WP_HOME', 'http://sitename.local');
define('WP_SITEURL', 'http://sitename.local');
This lets me use a copy of the production database without WordPress redirecting my logins to the production URL.
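For reference, the wp-cli substitution route is a one-liner (the domain names here are placeholders for your own):

```shell
# Rewrite production URLs to the local alias across the whole database.
wp search-replace 'http://sitename.com' 'http://sitename.local'
```

It handles serialized PHP values in the options table, which a raw SQL replace would corrupt.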

For the contents of wp-content/uploads I usually don't bother, as I can easily fix things without seeing the images in the last few blog posts.

Synchronising production to local environment (automated)

For the sites where I have shell access and can install wp-cli on the server, I have Ansible scripts (more on that later) that run:
wp db dump
on the server and then copy the dump to my dev environment, where they import it using the wp db reset and wp db import combination.

This means that I can sync production to my local environment in less than a minute, making it a no-brainer to test and tweak things locally and not on production.

Applying changes to production

For themes and custom plugins on sites where I only have FTP access, I'm using git-ftp, which allows me to push to an FTP server using git ftp push. It keeps track of which revision is on the server and uploads only the difference. It does mean that you never change things on the server directly, but always go through committing to git first (which I consider a good thing).
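The setup, assuming a standard git-ftp install (the server URL and username are placeholders), looks roughly like:

```shell
# Store the FTP target in git config, once per repository.
git config git-ftp.url ftp://ftp.example.com/public_html/wp-content/themes/themename
git config git-ftp.user username
# First run: uploads everything and records the pushed commit on the server.
git ftp init
# Later runs: upload only the files changed since the recorded commit.
git ftp push
```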

For environments with shell access, you can just ssh in and use git on the other side to pull in changes. It works, but it's a couple of additional steps.

Lately, I've been automating these tasks with Ansible playbooks, which allow me to have simple scripts like:

- hosts: server1
  sudo: no
  tasks:
    - name: update theme
      git: repo=git@server:themename.git dest=/home/username/sitename/wp-content/themes/themename

or to grab a database dump:

- hosts: server
  tasks:
    - name: wp db dump
      command: /home/username/.wp-cli/bin/wp db dump /home/username/tmp/sitename.sql chdir=/home/username/sitename
    - name: copy db to ~/dbdumps/
      local_action: command scp servername:tmp/sitename.sql /home/username/dbdumps/sitename.sql
      sudo: no

This can then be easily extended, or a separate playbook file can drop the local database and import the new copy. To run these playbooks you just use ansible-playbook dbdump.yml and similar, and you get a full report of what's happening.
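Such a companion playbook for the local side might look like this (a sketch – the file name, paths and the --yes flag to skip wp db reset's confirmation prompt are illustrative):

```yaml
# dbimport.yml - drop the local database and load the fresh production dump
- hosts: 127.0.0.1
  connection: local
  tasks:
    - name: reset local database
      command: wp db reset --yes chdir=/home/username/dev/sitename
    - name: import production dump
      command: wp db import /home/username/dbdumps/sitename.sql chdir=/home/username/dev/sitename
```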

For bigger and more complex setups you would extend this to support rollbacks and different revision models, but that's beyond the scope of my current WordPress projects.


Scripting these tasks always seemed like something not worth doing, as they were just a couple of shell commands or clicks away. But as the number of projects grew, it became annoying and much harder to remember the specifics of each server setup – passwords, phpMyAdmin locations and similar.

With things fully scripted, I can now get a request from a client, automatically sync whatever state their WordPress is in at the moment in just a minute, and see why the theme broke on a specific article. It saves me a crazy amount of time.

At the moment I'm trying to script anything that I see myself typing into a shell more than 3 times, and so far it has been worth it every time, as these scripts suddenly become reusable across different projects.

10 Basic tools of social media presence for organizations

Recently, I've been thinking about what it takes to build the social media presence of an organization. More specifically, I'm looking at Kiberpipa, a small non-profit that organizes events and provides a semi-public space for open source creativity. In a more generalized fashion, this can apply to any startup that wants a presence in the modern Web 2.0 space.

Facebook Model
Image by Bryan Veloso via Flickr

1. Official blog. It doesn't matter if it's hosted or run elsewhere on some blogspot subdomain. Make it easy for your fans to follow your human voice. You can also just post cute pictures, like the Flickr blog does.

2. Facebook fan page/group. So you can count your followers easily and notify them en masse of events and happenings around your brand. I like the way the Kiberpipa Facebook group is actively managed in this regard, always inviting people to events.

3. Twitter account. Just a quick and lazy way for someone to follow news about your company, to be able to mention @brandname, and, if you want, a way for you to officially reply. I love the way Six Apart highlights changes in their services and promotes their users.

Got Your's Yet?
Image by Chandra Marsono via Flickr

4. Flickr group and official account. Make all the pictures in there Creative Commons or public domain, to make it easy for bloggers and others to build on top of your brand and spread the good word. A great example of Flickr search results is the startup Spotify. It took me 2 seconds to find a cool photo to go with my blog post.

5. Vimeo or YouTube channel. Make it easy for others to embed and, once again, blog and share your video material. If you're a boring traditional Web 2.0 startup, you have at least some screencast to show. The Mozilla Firefox Vimeo channel is definitely going in that direction, as is Brightkite.

6. Dopplr Group. Even though they're still in early beta, you should make sure that everyone in your organization who ever travels is in your group. Then you can stick a widget of your presence on your blog and have the cheapest way to announce your presence in a certain part of the world, the greater idea being that this means more potential feedback opportunities or potential business-partner meetings. No examples yet, but it should happen in a few weeks as they roll out the service in full.
Image by roland via Flickr

7. Upcoming listing. If you are organizing an event, a party or just a semi-ad-hoc meetup, you have to add it there. Often people figure out what's happening in a city on the same day or the day before, so having an event listed in a group like Web 2.0 geeks should bring a few more visitors. The startups and event organizers to imitate are the ones in Web Conference Junkies and similar groups (at least for London and San Francisco).

8. Get Satisfaction for support forums, if you run some kind of service or make a product. For a while now, we've been collectively teaching users how to get support there, so it makes sense for you to piggyback on this. If you don't, they'll create a group for your product anyway, and you'll just be forced to go there. A classic example of something wonderful coming from this is the Timbuk2 group on GS, which actually prompted them to create a diaper bag after enough people expressed interest.

Ma.gnolia t-shirt
Image by Damien Tanner via Flickr

9. Ma.gnolia or similar for social bookmarking. As you get more popular and your team grows, you'll want an easy way to track buzz and pull it into your web page. Having everyone on the team add mentions under pre-arranged tags into a group makes for a great collaborative buzz-tracking experience. Examples? OpenDNS blog buzz and Zemanta on Ma.gnolia. For more details on this technique, see my post – Online buzz management for startups.

My MOO mini cards
Image by Robyn Gallagher via Flickr

10. Moo cards. You're not a proper social-media-enabled company without these signature cards. The ones who have them will instantly connect with you, while you can enlighten the others.

11. Whatever works for you! Your community might be more into something other than Upcoming – go with that too. In Slovenia I'd go for Koornk on top of Twitter, while in the UK I'd definitely consider Bebo vs. MySpace if I were in the music space with younger Generation Y members.


3 easy ways to do online interviews

Image by hz536n via Flickr

Gathering qualitative data, like interviews, presents an interesting alternative to classical online surveys. While a survey can be set up online and left running for a long time without strict supervision, in an interview setting the real-time component is really important (unless you want to do a somewhat longer interview over e-mail).


Going the voice way


Our first option is to emulate a phone interview using one of the IP/internet-based VoIP alternatives. The most well-known is certainly Skype, a freeware multi-platform voice/video/chat system. It should be fairly easy to set up an interview session using this medium. You can even call participants on a land-line or mobile phone and use one of the popular Skype recording utilities to make a digital copy of the interview.

Skype Limited

Image via Wikipedia

Afterwards, you should use a dedicated transcribing tool to easily make a transcript of your session. Do note that you really want a specialized utility for transcribing, since it allows you to slow the recording down to the speed of your typing, pause and rewind easily with keyboard shortcuts, and a lot more.


Having a chat


The other alternative to voice (or possibly video) is asynchronous chat. While it isn't as involved as interactive voice, it puts the interviewee more at ease, since they can take time to think about their answers, have someone else in the room, etc. The same luxury is also present on your side of the interview.

The tools of the trade are usually chat applications like Microsoft Messenger, Google Talk, Skype, etc. The issue here is that connecting via these tools can be a serious invasion of the other user's privacy (as you now know their screen name). It also means the person will be online at the time, making it much more likely that someone else will want to chat with her. It also turns out that a lot of people online still don't fully understand instant messaging, so this might lead to a series of confusions and problems (especially if the person being interviewed is not the most technically skilled).

My proposed solution is to use a tool like Campfire from 37signals to set up a web-based chat room, into which the interview participants are then invited. This is also a good tactic for online focus groups, as it allows you to easily connect a number of people without worrying about their different screen names and incompatible applications.

There are a number of free alternatives that could be used to set up a Campfire-like environment; one way would be to use Moodle with a chat module, where you would treat participants as students and conduct chat sessions with them.


Online worlds


Second Life
Image via Wikipedia

Second Life is today the de-facto standard virtual world. If you have the luxury of placing your participants there, you can also study their gestures and general body language while talking or chatting with them. The only problem is that it's quite a complex environment, and all participants need to know beforehand how to navigate it.

An easier (but less powerful) alternative could be Google's 3D world called Lively, which allows a bit more freedom than a traditional chat room while still having controls limited enough to be within reach of technically less savvy participants.

While conducting interviews in online worlds looks like a great idea at the start, it usually turns out that you've severely limited the number of people who can participate because of hardware requirements, internet speed and technical knowledge. Depending on the type of research you do, this can present a serious setback once you start your work.


General observations


Doing online interviews is a demanding task that will fully occupy you. You should expect that each interview (especially an unstructured one) could easily take a few hours of involved typing. Afterwards, you should also take time to go through the interview, adding notes and studying responses, so you'll have an easier time analyzing it later.

Also, don't forget to take time in the beginning to do a full dry-run interview with someone you know and trust, so you can learn the technology and get a general idea of things that could go wrong.

I mentioned just a few popular applications for these tasks. There are numerous alternatives, each with their own specifics, but in this context I felt discussing them would add noise to the post.
