Importing a data snapshot

When you get your own instance of the OpenHatch code running, you'll discover it's missing the data from the main OpenHatch site. To fill that gap, we take periodic snapshots of the data on openhatch.org and publish them.

Where you can find the snapshots

 * http://inside.openhatch.org/snapshots/

Privacy implications
We discuss the privacy implications of these snapshots in the privacy policy document. We encourage people to read the privacy policy when they create their accounts.

How to use a snapshot
To load a snapshot into your database, run the following commands from your local install.

Note: You must run "syncdb" and "migrate" before this will work; read README.mkd to learn more about those commands. Loading the snapshot should take less than one minute.

Download a snapshot, uncompress it, and load it:

gunzip snapshot.json.gz
python manage.py loaddata snapshot.json

You'll see output that looks something like this:

Installing json fixture 'snapshot' from absolute path.
Installed 25288 object(s) from 1 fixture(s)

You can test that it worked by loading your local people page alongside the live one. Do you have about the same number of people? Click on these links:
 * http://openhatch.org/people/
 * http://127.0.0.1:8000/people/
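If you'd rather check programmatically, a Django JSON fixture is simply a list of serialized objects, so you can count how many objects loaddata will install. A minimal sketch (the filename snapshot.json comes from the steps above; `count_fixture_objects` is a helper name made up for this example):

```python
import json

def count_fixture_objects(path):
    """Count the serialized objects in a Django JSON fixture.

    A JSON fixture is a list of dicts, each with "model",
    "pk", and "fields" keys.
    """
    with open(path) as f:
        objects = json.load(f)
    return len(objects)

# Example: compare this with the "Installed ... object(s)" line
# that loaddata printed.
# count_fixture_objects('snapshot.json')
```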

More about this

 * We go through some effort to remove private information before we publish user data in these snapshots. The code for that is in snapshot_public_data.py.


 * We are now creating these snapshots once a day.


 * We don't snapshot every single table. If you find there's something that we don't publish that we should, do file a bug!


 * Known issue: If your MySQL database isn't set up for Unicode, you may see an error like this: "Incorrect string value: '\xC8\x9B' for column 'first_name' at row 1". To fix it, re-create your database as described in the README.mkd file. (If you need help, come find us on #openhatch.)


 * How it works, on the servers: On linode2.openhatch.org, a cron job wakes up daily and runs mysite/scripts/snapshot_then_push.sh
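To make the scrubbing step concrete, here is the general shape of what code like snapshot_public_data.py does to fixture data before publishing. This is an illustrative sketch only, not the actual code: the field names in PRIVATE_FIELDS and the sanitize_fixture helper are hypothetical, and the real script has its own rules for which tables and fields to strip.

```python
# Hypothetical list of fields to blank out before publishing a snapshot;
# the real snapshot_public_data.py decides this per table.
PRIVATE_FIELDS = ('email', 'password')

def sanitize_fixture(objects):
    """Blank private fields in a list of serialized fixture objects.

    Each object is a dict with "model", "pk", and "fields" keys,
    as produced by Django's JSON serializer.
    """
    for obj in objects:
        fields = obj.get('fields', {})
        for name in PRIVATE_FIELDS:
            if name in fields:
                fields[name] = ''  # drop the private value, keep the key
    return objects
```

The point of blanking rather than deleting keys is that the fixture still matches the model schema, so loaddata can install it unchanged.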