I currently FTP all my files to my website whenever I do an update (over a slowish ADSL connection), and I want to make things easier. I recently started using a hosted SVN service, and I thought I could speed things up by doing an svn export of my website directly onto my webserver.

I have tried that a few times and it seems to work OK, but it fetches the entire site every time, which is a bit slow for a one-file update.

So my questions are:

1. Is it possible to do an export and only get the changes since the last export? (And how would that handle deleted files?)
2. Or would it be easier to do an svn checkout and then svn update every time, instead of svn export, and just hide the .svn folders using Apache .htaccess rules?

Is this a good idea, or is there a better way to publish my website? I am trying to achieve the one-click-deploy ideal. Maybe there are some gotchas I haven't thought of that someone else has run into.
I would do an svn checkout, and have done so successfully on a live site for a number of years. You should add
mod_rewrite rules to return 404 for the .svn directories (and their files), though.
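For reference, a sketch of such rules in an .htaccess file (assuming mod_rewrite is enabled; the pattern is illustrative, so test it against your own setup):

```apache
RewriteEngine On
# Return a 404 for any request that touches a .svn directory or anything inside it
RewriteRule (^|/)\.svn(/|$) - [R=404,L]
```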
This is what I'm doing on my host. For every project I have a structure that looks more or less like this:

    ~/projects/project-name      (SVN checkout)
    ~/public_html/project-name   (svn export)

The first directory is a checkout from SVN, while the second one is just an svn export.
I have a small bash script:

```shell
#!/bin/bash
SOURCE="$HOME/projects/"
TARGET="$HOME/public_html/"

for x in $(ls "$SOURCE")
do
    if [ -d "$SOURCE$x" ]; then
        svn update "$SOURCE$x"
        svn export --force "$SOURCE$x" "$TARGET$x"
    fi
done
```
The export is done from the working copy, so it's very fast.
It might not be exactly the answer you are looking for, but if you have SSH access to your webserver (that depends on your hosting service; some "low cost" hosts don't provide it), you can use rsync to synchronise the remote site with what you have on your disk.

In the past, I used something like the idea you are describing (fetching the svn log between the last revision pushed to production and HEAD, analysing every line, and finally calculating what to send to the server), but it was not really a great process; I now use rsync and like it way better.

(Here too, you will have to exclude the .svn directories, by the way.)
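In case it helps, here is a minimal local demonstration of the rsync flags involved. The paths are temporary stand-ins I made up for the demo; in real use the destination would be a remote target such as user@yourhost:public_html/ (hypothetical).

```shell
#!/bin/sh
# Local stand-ins for the working copy and the remote web root
SRC=$(mktemp -d)
DEST=$(mktemp -d)

mkdir -p "$SRC/.svn"                     # SVN metadata we do NOT want on the server
echo '<html>new</html>' > "$SRC/index.html"
echo 'stale' > "$DEST/removed.html"      # a file that no longer exists locally

# -a: preserve permissions/times, -z: compress (useful over ADSL),
# --delete: remove files that are gone from SRC,
# --exclude: keep the .svn directories off the server.
# For a real deploy, DEST would be something like user@yourhost:public_html/
rsync -az --delete --exclude='.svn' "$SRC/" "$DEST/"

ls "$DEST"    # only index.html remains
```

The --delete flag is what answers the "how will this handle deleted files?" part of the question: anything removed locally disappears from the server on the next sync.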
You can just accept having .svn directories in your website (generally not a problem, especially if you configure the server not to permit access to them) – this is the easy option. Alternatively, do what RaYell does and keep two copies of your website on the webserver: one normal checkout outside the web directory, and one inside it. When you update, simply export from the checkout (effectively a copy with the .svn dirs deleted) into the web directory. You should first delete the old files if you want to avoid files that have been removed from SVN lingering on your website.
I do something like this, using robocopy set to mirror the svn checkout while excluding .svn directories, and get both the export and the old-file deletion in one step, thus minimizing downtime if the copy takes long. I’m sure this is easy on unix too, if that’s your hosting environment. For example, you can use a local rsync: http://blog.gilluminate.com/2006/12/12/yes-you-can-rsync-between-two-local-directories/
Old topic, but since this is what came up in Google during my research, I thought I would add to it. I recommend doing an export to the site instead of checking out to it; I like to keep the repositories and the sites separate. I also don't recommend exporting the entire repo to the site each time, especially if only a few files change at a time. What you can do instead is run a diff on the repo to see what's changed from one release to another, and only export those files. More info at: http://www.joeyrivera.com/2011/automate-svn-export-to-site-w-bash-script/
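As a rough sketch of that approach (not the exact script from the linked post): `svn diff --summarize -r LAST:HEAD` prints one "STATUS path" line per changed item, and you can act on just those paths. The demo below feeds the loop a hand-made sample listing instead of real svn output, and all directories are temporary stand-ins.

```shell
#!/bin/sh
# Stand-ins for the working copy and the web root
WC=$(mktemp -d)
WEBROOT=$(mktemp -d)
mkdir -p "$WC/css"
echo 'body {}' > "$WC/css/site.css"
echo '<html>'  > "$WC/index.html"

# In real use the listing would come from:
#   svn diff --summarize -r "$LAST_DEPLOYED:HEAD" "$WC"
# which prints lines like "M       path". The printf below mimics that output.
printf 'M       %s\nA       %s\nD       %s\n' \
    "$WC/index.html" "$WC/css/site.css" "$WC/old-page.html" |
while read -r status path
do
    rel=${path#"$WC"/}
    case "$status" in
        A|M)  mkdir -p "$WEBROOT/$(dirname "$rel")"
              cp "$path" "$WEBROOT/$rel" ;;   # push added/modified files
        D)    rm -f "$WEBROOT/$rel" ;;        # drop deleted files
    esac
done

ls "$WEBROOT"    # css  index.html
```

A real script would also need to record the last deployed revision somewhere (a file on the server, say) so the next run knows which range to diff.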