How to have a long itemize or enumerate split over many pages

I think it is time I start blogging about all the LaTeX stuff I discover all the time. This one was a little tricky: I had a three-page-long enumeration inside a table. So in LaTeX:
\begin{longtable}{|p{\textwidth}|} \hline
\begin{enumerate}
\item ...
% ... repeated some 500 times ...
\end{enumerate} \\ \hline
\end{longtable}
But of course the enumeration could not be split across pages, so it just ran off the bottom. What to do? After some thinking I came up with this bodge. If you know a nicer way of doing this, please tell me.
I defined a new command \breaktable, which I now insert every time I reach a sort of logical end in the enumeration. It saves the current counter, stops the enumeration, adds a new table cell and restarts the enumeration with the correct value again. You have to define the command somewhere in the preamble. Works quite nicely.
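The original definition did not make it into this post, but based on the description above a sketch could look like this (the counter name savedenum is my choice, and it assumes a first-level list using the enumi counter):

```latex
\newcounter{savedenum}
\newcommand{\breaktable}{%
  \setcounter{savedenum}{\value{enumi}}% remember where we are
  \end{enumerate} \\ \hline            % close the list and the current cell
  \begin{enumerate}                    % new cell, new list
  \setcounter{enumi}{\value{savedenum}}% resume numbering from the saved value
}
```

Since longtable can break between rows, every \breaktable gives it a legal place to break the page.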

Redtube title analyzer

If you are part of my family, a lecturer or a future employer, do not continue to read!!!
The following content is for amusement purposes only and I am not responsible for anything. Do NOT take it seriously!!!

So after this warning I can assume you are a freak like me and the people I live with. I live in a geek house with five men, and obviously we sometimes talk about porn. One of our favorite sites is Redtube (setting it as start page and so on). At some stage we had a discussion about what the most used term in the titles of this fine video material would be. Sitting down, I thought that my computer could easily find this out for me. So I wrote a little script:
#!/usr/bin/ruby
require "open-uri"
require "rubygems"
require "hpricot"

counter = 1
begin
  # NOTE: the URL got lost when this post was mangled; it was something
  # like the following (the page parameter is a guess)
  page = open("http://www.redtube.com/?page=#{counter}")
  # open-uri returns a Tempfile for big responses, a StringIO for small ones
  if page.kind_of? Tempfile
    ps = page.read
  else
    ps = page.string
  end
  doc = Hpricot.parse(ps)
  # every title is a link with class "s"
  (doc/"a.s").each do |link|
    link.inner_html.downcase.split.each do |word|
      puts word
    end
  end
  counter = counter + 1
end while ps.index('No Videos found') == nil

This will just scan through all the pages, print every single word of every title, and exit when there are no pages left. I know you could optimize this, and you could write a shell script to do it, but bear with me. This returned a list of 39665 words out of 490 pages of titles. But the raw list is not really interesting; we want to count the words. Here are the top 10: the first column is the number of repetitions, the second the word.
574 2
596 gets
615 with
620 the
641 fucked
726 girl
762 and
798 her
877 hot
1059 in
Who would have guessed that 'in' would be the most used word, and that '2' is used so often? Everyone I asked assumed it would be some rude word.
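The counting step itself only takes a few lines of Ruby; this is a sketch of one way to do it (count_words and the tiny sample list are made up for illustration, in practice you would feed it the output of the script above):

```ruby
#!/usr/bin/ruby
# Count how often each word occurs and sort by frequency.
def count_words(words)
  counts = Hash.new(0)
  words.each { |w| counts[w.downcase] += 1 }
  counts.sort_by { |_, n| -n } # most frequent first
end

# Tiny made-up sample list:
count_words(%w[hot girl hot in in in]).first(2).each do |word, n|
  puts "#{n} #{word}" # prints "3 in" then "2 hot"
end
```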
Here is a little graph of all the words by repetition. 

You can clearly see that there are loads of words that only show up once, and then there are a few words that are repeated quite a lot. I suppose you could analyze this far more and find out why exactly these words are repeated so many times.

P.S. If someone can offer me hosting space I am more than happy to publish all the files; I just don't want to upload them to Google or my uni server as they contain quite rude words ;)

Change SSH port

The server I mainly work on has been the target of many port scan attacks. Furthermore, as soon as these script kiddies notice that the SSH port is open they try all sorts of username and password combos. This is quite annoying as it clogs up the log files, and if something serious happens you might not see it among all the clutter. So we (the server users) decided to put SSH on some weird port. Now I had the problem that all my scripts and my svn repositories were relying on SSH being on the standard port. Changing this on all my machines would have been a massive pain. Then I remembered that you can specify this kind of information in ~/.ssh/config.
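For reference, the server-side change happens in sshd_config (54321 here is just a stand-in for whatever weird port you pick):

```
# /etc/ssh/sshd_config on the server
Port 54321
```

Restart sshd afterwards so it listens on the new port.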
And here we go, this is all you need on the client side. It is best to scope the setting to a Host block so it only applies to that one server (the host name below is a placeholder for your server's):
$ cat ~/.ssh/config
Host myserver.example.com
    Port 54321
As my home directory is synced between all my machines, this change made the server available over the weird port, but to all the programs it still looks as if it were the standard one.