Foxrow was down for a while due to some issues with the old web host. It’s on a new and improved hosting service now, so there shouldn’t be any more problems.
The bracket is out, and once Virginia was announced as the final 1-seed, I realized Wisconsin may be in a unique position. The Badgers beat both Florida and Virginia during the regular season, and could beat both Arizona and Wichita State in the tournament. If they do, they will have beaten all four 1-seeds this season. I had to find out if anyone ever has. There have been 116 1-seeds since the field expanded to 64 in 1985. Naturally, the 1-seeds themselves are excluded, by virtue of being unable to lose to themselves.
It turns out only 2 teams have had 4 wins against 1-seeds in the same year, and they’re both Arizona squads. In 1997, the Wildcats beat North Carolina twice, Kansas, and Kentucky en route to the national championship. They never played Minnesota, the other 1-seed. In an even crazier 2001 season, they came within a game of beating all four: they split 2 regular-season games against Illinois before beating them in the tournament, split regular-season games against Stanford, and beat Michigan State in the Final Four. The last 1-seed was Duke, who they lost to in the national championship game. They played 7 games against the 1-seeds that year!
There have been 7 teams with 3 wins against 1-seeds (what was it with the Wildcats those 4 years?):
1992 Southern California
The distribution falls off rapidly after that: there have been 64 teams with 2 wins against 1-seeds and 352 with 1 win.
The all-time leaders contain no surprises. Since most 1-seeds come from the historically “power” conferences, being in one of those conferences provides more opportunities over the course of a season to play them. Going deep in the tournament doesn’t hurt either.
On the flip side, Wake Forest has the dubious honor of most losses to 1-seeds in a year. In 2002 they also played 7 games against the eventual 1-seeds. They went 0-7, losing to Cincinnati and Kansas once each, Maryland twice, and Duke 3 times. There have been 16 5-loss teams and 73 4-loss teams. In terms of all-time futility, NC State takes the cake. Since 1985, they have a paltry .114 winning percentage against 1-seeds:
So if the Badgers pull it off, they’ll be the first in the 64+ team era to do so.
Stats courtesy Sports-Reference.
It’s been a while since I’ve posted any updates for pypeline. I’ve recently been getting familiar with OpenCV, which has a great feature-recognition API. I’ve ditched PIL/Pillow in favor of OpenCV, and it’s looking promising. I was also getting sick of dealing with the .NEF files output by my camera, so I’ve switched over to using Dave Coffin’s dcraw, which is great for converting just about any RAW filetype into TIFF. The code has undergone substantial changes; it’s still publicly available on GitHub. The only downside is that OpenCV is currently only compatible with Python 2, so I guess py3k is out for now. Now I just need to get some decent photos so I can start working with them…
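The dcraw handoff can be as simple as shelling out and picking up the TIFF dcraw writes next to the input file. A minimal sketch (assuming dcraw is on the PATH; the helper names here are hypothetical, not pypeline’s actual API):

```python
import os
import subprocess


def tiff_path_for(raw_path):
    # dcraw names its output after the input file, with a .tiff extension
    return os.path.splitext(raw_path)[0] + ".tiff"


def nef_to_tiff(raw_path):
    """Convert a RAW file (e.g. .NEF) to TIFF via dcraw.
    The -T flag tells dcraw to write a TIFF instead of a PPM."""
    subprocess.check_call(["dcraw", "-T", raw_path])
    return tiff_path_for(raw_path)
```

OpenCV’s `cv2.imread` can then load the resulting TIFF directly.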
For the fun of it, I made a Twitter bot. It searches for usages of the phrase “in a world where” and combines them. The results are occasionally comedic, poignant, and nonsensical.
In a world where you can be anything, nothing is in your control
— In A World Where (@TweetInAWorld) January 13, 2014
In a world where there is no drama, my cat pictures only gets five likes
— In A World Where (@TweetInAWorld) January 12, 2014
In a world where everyone hates me, everyone does this one thing
— In A World Where (@TweetInAWorld) December 28, 2013
I’ve found that tweets occasionally get parsed incorrectly, and a few phrases keep getting repeated; apparently the pool of “in a world” tweets isn’t all that large. It’s a little hit-or-miss, but it turns up good ones now and then. I built it with Python and tweepy. The tweepy API made it easy, and the learning experience was fun. You can follow it at @TweetInAWorld.
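The combining step can be sketched as taking the setup from one tweet and the payoff from another, splitting at the first comma after the phrase. This is a hypothetical reconstruction of the parsing, not the bot’s actual code (and its real rules clearly differ, given the occasional mis-parse):

```python
PHRASE = "in a world where"


def setup_of(text):
    """Everything from 'in a world where' up to the first comma."""
    i = text.lower().find(PHRASE)
    if i < 0:
        return None
    return text[i:].split(",", 1)[0].strip()


def punchline_of(text):
    """Everything after the first comma that follows the phrase."""
    i = text.lower().find(PHRASE)
    if i < 0 or "," not in text[i:]:
        return None
    return text[i:].split(",", 1)[1].strip()


def mash_tweets(tweet_a, tweet_b):
    """Combine the setup of one tweet with the punchline of another."""
    setup = setup_of(tweet_a)
    punch = punchline_of(tweet_b)
    if setup and punch:
        return setup + ", " + punch
    return None
```

Applied to two of the tweets above, `mash_tweets` would produce something like “In a world where you can be anything, everyone does this one thing”.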
I’ve pushed some changes to the pypeline repository, adding basic stacking functionality. There isn’t any registration; it just takes the median of each channel (R, G, B) for each pixel. It’s currently way slow, but I suspect there is substantial room for improvement there. Wrangling NEF files has proven more difficult than I anticipated, so currently the state of the art in pypeline is JPGs.
My camera is rated down to 32°F, and nightly lows have been around 0°F, so I’m scared to take it out into the elements. On the plus side, the stacking works with regular images too! Any particular pixel just needs to have the “right” value in at least half the shots.
There is a little ghosting, but quite good considering, I’d say. I’m not sure how to get rid of that entirely. More pictures should quash the error, but even at 5 I would have thought it would wipe out any trace of the marker. Also, a better algorithm should be able to relax the >50% requirement to only a plurality, maybe with some sort of clustering of values? I’m also taking the median of each channel independently; maybe a better way would be to use luminance. In any case, baby steps!
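The per-pixel median described above can be sketched in plain Python. This is a toy version, not pypeline’s actual (and much slower, array-based) code; it assumes equally sized frames and no registration:

```python
from statistics import median


def median_stack(frames):
    """Median-combine a list of equally sized RGB frames.
    Each frame is a list of rows; each row is a list of (R, G, B) tuples.
    A pixel comes out 'right' as long as more than half the frames agree
    on it, which is why transient objects mostly vanish."""
    height = len(frames[0])
    width = len(frames[0][0])
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            # median of each channel independently across all frames
            row.append(tuple(
                int(median(frame[y][x][c] for frame in frames))
                for c in range(3)
            ))
        out.append(row)
    return out
```

With 5 frames, any value that appears in at least 3 of them wins the median, so a marker present in only 1 or 2 shots should disappear.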
Inspired by this post at DataGenetics, I implemented a quick-and-dirty script in Python to test it out. The script takes an input image and iterates over it pixel by pixel, splitting it into two output images. Ideally, the pixels are randomly assigned, so it is impossible to recover the original without both outputs; combining the outputs recovers the original. Here’s an example of it in action:
Intermediate images (hopefully look like static):
Not perfect, but it is definitely recognizable. The idea can apparently be extended to 9×9 (and 16×16, and 25×25… I presume) images, for a more widely shared secret. In any case, this scheme should make it possible for any number of people to share a secret that none of them can individually recover. I uploaded the code here on github.
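One way to get the “two noise images that combine back into the original” behavior is a one-time pad over pixel values. A minimal sketch on a flat list of 8-bit grayscale values — this illustrates the idea, though my script (and the DataGenetics scheme) may split pixels differently:

```python
import random


def split_pixels(pixels, rng=None):
    """Split 8-bit grayscale values into two shares.
    Each share alone is uniform random noise; XOR-ing the shares
    pixel by pixel recovers the original."""
    rng = rng or random.Random()
    share_a = [rng.randrange(256) for _ in pixels]
    share_b = [p ^ a for p, a in zip(pixels, share_a)]
    return share_a, share_b


def combine(share_a, share_b):
    """Recover the original by XOR-ing the two shares."""
    return [a ^ b for a, b in zip(share_a, share_b)]
```

Since `share_a` is chosen uniformly at random, `share_b` is also uniformly random on its own, so neither output leaks anything about the input.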
I’ve been a little busy lately, but should be able to pick things up again with the new PC I built. Made a time-lapse of assembling it too:
I’ve just enabled SSL on the site. Technically it’s TLS, but the name SSL seems to be sticking. Now you can access it at https://www.foxrow.com (note the https). The unencrypted version should still be available at http://www.foxrow.com. I’m not sure whether I can set it up for ergs and weather, since Heroku hosts those apps; I’m still looking into that.
Apparently PIL, and therefore Pillow, does not support Nikon RAW (.NEF) files. From what I can tell, my camera shoots RAWs in a 14-bit grayscale format. With the help of nefarious, I’ve found a way to get the image data out of my RAW files. Images coming soon!
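For a sense of what “14-bit” means in practice: the samples don’t fall on byte boundaries, so reading them takes some bit-twiddling. A hypothetical illustration of unpacking big-endian 14-bit samples from raw bytes — real NEFs are TIFF containers and the sensor data may be compressed, so this only shows the uncompressed packed case, not what nefarious actually does:

```python
def unpack_14bit(data):
    """Unpack big-endian 14-bit samples from a bytes object.
    Bits are shifted into an accumulator byte by byte, and a sample
    is emitted whenever at least 14 bits are available."""
    samples = []
    acc = 0     # bit accumulator
    nbits = 0   # number of bits currently in the accumulator
    for byte in data:
        acc = (acc << 8) | byte
        nbits += 8
        while nbits >= 14:
            nbits -= 14
            samples.append((acc >> nbits) & 0x3FFF)  # top 14 bits
            acc &= (1 << nbits) - 1                  # keep the leftovers
    return samples
```

Any trailing bits that don’t fill a whole sample are simply dropped.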
As part of my ongoing effort to consolidate my various projects onto foxrow.com, I have moved the rowing weather app to weather.foxrow.com. Eventually I hope to get some interactive form of pypeline online as well, though I am not sure yet what form that might take. Because RAWs are so big, uploading enough of them to get a reasonable result may be prohibitively bandwidth-intensive. Server-side processing power may also prove too big a bottleneck, but I won’t know until I try it.