Rowing, computers, astronomy
How to use: Hit Start/Stop to begin or pause the timer. At either the catch or the finish, hit 'Stroke'. You don't need the stopwatch running to calculate rate. No downloads or app required!
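The arithmetic behind a tap-based rate calculator is simple: strokes per minute is 60 divided by the average interval between taps. Here is a hedged sketch of that calculation; the function name and the choice to average all intervals are mine, not necessarily what the app does.

```python
def stroke_rate(taps):
    """Estimate strokes per minute from a list of tap timestamps (seconds).

    Each tap marks the same point in the stroke (catch or finish), so the
    gaps between consecutive taps are full stroke cycles.
    """
    if len(taps) < 2:
        return 0.0  # need at least two taps to measure an interval
    intervals = [b - a for a, b in zip(taps, taps[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Taps every 2 seconds -> 30 strokes per minute
stroke_rate([0.0, 2.0, 4.0, 6.0])  # 30.0
```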
I’m pleased to announce the launch of a project I have been working on for a while. It’s called Virtual Cox, and it’s an online tool to help rowers and rowing coaches row faster and more efficiently. The homepage can be found at virtualcox.com.
In a sentence: Virtual Cox lets rowers and rowing coaches analyze video to go faster on the water.
Elevator pitch: Virtual Cox lets rowers and rowing coaches analyze video to go faster on the water. Video can be captured via any device, and data analysis tools enable closer critique of form, technique, and consistency. Sharing tools let coaches provide feedback to athletes for later review.
Even a short amount of footage can make rough edges glaringly obvious, including things the rowers don’t realize they’re doing themselves! Yet despite video being such a rich source of feedback for both athletes and coaches, there are often barriers to using it effectively (or at all). As in photography, the best camera is the one you have with you. My goal is to give crews the tools to take and analyze more video.
Video can be uploaded from any device, so you don’t have to bring along a video camera or mount any equipment on the boat. (Though you certainly can if you want!) A phone in the launch, a GoPro up close, or a handheld video camera can capture all the footage you need. Erging video works too, for those winter training sessions.
Once you’ve uploaded your video, tools allow you to slow down playback to closely review technique. For more advanced measurements, coaches and rowers can measure ratio, timing, and other metrics.
Coaches can save and easily share video and data with other coaches, coxswains, or athletes via a sharing link.
I always want to hear from you! Send me an email at firstname.lastname@example.org. I read all of them and try to get back to everyone. If it’s been a few days and you haven’t heard back from me, send me a quick ping; it’s more likely your email slipped off my radar than that I took a personal disliking to it. If you’re a rower or coach looking to get faster on the water, sign up at Virtual Cox and follow @virtualcox.
Some time ago, I wrote a brainfuck interpreter. It’s nice and all, but I thought, “you know what would really make this great? More inscrutable instructions!” How to extend it, though? It already has such a great feature set. You may remember it from hits such as:
Despite that saliva-inducing list, one item you can never get enough of in computing is recursion. So let’s add functions! Brainfuck already has an array of data cells; let’s expand that to an array of functions to call as well.
The official brainfunction repository can be found at https://github.com/ryanfox/brainfunction.
A brainfunction program runs on the brain, which is an extended version of the brainfuck VM. It supports the eight standard brainfuck symbols, instruction pointer, data tape, and data pointer. In addition to this, it has an array of functions. When a brainfunction program is parsed, each line of text in the source file becomes a function. These constitute the function array, starting with the first line of text in position 0. A function is exactly one line of source code. This implies one cannot break up logical units across multiple lines as is done in brainfuck.
Each function has a function pointer, initialized to 0 (the first line in the file). The function pointer is moved with the v and ^ symbols, meaning “move the function pointer down” and “move the function pointer up”, respectively.
Each function also has a local data pointer, data tape, and instruction pointer.
To call another function, the : symbol is used. When this occurs, the function pointed to by the current function pointer is called. The value in the caller’s current data cell is passed to the callee. This argument is placed in cell 0 of the callee’s data tape.
A function returns when the ; symbol is encountered or the function runs out of instructions. When a function returns, the value in the current data cell is returned. The return value is placed in the caller’s current data cell.
The brain begins execution in the zeroth function in the function array, and continues until an error is encountered or the function runs out of symbols. All symbols except the 12 specified are ignored. When a function calls another, execution blocks in the caller until the callee returns.
The reference implementation is an interpreter written in Python, available in the brainfunction repository linked above.
Example hello world:
```
v:vv:^^:vv:^:v:^:v:^:vv:
>++++++++++[>++++++++++<-]>++++.---.+++++++..+++. print hello
>++++++++++++[>++++++++++<-]>-.--------.+++.------.--------. print world
>++++[>++++++++<-]>. print space character
>++++[>++++++++<-]>+. print !
```
When executed, this will print
hello hello world world world!
Example factorial calculator:
```
+v[,------------------------------------------------:v:v:^^]
[[>+>+<<-]>>-v:<[>[>+>+<<-]>[<+>-]<<-]>>>;]+
[>>+>+<<<-]>>>[<<<+>>>-]<<+>[<->[>++++++++++<[->-[>+>>]>[+[-<+>]>+>>]<<<<<]>[-]++++++++[<++++++>-]>[<<+>>-]>[<<+>>-]<<]>]<[->>++++++++[<++++++>-]]<[.[-]<]<
>++++++++++.<
```
When executed, this will prompt the user for input (limited to a single character), calculate the factorial of that number, and print the output:
```
bf > 5
120
```
### Roadmap
Future developments may include real error messages, and perhaps a debugger to make it easy to inspect the state of the brain. Maybe even zanier symbols!
That’s it! Look on my works, ye mighty, and despair!
Lens flare is an effect created by light reflecting internally in the lens of a camera. Whether it’s good or bad depends on how it’s used. Some filmmakers are notorious for it (looking at you, J.J. Abrams).
Inspired by needsmorejpeg, I created an app to take user-uploaded photos and add lens flare to them. Wish your photo had more lens flare? No worries, get it automatically added for you. Just upload your photo and we’ll take care of the rest. The site is needsmorelensflare.com. Happy flaring!
SportVU is a video system made by STATS Inc, used by teams to track the movement of players and the ball on the court. I looked, but couldn’t find anywhere describing how to get play-by-play data, so I did some investigating myself. The great stats.nba.com lists every game. Clicking on any of the linked values in the table brings up a popup with an option ‘Movement’:
Inspecting the network request that goes out when clicking on Movement shows that it fires off an AJAX request parameterized by eventid and gameid.
We can infer that eventid=2 refers to field goal attempts, and gameid=0021500003
likely refers to the third game in 2015. Indeed, changing the gameid to 0021400003
returns data for a game dated October 28, 2014, between the Rockets and Lakers.
Once you get the eventid of the type you want, you could theoretically enumerate all the games in a season this way, or across multiple seasons. The response comes in JSON format, and Savvas Tjortjoglou nicely detailed that in this post. Happy number crunching!
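The enumeration idea above can be sketched in a few lines. The gameid format here is an educated guess from the two IDs observed, not anything documented by stats.nba.com: a "002" prefix (apparently marking regular-season games), a two-digit season start year, and a zero-padded five-digit game number.

```python
def game_id(season_start_year, game_number):
    """Guess at the gameid format: '002' prefix, two-digit season
    start year, then a zero-padded five-digit game number."""
    return f"002{season_start_year % 100:02d}{game_number:05d}"

# Matches the two IDs seen above:
game_id(2015, 3)  # "0021500003"
game_id(2014, 3)  # "0021400003"
```

Looping `game_number` from 1 up through the length of a season (and varying the year) would generate candidate IDs for every game.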