Once upon a time, there was a language called C, and this language had something called a
struct, and you could use it to make heterogeneously aggregated data structures that had members. The key thing to know about C is that when you have a struct called
currentUser, and a member like
id, and you write something like
currentUser.id = 42, the C compiler turned this into extremely fast assembly instructions. Same for
int id = currentUser.id.
Also of importance was that you could have pointers to functions in structs, so you could write things like
currentUser->setId(42) if you preferred to make setting an
id a function, and this was also translated into fast assembler.
And finally, C programming has a very strong culture of preferring “extremely fast” to just “fast,” and thus if you wanted a C programmer’s attention, you had to make sure that you never did something that was just fast when you could do something that was extremely fast. This is a generalization, of course. I’m sure that if we ask around, we’ll eventually meet both C programmers who prefer elegant abstractions to extremely fast code.
via ŷhat | Neural networks and a dive into Julia.
Julia is an ambitious language. It aims to be as fast as C, as easy to use as Python, and as statistically inclined as R (to name a few goals). Read more about the language and why the creators are working so hard on it here: Why We Created Julia.
These claims have led to a few heated discussions (and a few flame wars) around benchmarking along the lines of “Is Julia really faster than [insert your favorite language/package here]?” I don’t think I’m the person to add to that particular conversation, but what I will say is this: Julia is fun.
A few weekends ago, I made the decision to casually brush up on my neural networks. Why? Well, for starters neural networks are super interesting. Additionally, I was keen to revisit the topic given all the activity around “deep learning” in the Twittersphere.
“There has been a great deal of hype surrounding neural networks, making them seem magical and mysterious. As we make clear in this section, they are just nonlinear statistical models.”
Hastie, Tibshirani, & Friedman; Elements of Statistical Learning (2008)
Not magic, just lots of interesting (or boring depending on perspective) math.