Monthly Archives: August 2013

MATLAB in OS X Mountain Lion

I needed to write an optimization function for college and came across this problem today: if you use OS X Mountain Lion (I think the problem also occurs on Lion and older versions), MATLAB may have stopped working correctly after a Java update released in June. Well, Java.

The problem lies in a bug shipped with the Java security update released by Apple; it interferes with MATLAB’s graphical user interface, making it unusable. For some reason, the corresponding fix isn’t downloaded automatically by the App Store, so it must be installed manually.

It’s pretty easy: download the update from Apple, install it and open MATLAB.

Another “solution” is to run MATLAB without its GUI:

`$MATLAB/bin/matlab -nodesktop`

where $MATLAB is the installation directory, for example in /Applications/

Moral: don’t leave some computer-based homework for the last day when it depends on Java.


On the verge of procrastination

Procrastination is… jumping from an idea to another.
Johnny Kelly

I’m currently procrastinating. And it hurts, much more than it should.

This post is the result of my shallow research on the topic of procrastination mixed with the desire to avoid doing something else (like writing my dissertation).

Counterproductive, needless and delaying tasks

This triad of adjectives is the constant companion of college students. You try to focus, but an invincible foe keeps pushing you against a wall. Nothing works. You realize you should’ve studied more for that test — and for every other one since your freshman year.

Inside this whirlpool, you begin to wonder why you were allowed to continue. And then you get anxious because the job market is fucked up and no one is there to help you besides yourself. Competition against your peers. Mostly unfinished tasks multiply. And you miss your first deadline; then you have to talk to a bad-tempered professor without a drop of consideration.

Finally, you begin to do something entirely different. Facebook, Twitter, Tumblr — those islands of pure ego — or anything else, really, in order to avoid what you ought to do. At this stage you’re writing lists, large lists, gigantic lists, and then you realize none of them is going to get completed. Ever. And panic strikes.

You feel weak. Now you’ve graduated, and no better situation awaits. Your naïve notion of perfectionism attained nothing but frustration and sleepless nights.

What should you do?

Some researchers1 suggest that counterproductive, needless and delaying tasks are the necessary and sufficient conditions for categorizing some behavior as procrastination. I disagree. Sometimes, it’s necessary to “procrastinate” (by doing these tasks) in order to create interesting things. So, when is it bad?


A common attribute among procrastinators is perfectionism.

Generally, one is taken as a perfectionist if s/he “tries to do everything right”. A more descriptive set of variables3 includes: high standards, orderliness and the discrepancy between one’s achievements and standards. The last item is mostly responsible for the problems attributed to perfectionism.

In fact, people who rate high on our discrepancy scale also rate high on scales measuring depression, shame, anxiety, and other negative psychological states.

Robert B. Slaney3

So the troubling situation is when you want to achieve more but can’t actually do it. Then you begin to realize that if you do nothing until the last minute, you won’t be blamed for lacking skills, but for being lazy. And you begin thinking this is alright. “I feel more productive doing it the night before, overloaded with coffee”. As far as coping mechanisms go, this is bullshit.

Can someone escape from this spiral after entering it? Or is s/he condemned to a life of self-hatred, unsatisfied in every waking hour? That’s… a good question. The answer might lie in identifying what kind of “mindset” typically generates procrastination.

Losing yourself in doubts

With important and potentially negative outcomes linked to procrastination, why would a student choose to procrastinate?

Jeannetta G. Williams et al2

Most of the procrastinators I know are students, myself included, so restricting this discussion to that group isn’t so bad an assumption. (As a matter of fact, we’re pretty good at it.)

There are two opposite mindsets2, each strongly correlated (negatively or positively) with the tendency to procrastinate. The first, called mastery-oriented, is defined by a strong desire to learn for its own sake, unconcerned with grades. The second, performance-oriented, is marked by studying to “win”, as the name implies. The latter is obviously much more afraid of failing than the former.

This situation is unsustainable. Getting anxious over the fact — an unavoidable one for a student — that you won’t understand or be good at something is painful.

Another problematic factor is the “big push effect” before a deadline: if you have a semester to do it, why the heck did you wait until the last week?! Coffee, awful nights and a constant fear of not being able to finish, all this due to some afternoons and nights on the Internet, doing nothing.

On the other hand, doing things for their intrinsic value is so much better that there’s a whole area of research devoted to it — the so-called Optimal Experience, or simply Flow.


My first impulse to write this article appeared right after I finished “What are BLAS and LAPACK”. Considering that I still have to write lots of things for my senior dissertation, I was procrastinating by writing about procrastination. Wonderful.

I don’t have much to say before turning this into an autobiographical “I was a much worse procrastinator, now I’m just an average one!” or a self-help post. However, I’ve learned a good deal about the subject, and it might be useful in some parts of my work. I hope you learned something as well.

During my research, one of the best resources I found was the video below. While not scientific, its rhythm, images and words are stunning.

Procrastination from Johnny Kelly on Vimeo.

It’s unsettlingly precise.


  1. Schraw, Gregory; Wadkins, Theresa; Olafson, Lori. Doing the things we do: A grounded theory of academic procrastination. Journal of Educational Psychology, Vol. 99(1).
  2. Williams, Jeannetta G. et al. Start Today or the Very Last Day? The Relationships Among Self-Compassion, Motivation, and Procrastination. American Journal of Psychological Research, Vol. 4(1). October 20, 2008.
  3. McGarvey, Jason. The Almost Perfect Definition. Accessed 08/24/2013.


What are BLAS and LAPACK

I’ve wanted to write about this subject since I started reading about NMatrix.

At the beginning, the names BLAS, LAPACK and ATLAS confused me — imagine a young programmer, without formal training, trying to understand what’s a “de facto application programming interface standard” with lots of strangely-named functions and some references to the ancient FORTRAN language.

As of now, I think my understanding is sufficient to write about them.

Meaning, definitions and madness

BLAS (Basic Linear Algebra Subprograms) is a standard that provides 3 levels of functions for different kinds of linear algebra operations. Consider \alpha and \beta to be scalars, x and y vectors, and A, B and T (triangular) matrices. The levels are divided in the following way:

  1. Scalar and vector operations of the form y = \alpha * x + y, dot product and vector norms.
  2. Matrix-vector operations of the form y = \alpha * A * x + \beta * y and solving T * x = y.
  3. Matrix-matrix operations of the form C = \alpha * A * B + \beta * C and solving B = \alpha * T^{-1} * B. GEMM (GEneral Matrix Multiply) is contained in this level.
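To make level 1 concrete, here is a minimal pure-Ruby sketch of the operations behind axpy and the dot product. The helper names are mine, not BLAS’s, and real implementations are heavily optimized Fortran/assembly; this only spells out the math.

```ruby
# y = alpha * x + y (the level-1 "axpy" operation)
def axpy(alpha, x, y)
  x.zip(y).map { |xi, yi| alpha * xi + yi }
end

# Dot product of two vectors (the level-1 "dot" operation)
def dot(x, y)
  x.zip(y).reduce(0.0) { |sum, (xi, yi)| sum + xi * yi }
end

x = [1.0, 2.0, 3.0]
y = [4.0, 5.0, 6.0]

axpy(2.0, x, y)  # => [6.0, 9.0, 12.0]
dot(x, y)        # => 32.0
```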

There are several functions in LAPACK (Linear Algebra PACKage), from solving linear systems to eigenvalues and factorizations. It’s much better to take a look at its documentation when you’re looking for something specific.

A bit of history

BLAS was first published in 1979, as can be seen in this paper. An interesting part of it is the section named Reasons for Developing the Package:

  1. It can serve as a conceptual aid in both the design and coding stages of a programming effort to regard an operation such as the dot product as a basic building block.

  2. It improves the self-documenting quality of code to identify an operation such as the dot product by a unique mnemonic name.

  3. Since a significant amount of the execution time in complicated linear algebraic programs may be spent in a few low level operations, a reduction of the execution time spent in these operations may be reflected in cost savings in the running of programs. Assembly language coded subprograms for these operations provide such savings on some computers.

  4. The programming of some of these low level operations involves algorithmic and implementation subtleties that are likely to be ignored in the typical applications programming environment. For example, the subprograms provided for the modified Givens transformation incorporate control of the scaling terms, which otherwise can drift monotonically toward underflow.

So it seems we still use BLAS for the reasons it was created. The paper’s a pretty good read if you have the time. (And if you don’t know what a Givens transformation is, read this.)

LAPACK was first published in 1992, as can be seen in the release history. By reading the LAWNs (LAPACK Working Notes), we can get a pretty good picture of its beginning, e.g. papers that presented techniques which were later added to it and installation notes (with sayings of the sort “[…] by sending the authors a hard copy of the output files or by returning the distribution tape with the output files stored on it”).


There are various implementations of the BLAS API, e.g. by Intel, AMD, Apple and the GNU Scientific Library. The one supported by NMatrix is ATLAS (Automatically Tuned Linear Algebra Software), a very cool project that uses a lot of heuristics to determine optimal compilation parameters to maximize its BLAS & LAPACK implementations’ performance.

As for LAPACK, its original goal was “to make the widely used EISPACK and LINPACK libraries run efficiently on shared-memory vector and parallel processors” (source). Simply put, it’s a library for speeding up various matrix-related routines by taking advantage of each architecture’s memory hierarchy. The trick is that it uses block algorithms for dealing with matrices instead of an element-by-element approach. This way, less time is spent moving data around. It’s written in Fortran 90.
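To illustrate the blocking idea (this is my own toy sketch in Ruby, not LAPACK code), here is a matrix multiplication that works on b-by-b blocks: each block of A and B is reused while it’s “hot”, which is the cache-friendliness argument behind LAPACK’s block algorithms.

```ruby
# Multiply two n-by-n matrices (arrays of rows) block by block.
# The three outer loops walk over blocks; the three inner loops
# multiply one block pair and accumulate into C.
def blocked_matmul(a, b, block_size = 2)
  n = a.size
  c = Array.new(n) { Array.new(n, 0.0) }
  (0...n).step(block_size) do |ii|
    (0...n).step(block_size) do |jj|
      (0...n).step(block_size) do |kk|
        (ii...[ii + block_size, n].min).each do |i|
          (jj...[jj + block_size, n].min).each do |j|
            (kk...[kk + block_size, n].min).each do |k|
              c[i][j] += a[i][k] * b[k][j]
            end
          end
        end
      end
    end
  end
  c
end

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]
blocked_matmul(a, b)  # => [[19.0, 22.0], [43.0, 50.0]]
```

The result is the same as the naive element-by-element triple loop; only the traversal order changes, which is exactly the point.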

Another important point regarding LAPACK is that it requires a good BLAS implementation — it assumes one is already available for the system at hand — and it relies on the level 3 operations as much as possible.

Function naming conventions

One of the strangest things about BLAS and LAPACK is how their functions are named. In LAPACK, a subroutine name is of the form pmmaaa, where:

  • p is the type of the numbers used, e.g. S for single-precision floating-point and Z for double-precision complex.
  • mm is the kind of matrix used in the algorithm, e.g. GE for GEneral matrices, SY for SYmmetric and TB for Triangular Band.
  • aaa is the algorithm implemented by the subroutine, e.g. QRF for QR factorization, TRS for solving linear equations with factorization.

BLAS functions are named <character><name><mod>. Although this is similar to LAPACK’s scheme, the meaning of each part depends on the level. In level 1, <name> is the operation type, while in levels 2 and 3 it’s the matrix argument type. For each level, there are some specific values that <mod> (if present) can take, each providing additional information about the operation. <character> indicates the data type regardless of the level.
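As a toy illustration, the LAPACK scheme can be decoded mechanically. The lookup tables below only cover the example codes mentioned above, plus D (double real) and C (single complex), which complete the standard four type prefixes; real LAPACK defines many more matrix kinds and algorithms.

```ruby
# Tiny decoder for LAPACK's pmmaaa naming scheme.
TYPES      = { 'S' => 'single real',    'D' => 'double real',
               'C' => 'single complex', 'Z' => 'double complex' }
MATRICES   = { 'GE' => 'general', 'SY' => 'symmetric', 'TB' => 'triangular band' }
ALGORITHMS = { 'QRF' => 'QR factorization', 'TRS' => 'solve using factorization' }

def decode_lapack_name(name)
  p, mm, aaa = name[0], name[1..2], name[3..-1]  # split into p / mm / aaa
  "#{TYPES[p]} / #{MATRICES[mm]} / #{ALGORITHMS[aaa]}"
end

decode_lapack_name('DGEQRF')  # => "double real / general / QR factorization"
decode_lapack_name('ZTBTRS')  # => "double complex / triangular band / solve using factorization"
```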

These arcane names stem from the fact that FORTRAN identifiers were limited to 6 characters in length. Fortran 90 lifted the limit to 31 characters, but the names used in BLAS and LAPACK remain to this day.

Use in NMatrix

NMatrix has bindings to both BLAS and LAPACK. Let me show you:
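The snippet that originally accompanied this sentence was lost with the blog export. The real bindings live under modules like NMatrix::BLAS (e.g. a gemm call — check the NMatrix docs for the exact signature). As a gem-free stand-in, here is plain Ruby spelling out what a level-3 GEMM call computes, C = \alpha * A * B + \beta * C:

```ruby
# Pure-Ruby illustration of GEMM's semantics: returns alpha * A * B + beta * C.
# Matrices are arrays of rows; A is m-by-k, B is k-by-n, C is m-by-n.
def gemm(alpha, a, b, beta, c)
  m = a.size
  n = b.first.size
  k = b.size
  (0...m).map do |i|
    (0...n).map do |j|
      ab = (0...k).reduce(0.0) { |s, l| s + a[i][l] * b[l][j] }
      alpha * ab + beta * c[i][j]
    end
  end
end

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]
c = [[1.0, 1.0], [1.0, 1.0]]
gemm(1.0, a, b, 1.0, c)  # => [[20.0, 23.0], [44.0, 51.0]]
```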

If you want to take a look at the low-level bindings, grab some coffee and read the ext/nmatrix/math/ directory. Since commit 8f129f, it has been greatly simplified and can actually be understood.


Below you can find a list of the main resources used in this post.