Cruijff. RIP

It was a few years ago, perhaps around this time of year, in 2009. One evening in Amsterdam, I went to a restaurant with a group of people. Just as I entered, I immediately caught the gaze of a world-famous person, sitting and facing the door. I would have instantly recognised even a silhouette of his face, at any time of the day. I was surprised. I held his gaze, he maintained eye contact, and we both smiled. Our reserved seats were just a couple of tables away from where he sat with his friends, so I would be walking past him. I had been this close to several famous people before, but none of those encounters went like this one.

So I walked on and found myself right in front of him, both of us still maintaining eye contact. I took one more step and extended my hand to him. He took it and shook it warmly. Then I said, Hi Johan, hoe gaat het? (how are you?) And he said to me, Uitstekend! En hoe gaat het met jou? (Excellent! And how are you?) And I also said Uitstekend! He said Prima! (Great!) Then I wasn't sure what to say next, and I definitely didn't want to say anything silly. So I quickly said Geniet van je avond! (Enjoy your evening!), and he said Dank je. Jij ook! (Thank you. You too!) He kept looking as I swiftly continued on to my own table. He then resumed talking to his friends as if nothing had happened, though every now and then he'd look in my direction. My companions were smiling and a little intrigued; they said to me, Wow, we never knew you were friends with Johan. And I said: Actually, I am not friends with Johan. This is the first time I've ever shaken hands with him; we'd never even met before. I don't know why, maybe he mistook me for an acquaintance.

When I was a kid, people would simply say Cruijff! And nothing more; you'd see nods and various murmurs of appreciation, and that was typical. On this particular evening, I was gratified with a celebrity handshake out of nowhere. It wasn't much, but it meant something to me. I felt humbled that someone as illustrious as Johan could remain so warm and down to earth in a social setting, even with a complete stranger. What surprised me was that I was quite sure he had realised his mistake, yet he remained cordial when he absolutely didn't have to. That, to me, was the sign of a great man.

This was my only meeting with Johan Cruijff. It will stay with me.

RIP.

Open source is a development methodology; free software is a social movement. by Richard Stallman

I just read a nice essay by Richard Stallman with the title Why Open Source Misses the Point of Free Software – GNU Project – Free Software Foundation. A chosen quote from this essay poses the problem perfectly:

Open source is a development methodology; free software is a social movement.

Most people probably aren't even aware of this difference. I never understood why and how the term open source came to be applied to hardware, government and many other areas when, in plain English, there is no notion of "source" in such contexts at all.

The article I refer to is concerned with correct definitions; I want to look at some of the misunderstandings.

There is an angle to this discussion: a lot of people and organisations look to Open Source Software (OSS) in search of cheap (but not cheerful) opportunities to solve their problems. You can't blame them for it, but it can raise several issues. I will ignore any moral aspects for now and focus on a few practical implications.

  • Some individuals or organisations release their work as Open Source with the explicit intention of inviting others to contribute to it. This is often an acknowledgement that one's work can be bettered and perfected if others gain access and are allowed to contribute.
  • Releasing a work as open source carries no implicit or explicit guarantee of quality or freedom from defects. It just means: use it at your own risk; your contribution would be appreciated, if only in the form of reporting any defects you find or improvements you manage to add.
  • FOSS neither opposes nor condones gainful use. Statistically, however, far fewer people and organisations are able to contribute than actually use OSS. This is well understood and accepted by most. It is astonishing, though, to see some people throw a tantrum and launch into diatribes when they get frustrated by some piece of open source software. This is plain unreasonable behaviour: they not only miss the point, they display a preposterous sense of entitlement that deserves to be frowned upon.
  • Increasingly, many organisations are using OSS as a means of attracting and retaining talent. This stretches the notions of free and open in an interesting way: a subtle form of free promotion and marketing.

Article: Why Open Source Misses the Point of Free Software – GNU Project – Free Software Foundation

Marrying Technology and Liberal Arts, an interpretation

To talk about "Marrying Technology with Liberal Arts" is to suggest that they are either divorced or fundamentally at odds. Exploring the definitions of these terms, one can see that the Liberal Arts and Technology are part of the same continuum of the human condition. So what is behind such a strong motivation, drive, and potent marketing message? These are the questions I am trying to understand here.

In this discussion, I want to focus on terms and expressions, and not on the persons or organisations that might have been (or are) behind them. My purpose is to explore and make a start towards a better understanding of the subjects covered.

What are the Liberal Arts?

A Google search brings up a summary from Wikipedia as follows:

The liberal arts are those subjects or skills that in classical antiquity were considered essential for a free person to know in order to take an active part in civic life, something that included participating in public debate, defending oneself in court, serving on juries, and most importantly, military service.

http://en.wikipedia.org/wiki/Liberal_arts_education

In ancient times, not everybody was free (you could argue whether, somehow, that still isn't the case). Anyway, the liberal arts weren't concerned with making tools or the techniques involved. We could dig further into this, but let's not. Wikipedia goes a little further and defines modern takes on the expression Liberal Arts as follows:

In modern times, liberal arts education is a term that can be interpreted in different ways. It can refer to certain areas of literature, languages, art history, music history, philosophy, history, mathematics, psychology, and science.[3] It can also refer to studies on a liberal arts degree program. For example, Harvard University offers a Master of Liberal Arts degree, which covers biological and social sciences as well as the humanities.[4] For both interpretations, the term generally refers to matters not relating to the professional, vocational, or technical curricula.

http://en.wikipedia.org/wiki/Liberal_arts_education

There are certainly many other, more authoritative sources for such a definition; I leave that to the historians. The above excerpt is good enough for my purpose. Clearly the Liberal Arts cover a very large scope of human knowledge and activity.

What is Technology?

Another Google search quickly yields the following definitions:

  • the application of scientific knowledge for practical purposes, especially in industry. “advances in computer technology”
  • machinery and devices developed from scientific knowledge.
    “it will reduce the industry’s ability to spend money on new technology”
  • the branch of knowledge dealing with engineering or applied sciences.

A Wikipedia article provides an interesting statement that points to earlier uses of the term technology:

The use of the term “technology” has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and usually referred to the description or study of the useful arts.

http://en.wikipedia.org/wiki/Technology

In this definition, the term "useful arts" comes up. When we talk about use or usefulness, we are often implying tools and techniques, and technology provides us with the means for making tools. Wikipedia defines the useful arts as:

Useful art, or useful arts or technics, is concerned with the skills and methods of practical subjects such as manufacture and craftsmanship.

Here we see that technology is more at home with the useful arts than it would be with the liberal arts. Mechanisation and automation were direct evolutions of practices and techniques found in the useful arts.

If the liberal arts aren't deemed practical, then they must be closer to the decorative and the entertaining, occupying minds and souls rather than making things for people to use. The liberal arts could be seen as exploring potential uses of things derived from the useful arts, though naturally not limited to such uses. If we keep following this line of thought, then "marrying technology with liberal arts" could be seen as an aim to bring the practical and the impractical closer together. When we say practical, it is often in defence of something that may not be perceived as elegant, intuitive or beautiful; we compromise those traits for usefulness, for getting a job done.

Why talk about “marrying technology with liberal arts”?

It is always dangerous to interpret intentions; it is better to explore motivations and interests. From that point of view, one potent motivation can be found in the period when the expression first rose to prominence. There was a time, not long ago, when the people driving technological advances tended to focus on dehumanising activities in search of ever greater financial and material profits. There may even have been geopolitical factors at play, when space exploration and technical ingenuity were used in chest-beating competitions to claim superiority. In such a context, the artistic and human sides of the Liberal Arts were of less interest because they don't aim at making things. It follows that the tools and techniques produced by the useful arts would often be found inelegant, unfriendly and ugly. There is certainly plenty of evidence of this in earlier generations of information technology tools and techniques.

A clear move towards humanising what wasn't human-friendly was there for the taking, as information technology rapidly penetrated a widening range of activities in people's lives. The emphasis on beauty, elegance and simplicity indicates a desire for more artistic expression than for plain usefulness. This could have been just a marketing gimmick, but it wasn't: making friendlier, more beautiful tools and techniques actually enhances the human experience.

I have only touched upon some definitions linked with the expression Liberal Arts, in an attempt to get back to the origin of the terms and understand them a little better. Those advocating the marriage of technology and liberal arts face numerous challenges and pitfalls. What are they? How can we understand them in the context of the current dominance of information technology? These topics will be explored in the next installment of this discussion.

The very employable leave their mark well beyond their own office

Tim Bray is leaving Google. He is letting it be known with the marvellous humility and thoughtfulness of a scientist penning a dissertation. This is a good way to leave a company.

As a follow-up to my post on people who rant as they quit a job, I stumbled upon a perfect example of how to do it right: Tim Bray's post on his leaving Google.

If the name doesn't ring a bell, check out Wikipedia (note: do read Wikipedia's disclaimers; I only know Tim through his great work, which large swathes of the IT industry depend upon daily).

The unemployable typically go out with a bang

Leaving a job with a bang might be a way to let off some steam, but it might also limit a person's chances of landing the next role. Before you post a rant, sleep on it and see if you would still do it the next day.

Every now and then someone leaves a job at a well-known company and then blogs about everything they saw as evil at their now former employer. It usually triggers a flurry of commentary, which is just what happened with a former Apple employee's blog post. This is rarely a wise thing to do, but hey, "move fast and break things" doesn't mean there won't be consequences.

While such essays might amuse the gallery and gain the author some ephemeral fame, they may also affect that person's employability. For example, whenever I hired someone, I took what they said about their former employers as a template for what they would eventually say about me and my company. In most cases I wouldn't hire a person who slags off their former company; it's rarely a good sign.

People might have reasons, legitimate or not, to rant about a former employer. In many cases, it says more about the person than about the job they've just left. Obviously I don't know whether this person had any legitimate reason to rant the way he did; I am just commenting on the act, as a cautionary tale for the would-be hipsters who might be tempted to copy it at every opportunity.

After all, if and when some reprehensible activity is going on at a company, whistleblowers might help bring such misdemeanours to light, and that could ultimately benefit society. But in all likelihood, for every genuine act of whistleblowing there are probably a dozen frustrated overreactions.

I wouldn't go out with a bang if I had any hope of landing another job in the future. There are other ways. It can be much more productive, while still employed and not actually fearing dismissal, to vent any frustration internally. It's also good to check whether the reasons for your frustration are shared by many or not. If nothing helps, then leaving with the good memories is often the better attitude. After all, while in a job one presumably enjoys some of it and hopes to help build something up.

If I had a few words for up-and-coming professionals: look for reasons to celebrate, and consider any crap to be part of the unavoidable fuel for moving forward.

Dijkstra on Haskell and Java: the violin shapes the violinist

Edsger Dijkstra nailed it when he wrote this (as quoted in the blog post I link to): "it is not only the violin that shapes the violinist, we are all shaped by the tools we train ourselves to use, and in this respect programming languages have a devious influence: they shape our thinking habits."

I just came across a post that graciously publishes an essay apparently penned by Dijkstra for the members of his university's budget council. A favourite takeaway of mine is the following:

It is not only the violin that shapes the violinist, we are all shaped by the tools we train ourselves to use, and in this respect programming languages have a devious influence: they shape our thinking habits

This is a very good point that can hardly be overstated. What is more, the programming models and software applications that become dominant shape the entire industry, for better or for worse. When everybody jumps into making photo-sharing apps and websites, or very large companies fight it out for supremacy in giving people a place to share photos and thoughts, that's a waste of brainpower and resources.

This letter of Dijkstra's reminds me of something similar I blogged about a few years back, when I shared thoughts on the impact of adopting programming frameworks. Essentially I had the same concern in mind: people are shaped by the quality of their learning. Here is my blog post discussing frameworks.

Dijkstra’s letter, short and sweet.

It is time for a smart mobile device on the go, and wireless peripherals everywhere

The ultimate convenience in personal computing is to carry very little with you and have everything available at hand wherever you may be. To draw a rather tangential parallel: when you see celebrities or powerful people travelling, they never seem to be carrying anything at all, because small armies of people do that for them. In computing, that small personal army would be your smartphone or tablet, with the bulky stuff (printers, projectors, large displays) available on the premises wherever you happen to be. I imagine a (very) near future where all you need is a good smartphone on you, and dumb wireless terminals wherever you go. Perhaps not even a smartphone per se, but a smart device that holds your identity and your most personal items, so that you can experience them on any nearby peripheral you have authorised.

Here are the signs that point in this direction.

Smartphones and tablets cost more than your average PC

Don't take my word for it: look up any online or offline store for PCs, do the same for smartphones and tablets, and compare the prices; you'll see that they're very close. So if you are going to buy a PC, you can probably also afford a smart mobile device, and you are likely to choose the latter for its superior convenience and personalisation.

Smart devices have your most relevant and up to date data

Again, if you're using one you won't have any doubt about this. I craved this for years and tried every generation of products I could afford at the time. I wasn't nearly satisfied until I got my first iPhone, a 3G model, when they first came out. With either a smartphone or a tablet, you have the Internet with you: you can access your email, do your banking, and collaborate on documents.

Main PC uses: browse, store, print or share things, play.

A smartphone or tablet can do everything a PC can do, and do it well, but the converse is not true. It is more convenient to browse the Internet with a smartphone or tablet than with a PC. Beyond that, you can also store and share things without a PC; in fact, you may be better off storing things in the cloud than keeping them on a PC. Printing from smartphones and tablets was relatively elusive until printers started to evolve too, and wherever there is a printer there is usually a functioning PC nearby. So if you've already got a PC, it probably works well and you have no reason to upgrade it. Mobile gaming on smartphones and tablets is taking off seriously. There's not much left that you could be missing.

My 5-year-old laptop is still amazing

I wrote a couple of posts about my MacBook Pro on this blog. They are still relevant: it's still incredibly snappy and robust. I did recently experience a problem, which might actually be an unpublicised bug in OS X Mavericks (see my previous blog post). Since writing that post, the second disk appears to be back online as if nothing ever happened, and I haven't lost any data because I didn't rush to reformat the disk. Other than this brief issue, the laptop remains speedy and responsive. The tasks my ageing laptop doesn't cope well with are the deeply technical ones: compiling complex toolkits, running multiple operating systems simultaneously in VMs; I don't do any video processing, but I'm sure it would be slower at that than contemporary machines. When you look at these tasks, though, you see that you actually want them to run in the cloud rather than locally. Cloud hosting is no longer just for data centres; it's becoming attractive for consumers too (see Amazon's new GPU announcement).

In closing

So the ultimate convenience in personal computing remains carrying very little with you while having everything available wherever you may be. The small army that carries things around for celebrities and the powerful becomes, in computing, your smartphone or tablet, with the bulky stuff (printers, projectors, large displays) available on the premises wherever you are. A smart device affords everybody a kind of celebrity privilege, minus the publicity; I suspect a lot of fashion has been driven by the layman envying what the privileged few have got.

I dislike hype; it's almost pathological. It makes me want to come up with contrarian arguments in reaction to most articles I read these days about the computing technology market. The feeling is reinforced by the impression that many of these analysts are simply posturing, with no work whatsoever to back up their writing. You see this in many articles on the so-called post-PC era. So if they can get away with it, I feel entitled to risk some thoughts on the subject.

Good essay by Bret Victor, but he’s got one thing wrong in the opening chapter

I read Bret Victor's latest essay, and it's very good. But he starts off by refuting a quote from Alan Perlis, "To understand a program, you must become both the machine and the program", which Bret thinks is a mistaken view. I think Bret gets that argument wrong; he is misunderstanding the issue Perlis was addressing.

I read Bret Victor's latest essay, and it's very good. But he starts off by refuting one argument, and I think he's got the wrong end of it.

In the opening section, Bret quotes Alan Perlis as saying, "To understand a program, you must become both the machine and the program." Then Bret adds, in the same breath:

This view is a mistake, and it is this widespread and virulent mistake that keeps programming a difficult and obscure art. A person is not a machine, and should not be forced to think like one

Yes, it is quite obvious that people aren't machines, but that is beside the point. I think Bret gets this argument completely wrong, for a simple reason: if two people don't speak each other's language, there is no common ground on which to hold a meaningful discourse. When a person writes a program, they are effectively engaging in a future discourse with a computing environment (computers, operating programs, and everything that defines the operating context of the program being written). So if this person has no idea what their writing, the code, could mean to the computer it is destined to run on, then the outcome is uncertain and bad things are more likely to happen. What makes programming difficult to learn and to practise is that people are dealing with far more complexity than they think or assume they are.

Alan Perlis wasn't trying to be pedantic, as some people clearly are when they drone on about esoteric things to mystify others and show off their skills and expertise. That is not what Perlis was doing. He was simply saying that unless someone is truly aware of what they are getting themselves into, hence the metaphor of embodying those things, they can be almost certain that bad things will happen. And that, I think, was a sound judgement. Perhaps the prose could have been more elaborate and elegant, in the way that Bret himself writes, but that says nothing about the substance of the point.

Learning to program is difficult because the student (let's call them that) has no idea what to expect. Some programming languages become popular because they simplify and reduce the number of concepts the student needs to learn. Other programming environments, the C language for example, are complex because on virtually every line one could easily make several mistakes, and it takes a long time to learn them all and become proficient.
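Bret's broader point about people and machines aside, the per-line density of potential mistakes in C is easy to illustrate. Here is a minimal, hypothetical sketch of my own (not drawn from Bret's essay or from Perlis) in which three classic traps sit on otherwise innocent-looking lines; you only see them once you think about what the machine actually does with the code:

```c
/* A handful of classic C traps, each hiding on a single line.
 * Hypothetical illustration only. */
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *input = "a string longer than eight bytes";
    char name[8];

    /* Trap 1: strncpy() does not NUL-terminate the destination when the
     * source is longer than the given size; without the line after it,
     * printing name would read past the end of the buffer. */
    strncpy(name, input, sizeof(name) - 1);
    name[sizeof(name) - 1] = '\0';

    /* Trap 2: strlen() returns size_t, which is unsigned. Writing
     * `for (size_t i = strlen(name) - 1; i >= 0; i--)` loops forever,
     * because an unsigned value is never negative; hence the cast to int. */
    for (int i = (int)strlen(name) - 1; i >= 0; i--) {
        /* Trap 3: using %d here would print the character's numeric code
         * rather than the letter itself. */
        printf("%c", name[i]);
    }
    putchar('\n'); /* prints "nirts a": the truncated copy, reversed */
    return 0;
}
```

None of these traps is visible at the level of the program's prose-like intent; they only appear once you consider what the machine will do, which is exactly the awareness Perlis was asking for.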

So the programming student typically starts off with a bias determined by the context in which the learning occurs. If the teacher isn't good, the teaching will suffer, and eventually that student will go off and spread the misunderstanding further. The complexity is compounded by all the human factors that add up over time. This isn't anybody's fault per se. It is a wicked problem: it starts with appreciating the true mileage that lies ahead and how one's own lungs and legs will fare on the journey, to use a sporting metaphor. That, I think, is what Alan Perlis was referring to, and he was spot on. Bret is wrong in this case.

Having said that, Bret's essay is quite brilliant, a league above what we regularly get to read on the topic. Kudos for his clarity of thought and his generosity in sharing it.

If you are into this topic, you would do very well to read Bret’s essay on his blog: http://worrydream.com/LearnableProgramming/

Plagiarism: copy/paste considered harmful to the source. Learn by copying effectively.

Copying may be a natural act, inherent or induced. It starts to become reprehensible when it seems to give an unfair advantage to someone undeserving. If you are aware that you are copying someone's work and you're not giving appropriate credit, then you're doing it wrong. If you're learning from someone and using that learning to come up with something original, you are doing it right.

We learn by copying: we routinely copy nature and our fellow creatures, and brag about it with great fanfare. Education systems around the world are institutionalised ways of teaching people how to systematically copy someone or something. We buy books in order to copy what is written inside them. I could go on, but I am just trying to understand something here. We copy without even noticing it; anything that leaves a lasting impression on us might be copied by our brains without our consciously trying. If we weren't allowed to copy, learning would become practically impossible.

The act of copying becomes reprehensible when it seems to give an unfair advantage to someone undeserving. There are acceptable ways of copying others; the rest is unacceptable.

Considering the well-publicised cases out there, one wonders where society at large is going. If you sell fruit from a stall right in front of your house and people seem to be buying it, soon a neighbour will set up a stall in front of their door and put fruit up for sale. This has been going on for a very long time, and society usually accepts it within certain bounds. But we have evolved and become sophisticated at making the most of our ideas; denying that fact is myopic, if not naive.

Open source developers and people who blog about programming actively encourage copying their work; they usually ask for some small contribution in return, attribution being the minimum. I am sure there are tens of thousands of commercial works with open source material included in them; you can usually see in a software "About" window how the authors give credit where it is due. In some instances, when people copy snippets of code from blog posts or derive algorithms from techniques described on websites, the credit may not be prominently visible, but what I often see is that an acknowledgement is made available somewhere. Open source and technical blogging are among the most potent advocates of learning by copying; in fact, the participants hope for network effects to lift their work to higher levels. In this game you must copy to take part, but as ever it is really about give and take. So copying is part of the learning.

If you blog, or write your thoughts on any publicly accessible medium, there is a high chance that someone will copy you without any form of attribution, motivated by ego or by possible financial advantage. On social media, ego also plays out in a somewhat subtle way: individuals with a bit of fame of their own may not want to acknowledge people who aren't famous, possibly because they don't feel a strong kinship with them. Plagiarism perpetrated by a famous person, if caught, might be denounced with indignation by the community at large. This form of copying isn't about learning either; it may be seen as a form of immoral exploitation of the less well-off.

Copying is often a natural act, inherent to our nature or induced through Learning (with a capital "L"). Intentionally copying others, as you do when you lift material from someone else's work, leads to the sort of dispute we see happening now. A person robbing someone else's property is aware of the harm being caused to the victim; the many forms of plagiarism feel that way, and that is why plagiarism has nothing to do with learning.

I set out to write my own thoughts here, but I also read a lot of people, so I'm sure I may be copying someone's ideas in some way; I just don't intentionally copy material without attribution. It would be flat-out wrong for me to copy and paste material from someone else's medium without acknowledging the source, which is why I sometimes drop a draft if I come across prior work that resembles it too closely. I remember a case when a (not brilliant) former colleague asked for my thoughts on a subject, then copied and pasted my entire email response into their blog post without giving me any credit. The guy didn't think much of it and actually walked around bragging about "his great idea"; given how he was presenting himself, I knew he was just a poor sod to be ignored. You routinely see more prominent bloggers calling out folks who copy them without attribution; Marco Arment did this effectively on Twitter a short while back. Those are instances where copying has nothing to do with learning.

PS: This post was initially inspired by the ongoing Apple vs. Samsung legal dispute, with no intention of taking sides. But just as I was drafting it, I saw several posts by Matt Gemmell in my Twitter stream that seem to describe a related case. I would normally withhold publishing this post for fear it would be seen as link baiting, but I won't mind, because that wasn't (and isn't) my intention. I'll wait a couple of days before tweeting it, however.

I wish some thought leaders would tackle this theme; we have much to learn from discussing the issue.

All Apps are bad: ‘scarenomic’ may be just as harmful as privacy scourging

Yes, please do educate the public on the privacy issues that current social media services raise. But do so in a measured way; don't squarely blame every app for trying to steal user information. That is simply not the case.

Can the media tackle any important issue without resorting to hyperbole?

The WSJ piece "Selling You on Facebook" makes an interesting read on privacy issues for the uninitiated, clearly its main target audience. I was going to agree with it wholesale until I realised that the article sweeps too broadly and makes every app look bad; I mean mobile apps, not Facebook apps, which in my opinion are clearly something else.

Yes, it's true that the general public doesn't realise the privacy implications of social media. Yes, it's true that some apps and some companies abuse the trust implicitly placed in them and take more than they should. But I disagree with the way the WSJ article seems to point at every app out there, at the very notion of an app. That's not a realistic way of painting the true picture of what is going on. If that were allowed, you could say the same of every human creation that might possibly be put to bad use. The list would be long, and folks wouldn't feel safe anywhere, at any moment.

I agree that people need to be educated about the privacy issues surrounding social media in general. I disagree with trying to scare people into, perhaps, reading your article. If you try to scare people about every possible thing that could go wrong, you blur your message and may defeat its purpose. What really helps is giving people self-help clues about what may be happening, and about the implications of the specific actions they take online. This should be measured, paced, and kept up to date; not a broad sweep, because then people are no better off than if they had been told nothing at all.