Any links shared via Prismatic’s URL shortener will be broken after 20 December: Prismatic is shutting down

I saw this article over the weekend. I’ve used the service a lot without really noticing it. In the linked post, they announce that the site is to close down, taking the mobile apps with it. It’s sad to see them go. The reasons given are familiar enough; that’s how it goes. I was never clear what their business model was. I wonder if Flipboard might be next. We’ll see.

One small detail I noticed, which will likely bite many people: when Prismatic goes down, articles shared via links generated by the Prismatic apps and web site will likely break too. The culprit is the URL shortener. I’ve quickly converted the ones I’d saved in my notebook, but I’m sure I’ve missed some; it’s not easy to find them all. This is one of the problems with URL shorteners: they are not proper permalinks.
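
Since the short links are plain HTTP redirects to the original articles, the destinations can still be recovered while the service is up. Here is a minimal sketch of the idea in Python; the short_links.txt file is a hypothetical list with one shortened URL per line, and the script simply follows the redirects and prints the final address.

```python
# Minimal sketch: recover the final destination of a shortened URL by
# following its HTTP redirects, so the long-form link can be saved instead.
import urllib.request

def expand(short_url, timeout=10):
    # urlopen follows redirects by default; geturl() returns the final URL.
    with urllib.request.urlopen(short_url, timeout=timeout) as resp:
        return resp.geturl()

if __name__ == "__main__":
    # "short_links.txt" is a hypothetical file, one shortened URL per line.
    with open("short_links.txt") as links:
        for line in links:
            link = line.strip()
            if not link:
                continue
            try:
                print(link, "->", expand(link))
            except Exception as exc:
                print(link, "-> failed:", exc)
```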

Source: Writing the next chapter for Prismatic.

An open source port scanner that is helping some bad guys

An open source port scanning tool, masscan, is being used to repeatedly attack my web sites. There are plenty of such tools widely available, which even criminals with no skills can get their hands on to randomly try to break sites. IT security is a never-ending quest that is best left to dedicated professionals.

Port scanning is one of those annoying activities the bad guys use when trying to find back doors into systems. The principle is simple: find out which ports a system has left open and, if you recognise any, try a dictionary-style attack against them. All it takes is a simple bot.
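
To make the principle concrete, here is a rough Python sketch framed defensively: it probes a handful of common ports on a host you own to see which ones answer (the host name below is a placeholder). Tools like masscan do the same thing orders of magnitude faster and across whole address ranges; only ever run this against systems you are authorised to test.

```python
# Defensive sketch: check which of a few common ports answer on a host you own.
import socket

COMMON_PORTS = [21, 22, 25, 80, 443, 3306, 5432, 8080]

def open_ports(host, ports, timeout=1.0):
    found = []
    for port in ports:
        try:
            # create_connection succeeds only if the port accepts TCP connections.
            with socket.create_connection((host, port), timeout=timeout):
                found.append(port)
        except OSError:
            pass  # closed, filtered, or timed out
    return found

print(open_ports("example.com", COMMON_PORTS))  # placeholder host
```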

Over the last few months, I have noticed multiple port scan attacks against my web sites from a user agent “masscan/1.0”. I dug a little and found it to be coming from an open source tool, hosted on GitHub:

robertdavidgraham/masscan

So, it seems that some people have found this tool and are now randomly targeting web sites with it. To what end, I can’t tell for sure. It is certainly reprehensible to poke at someone’s doors without their consent; everybody knows this.

I’ve also noticed lots of attempts to run PHP scripts; they seem to be looking for phpMyAdmin. Fortunately I don’t run anything with PHP. If I did, I would harden it significantly and have it permanently monitored for possible attacks.

Most of the attacks on my web sites originate from, in this order: China, Ukraine, Russia, Poland, Romania, and occasionally, the US.

You don’t need anything sophisticated to detect this kind of attack; your web server log is an obvious place to look. Putting a firewall in place is a no-brainer: just block everything except normal HTTP and HTTPS web traffic. You can also invest more in tools, but then the question is whether you’re not better off just hosting at a well-known provider.
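
As an illustration of how little is needed, here is a rough Python sketch that pulls the “masscan/1.0” user agent and the phpMyAdmin probes mentioned earlier out of a standard combined-format access log and counts the offending IP addresses. The log path is an assumption; adjust it for your server.

```python
# Rough sketch: count suspicious requests per client IP in an access log.
import collections
import re

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path, adjust as needed
suspicious = collections.Counter()

with open(LOG_PATH, errors="replace") as log:
    for line in log:
        if "masscan" in line.lower() or re.search(r"phpmyadmin", line, re.I):
            # In the combined log format the client IP is the first field.
            ip = line.split(" ", 1)[0]
            suspicious[ip] += 1

for ip, hits in suspicious.most_common(20):
    print(f"{ip:15}  {hits} suspicious requests")
```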

This is just one instance among countless others where even the dumbest criminals can get their hands on tools to try and break into systems. Cloud hosting is getting cheaper all the time; soon it will cost next to nothing to host a program that wanders about the Internet unfettered. In short, it keeps getting easier to attack web sites and correspondingly harder to keep them secure.

I do see a glimmer of hope: container technologies provide a perfect throwaway computing experience. Start a container, keep it stateless, carry out some tasks and, when done, throw it away. Just like that. This may reduce the exposure in some cases, but it won’t do for an ongoing, long-running service.

IT security is a never-ending quest that is best left to dedicated professionals. I only casually check the web sites that I run, and at the moment I haven’t deployed any sensitive data on them. When I do, I will make sure they are properly hardened and monitored, most likely by going to a SaaS provider rather than spending my own time dealing with this.

DIY: Improve your home Wi-Fi performance and reliability.

If, like me, you eventually got annoyed by how choppy and frankly slow your home Wi-Fi can become, then this post may be for you.

Does your Wi-Fi feel much slower than you expect it to be? Is it typically slow at the times when your neighbours are also at home, just after work for example, or at weekends? Does it perform better when most of your neighbours are asleep? If you answered yes to some of these questions, you may be experiencing Wi-Fi channel overcrowding.

If channel overcrowding is the issue, here is a simple way to solve it:

  1. Identify a channel that is less busy, or not busy at all
  2. Configure your router to use this free channel
  3. Restart it. Et voilà.

Here is what a crowded channel looks like:

[Screenshot: Wi-Fi network channels]

In the picture, you see a lot of coloured areas between channels 1 and 8 (the numbers at the bottom of the picture are Wi-Fi channel numbers). If your router’s name shows up in this area, you are likely to have poor reception. The solution is to move to a quieter channel: in the same picture, you can see that channels above 108 are totally free. Look in your Wi-Fi router documentation for the word “channel” and find out how to change it. If your router is not able to use the desired channel, it may simply be too old. And if all the channels in your area are too busy, then you are out of luck as far as this tip is concerned.

Tools

In my example, I am using a Mac, and I found a program in the Mac App Store called WiFi Explorer. The picture in this post is a screenshot from this app. If you are using a PC, I am sure an equivalent is available somewhere – be careful though, free software from an obscure or unchecked source may be malware. If this kind of tool isn’t for you, then good old trial and error might do it: set your Wi-Fi router to another channel, restart it and check how it performs, then repeat the process until you stumble on a channel that works for you.
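
If you prefer a script to an app, here is a rough helper for that trial-and-error hunt. It assumes a Linux machine with NetworkManager’s nmcli command available (on a Mac, an app like WiFi Explorer shows the same information graphically) and simply counts how many nearby networks sit on each channel.

```python
# Rough sketch: count visible Wi-Fi networks per channel using nmcli (Linux).
import collections
import subprocess

# -t gives terse colon-separated output, -f selects the fields we need.
scan = subprocess.run(
    ["nmcli", "-t", "-f", "CHAN,SSID", "dev", "wifi", "list"],
    capture_output=True, text=True, check=True,
).stdout

per_channel = collections.Counter()
for line in scan.splitlines():
    chan, _, _ssid = line.partition(":")
    if chan.strip().isdigit():
        per_channel[int(chan)] += 1

for chan in sorted(per_channel):
    print(f"channel {chan:3}: {'#' * per_channel[chan]}")
```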

Background

In residential areas, most of your neighbours also have Wi-Fi at home. More often than not, you may just be drowning out each other’s Wi-Fi networks with unwanted noise. Think of it this way: try to have a cosy conversation with someone in the middle of a busy shopping street while the two of you stand 15 metres apart. At least people have distinct voices that can be told apart; Wi-Fi routers sharing a channel don’t have that advantage.

People get Wi-Fi at home either directly from their cable Internet provider, or by buying a Wi-Fi router and installing it themselves on top of their wired Internet connection. It’s easy to do. The hope is that it’s going to be super fast, as your provider has relentlessly been touting. At some point reality slowly starts to sink in: your Internet connection is nowhere near as fast as you were told. Then follow annoyance, frustration, a little bit of guilty denial and ultimately resignation to accept the status quo – a sort of Stockholm syndrome.

A common cause of sluggish and fluctuating Wi-Fi performance is overcrowding of the allotted channels. Since most neighbours are likely to get Wi-Fi from the same providers, they all get the same routers with the exact same default settings. This means the routers constantly interfere with one another, causing noise that slows down and sometimes interrupts your network.

Aside

Although I am talking about possible Wi-Fi interference here, it is by far not the only reason behind a sluggish or unreliable home network. Another typical culprit is deception from cable providers: it is well documented that residential Internet providers frequently throttle their users’ bandwidth, and it’s easy to find out. And naturally, when lots of people are on the Internet at once, for example following an event, networks can become slower for everybody for a while. It’s all more complicated than we often make of it; this is just a tip that may help if you happen to be in the situation I described.
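
By “easy to find out”, I mean something as crude as timing a large download, as in the Python sketch below. The test URL is a placeholder; point it at a large file hosted somewhere fast that you trust, and compare the result at different times of day with the speed you are paying for.

```python
# Crude throughput check: time the download of a large file.
import time
import urllib.request

TEST_URL = "https://example.com/100MB.bin"  # placeholder: any large file on a fast, trusted host

start = time.monotonic()
with urllib.request.urlopen(TEST_URL) as resp:
    size = len(resp.read())
elapsed = time.monotonic() - start

mbps = (size * 8) / (elapsed * 1_000_000)
print(f"downloaded {size / 1e6:.1f} MB in {elapsed:.1f} s, roughly {mbps:.1f} Mbit/s")
```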

Cyber security white elephant. You are doing it wrong.

The top cause of information security weakness is terrible user experience design. The second cause of information security vulnerability is the fallacy of information security. Everything else flows from there.

It happened again: a high-profile IT system was hacked, and the wrong information was made available to the wrong people in the wrong places at the wrong time. This is the exact opposite of what information security infrastructure is meant to provide. Just search the Internet for “Sony hack”; one of the top hits is an article from 2011 titled Sony Hacked Again; 25 Million Entertainment Users’ Info at Risk.

This type of headline always begs some questions:

Don’t we know that any information system can eventually be breached given sufficient resources (resources meaning any combination of motivation, hardware, people, skills, money, time, and so on)?

Is it really the case that deep-pocketed companies cannot afford to protect their valuable assets a little better than they appear to be doing?

Do we really believe that hackers wait to be told about potential vulnerabilities before they attempt to break in? Can we possibly be that naive?

This has happened many times before to many large companies. Realistically, it will continue to happen. A system designed by people will eventually be defeated by another system designed by people. Whether the same would hold true for machine-designed systems versus people-designed systems is an open question. Once machines can design themselves totally independently of any human action, perhaps then information security will take on a new shape and life. But that might just be what Professor Stephen Hawking was on about recently?

I suggest we stop pretending that we can make totally secure systems. We have proven that we can’t. If, however, we accept the fallibility of our designs and proactively prepare, practise drills and fine-tune failure remediation strategies, then we can start getting somewhere. This is not news either; people have been doing it for millennia in various capacities and organisations, war games and military drills being obvious examples. We couldn’t be thinking that IT is exempt from such proven practices, could we?

We may too often be busy keeping up appearances rather than doing truly useful things. There is way too much distraction around: too many buzzwords and trends to catch up with, too much money thrown at security gear and too little time spent making that gear actually fit the people and contexts it is supposed to protect. Every freshman (or freshwoman, if that term exists) knows about the weakest link in information security. If we know it, and we’ve known it for long enough, then by design it should no longer be an issue.

It’s very easy to sit in a comfortable chair and criticise one’s peers or colleagues. That’s not my aim here; I have profound respect for anyone who’s actually doing something. It’s equally vain to just shrug it off and pretend that it can only happen to others. One day, you will eventually be the “other”. You feel for all those impacted by such a breach, even though there are evidently far more important and painful issues in the world at the moment than this particular breach of confidentiality.

This short essay is just a reminder, perhaps to myself as well, that if certain undesired technology issues don’t appear to be going away, we may not be approaching them the right way. Granted, the news reporting isn’t always up to scratch, but we do regularly learn that some very simple practices could have prevented the issues that get reported.

Great New Yorker article ‘The Group That Rules the Web’

The New Yorker article The Group That Rules the Web is a comprehensive overview of the history, organisation and processes around web technology standardisation. By some coincidence, in my last blog post I mentioned web technology without going into any detail. This article is a proper piece of journalism on web technology, and a good read for anyone interested in the subject.

New Yorker article: The Group That Rules the Web.

Trying to oppose Open Web to Native App Technologies is so misguided that it beggars belief

The open web is not in opposition to native apps. There is just one single entity called the Web, and it is open by definition. There are numerous forms of applications that make use of Internet technologies; a subset of those Internet-based technologies serves the web. You get nowhere trying to pit the two against each other.

People who ought to know better spend valuable energy dissing native app technologies. I’ve ignored the fracas for a long time, but finally I thought I’d pen a couple of simple thoughts. I was prompted to do so upon seeing Roy Fielding’s tweets. I also read @daringfireball’s recent blog post explaining, in his usual articulate way, how he sees Apple in the whole picture. Here, I just want to look at a couple of definitions, pure and simple.

To claim that open web technologies would be the only safe bet is akin to saying that all you would ever need is a hammer. Good luck with that if you never come across any nails.

I think both are very good and very useful in their own right; neither is going to push the other into irrelevance anytime soon because they serve different use cases. The signs actually are that the web’s addressing model, by way of URIs, might be challenged soon once Internet-connected devices start to proliferate (as they’ve been hyped to do for some time now). So I actually think the safer bet would be Internet technologies, with the Web increasingly pushed into a niche. But OK, I didn’t intend to get into the fray.

Incidentally, I keep seeing claims that native platform apps are some kind of conspiracy to lock users in, while the open web would apparently be the benevolent dictator’s choice. I am sceptical of such claims; Mother Teresa wasn’t a web advocate, for example. I remember that Apple, who triggered you-know-what, initially only wanted people to write HTML apps for the iPhone when it launched. It was only under pressure from developers that they finally released a native SDK. That’s a rather poor showing for a supposedly ill-intentioned user-locker.

There is no such thing as an open web versus anything else. There is just the web, and then there is an ever-growing generation of native platform apps that also take advantage of Internet technologies. That’s it. Trying to oppose those two things is, plainly put, rubbish. If something is a web technology and actually adheres to the definition of the web, it can only be open. A closed web technology would be an oxymoron; the definition doesn’t hold. If something is built using Internet technology, it may borrow web technology for some purposes, but it is not part of the web per se.

Where web technology is the best fit, it will continue to be so and will keep getting better. Conversely, where native platform apps work best, those will continue to get better and may use the Internet. There is a finite number of web technologies, bound by the standards and specifications implemented. But there is an unbounded number of native app technologies, since implementers can write anything they like and have it translated to machine code following any protocol they may devise.

The Web is not the same thing as the Internet. There are open and closed Internet technologies, but there is no such distinction for the Web. The Internet contains the Web, not the other way around.

Underneath it all, there is a platform battle going on in which parties are vying for users’ depleted attention. In such battles, every party attracting people to its platform is self-serving and can make as much of a moral (or whatever you call it) claim as any other. The only exception would be those set to gain nothing from getting their platform adopted, and there aren’t any of those around.

My reading of the ongoing discussions is simple. Some individuals grew increasingly insecure about having missed the boat on native platform apps, whatever the reason, including their own choices. Another group burned their fingers chasing various hypes, learned from it and turned to native platform apps. The two groups started having a go at each other, tempers flared, everyone lost their cool and indulged in mud-slinging. But there is no point to any of it.

If you are building software intended to solve a category of problems, then the most important technology selection criterion is fitness for purpose. Next to that, if the problem is user-facing, you want to find the most user-friendly way of achieving your aim. If you do this correctly, you will invariably pick whatever best serves the most valuable use cases of your intended audience. Sometimes that will be web technologies, sometimes it won’t, and other times you will mix and match as appropriate.

I don’t see web technologies as being opposed to native app platforms at all. Whatever developers find more attractive and profitable will eventually prevail, and the use case is the only metric that truly matters. It is understandable that vendors vie for relevance; that’s what is at stake, and it matters to everyone. It’s only once people and organisations face up to this cold fact and start talking honestly that they will begin to make progress.

Web is 25: word processors missed their chance at template-based static web content authoring

The web has just celebrated its 25th anniversary. With that, you’d think we’d be far along with the tools and techniques for authoring web content. You’d be right and wrong at the same time. One area that lags behind is the ability to author web content based on templates.

The issue

By static web content authoring with templates, I mean the ability to define content elements and their styling with only HTML, CSS and perhaps JavaScript, then author those content elements and expect the tool to manage everything as a unit. This is what you can achieve using web content management (WCM) products. But WCM products were born just for the web; they might sport rudimentary word processing functionality, but that’s not their real purpose.
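
To make the idea concrete, here is a toy sketch of that kind of template: the layout and styling live entirely in plain HTML and CSS, the author supplies only the content elements, and a few lines of Python assemble the page as a unit. The file names are hypothetical.

```python
# Toy illustration: a page template defined purely in HTML/CSS, filled in with
# authored content elements and written out as a unit.
from string import Template

PAGE_TEMPLATE = Template("""\
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>$title</title>
  <link rel="stylesheet" href="site.css">
</head>
<body>
  <article>
    <h1>$title</h1>
    $body
  </article>
</body>
</html>
""")

page = PAGE_TEMPLATE.substitute(
    title="Web is 25",
    body="<p>Authored content goes here, managed together with its template.</p>",
)

with open("post.html", "w", encoding="utf-8") as out:  # hypothetical output file
    out.write(page)
```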

Yes, web content authoring has been possible for a long time via various means, in varying degrees of coherence and complexity. However, the leading word processing software packages just never managed to provide a good way to author web content. What is the state of affairs there?

Vendor Attempts

There were attempts to do this in Microsoft Office, which could produce HTML-formatted content for direct publishing on the web. Did you ever look at the source code it generated? Possibly some of the most complex and opaque markup there ever was. It had so many dependencies and proprietary tidbits that you couldn’t consume it properly on non-Windows machines.

Another approach I saw was in products like EMC Documentum, which would embed Microsoft VBA code in Office document templates and use ActiveX controls to communicate with the back-end system. This looked clever on the surface, but it had numerous problems. First, it only worked on Windows. Second, the local version of a document would quickly and frequently go out of sync with the server version, and resolving that was tedious. The last but biggest drawback was that it treated Office documents as opaque files with some metadata, so the web templates and the Office templates had nothing to do with one another. It wasn’t an optimal solution; it just allowed people to somehow share and version Office documents.

Microsoft eventually came up with SharePoint, which is such a different beast that it is a category of its own. I have also experienced situations where SharePoint’s local view of a document would go out of sync with the server version, a nightmare to fix – though I must add that I stopped at SharePoint 2010 and haven’t used the recent versions, which may have solved such issues. Again, my point here is that SharePoint isn’t a word processing tool; it’s something else altogether.

Let’s consider approaches powered by LibreOffice and OpenOffice. These sibling products provide a way of programmatically manipulating documents without requiring visible user interface (UI) elements, which makes them great for mail merging and versatile document output combined with web technologies. Even so, they don’t help much with decent template-based web authoring.
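
As a sketch of that headless, UI-less use, the following Python snippet shells out to LibreOffice to convert a word-processing document to HTML. It assumes the soffice binary is on the PATH; the input file name is a placeholder.

```python
# Sketch: headless document conversion with LibreOffice from a script.
import subprocess

subprocess.run(
    ["soffice", "--headless", "--convert-to", "html", "--outdir", "out", "report.odt"],
    check=True,  # raise if the conversion fails
)
print("wrote out/report.html")
```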

Then there’s Apple’s Pages. It’s hopeless at web authoring; it doesn’t support it as far as I know. However, Apple has recently come up with iCloud, offering its office productivity functionality as a purely browser-based solution. It looks promising and may make Apple’s products more pervasive, but it still falls short of what I am looking for, what I thought would be a genuine user need: the ability to author web content based on a template built with HTML, CSS and JavaScript standards only.

Market dynamics

There came a point where the market split: on one side you had the traditional office document management system (DMS) packages, on the other the pure web content management (WCM) packages. Each side of this rift was very good at what it was built for and did poorly at the other part of the equation. So, as is often the case, vendors started expanding their products: DMS products added more and more web authoring capabilities, and WCM products started to provide DMS functionality too. The result is a mixed blessing in which not many have done as well as they could have. Eventually social media entered the scene and everybody scrambled to redraw their blueprints and product marketing.

Status Quo

I am tempted to think the word processing vendors were caught napping; they never truly understood how to serve the needs of the web. It’s as though word processing product managers never spent a single day contemplating web authoring. You’d have thought the web was the most important market phenomenon in the IT industry, wouldn’t you? With millions of people using word processors daily, a large number of whom need to publish content on the web, making their lives easier could surely have added value. Nope, the vendors missed it. It might have been a case of an old pony failing to learn new tricks.

Summary

I didn’t intend to talk about content management, web content management or web content distribution as the main subject; that would take a long blog post of its own. I just wanted to look at how the word processing vendors fared as the world wide web dawned upon us. What I have seen and experienced tells me that they missed the boat. And if we consider who the dominant players in the word processing market are, it becomes glaring that Microsoft may have let this opportunity slip. I’ve not seen any analyst mention this so far; they all seem focused on mobile and tablets as the only significant disruption.

OpenID Connect and browser redirect, the web should get over its HTML hangover

OpenID Connect would have been superb without the annoying notion of the HTML redirect. I despised OAuth2 for that; I jumped on the OpenID Connect specifications thinking they would have a better answer, and I wasn’t pleased to see that it’s still there. So either vendors have an interest in keeping things that way, or it was an oversight. The latter is implausible, so it’s got to be the former.

When I saw the OpenID Connect announcement, my hopes went up that OAuth2 would finally be getting a decent replacement and its annoying web browser logic would go away. Nope, it seems that’s not the case at all, so I came quickly back down to earth and needed to get this out of my system.

I started reading the OpenID Connect specifications. They looked promising until I got to the point where they mention the redirect URI (section “3.1.2.1. Authentication Request”); there I froze, shock and horror! I don’t get it: why would a 2014 web single sign-on standard have such a narrow focus?
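
For context, this is roughly what that authentication request boils down to: the client sends the user’s browser off to the provider’s authorisation endpoint, carrying a redirect_uri that the browser must later be bounced back to. A minimal sketch, with made-up endpoint, client and URIs:

```python
# Minimal sketch of an OpenID Connect authentication request URL (authorization code flow).
import secrets
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://provider.example/authorize"  # placeholder provider endpoint

params = {
    "response_type": "code",
    "client_id": "my-client-id",                     # placeholder client
    "redirect_uri": "https://app.example/callback",  # the browser redirect in question
    "scope": "openid profile",
    "state": secrets.token_urlsafe(16),              # CSRF protection
    "nonce": secrets.token_urlsafe(16),              # ties the ID token to this request
}

print(AUTH_ENDPOINT + "?" + urlencode(params))
```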

A modern web application, if architected properly, certainly keeps completely separate notions of visual and non-visual elements. A web solution that isn’t composable isn’t future-proof and is doomed to quick obsolescence. Yes, sure, the web took off thanks to HTML and HTTP. But the Internet was there long before all that, and we’re surely heading towards an Internet where a lot of the chatty stuff doesn’t surface to a user until the very last moment, at the final consumption point. Issues such as identification and data access should be resolved well before anything is ready to be made visual. Authentication and authorisation are not visualisation problems, they are data access concerns. Data can and should be manipulated in a composable manner until it is finally rendered. There should never be any assumption about visualisation in the guts of non-visual elements. Visual elements should call on the non-visual elements, not the other way around. That is, most probably, how the Internet of Things and any great stuff looming on the horizon will be architected. In this context, I don’t get the reasoning behind tying OpenID Connect to things like browser redirects.

So, OpenID Connect strikes quite a few good notes, but it doesn’t seem ambitious enough to fully empower the next generation of Internet solutions. In many ways, it looks like something that will be great for tool vendors, while developers will need to figure out ways to make it more usable. That’s a shame, a missed opportunity.

Windows’ strength was the distribution model; people never loved it per se

Microsoft’s early masterstroke was to lock down distribution. Once they succeeded in that, it was easier for them to push their products to users. As the distribution model changed, Windows’ strongest advantage started to wither, resulting in the current (identity?) crisis it faces. People got Windows while shopping for PCs; they didn’t otherwise care about Windows. OEMs cared about Windows, mostly because without it they would not have been successful in the market. There was a time when relevance was driven by the distribution model, mostly powered by the OEMs. The Internet dealt the first major blow to that model. The rise of mobile computing on smart devices dealt the second.

People wanted personal computers, and those almost universally shipped with Windows, so people got Windows by default. Microsoft’s deals with IBM and the OEMs were the reason for this. Since everyone was getting Windows, developers had to target it. In the early days, developing for MS-DOS and Windows was much easier and more affordable than for other platforms, which resulted in popular software being available only on MS-DOS and Windows. People created on Windows, so users needed PCs running Windows in order to consume those creations. There was no particular love for Windows pushing buyers towards it.

The Internet popularised new ways of using computers that Microsoft didn’t control. Apple helped to popularise new computing experiences and devices that Microsoft didn’t control. The combined force of these major shifts resulted in the emergence of new powerful distribution models, challenging the established OEM model.

Apple created an enviable model that allows them to ship desirable products successfully and repeatedly. Microsoft must have decided to follow a similar path, which could explain why they didn’t hesitate to upset their OEMs, their single biggest force in the marketplace.

Google saw a chance to become the world’s Internet proxy; Microsoft woke up to that much later. It would have been a hard sell anyway to position Windows there. Even in the data center, Windows has a chance to be present but no chance of becoming the dominant force.

Windows’ relevance challenge is now mostly a problem for Microsoft, and to some extent for the millions of people whose skills and experience would be devalued should Windows falter considerably. Ironically, Microsoft alienated the OEMs, formerly its virtual chief of growth. Microsoft also seems well on the way to alienating users – force-feeding the Metro (Modern) UI to legions of users is a sure way of inviting them to try something else.

Microsoft all but dropping their OEMs is a risky bid; the OEMs are what made them in the first place and carried them for decades. In a way, Apple dropping skeuomorphism is a similarly dangerous move: if that sort of emotion and empathy disappears from the Apple experience, users will see less and less differentiation in it. Talking about ‘the platform you love’ would increasingly sound delusional rather than an actual reflection of market reality.