OS X Yosemite adaptive networking, a blessing that’s been a curse for my MacBook lately

I experienced quite some frustration when my computer's networking became unstable without ever notifying me of any problems. After some trial and error, I found out that the problem occurs whenever I have more than one eligible network within reach. In the end, I had to manually enforce some fixed connections to regain decent stability.

Apple introduced a neat feature in OS X Yosemite: the computer can automatically switch to the best network it can find without interrupting your programs. They also introduced another feature whereby OS X randomises the MAC address (the hardware address of the physical network adapter), which should make the computer a little more secure. Both have been hailed as quite useful updates. The first comes in handy if a network connection suddenly drops but the computer is able to reestablish it or hop onto another network; while downloading a large file, for example, you'd appreciate that the download just progresses to the end rather than restarting from scratch. That's a nice time saver. The second feature, the MAC address randomiser, helps prevent coffeeshop Wi-Fi routers (or malware) from identifying someone's machine. Lately, however, I believe I've been on the receiving end of these features. After several updates and trying various things, I've come to the conclusion that they are working against me.

I have been using only Wi-Fi on my laptops for years. Over the past weeks, I've had a torrid networking experience whereby my computer would intermittently lose its network connection without notifying me of any problems. I'd check and see a full-strength Wi-Fi signal and that I was still connected to the network, yet none of the applications I was running could reach the outside world on the Internet. Without me doing anything, the Internet connection would come alive again and I could do a few things, then it would drop once more. The pattern repeated many times. At first I didn't think much of it, but I quickly grew annoyed and set out to resolve the issue. After scouring forums, Stack Overflow and other random sources in vain, I was on my own. What eventually brought stability back to my networking was this:

  • Let the computer look for Wi-Fi networks.
  • Once it establishes a connection, turn off the automatic network detection.
  • Delete any other known networks in my vicinity that were already registered by my computer (a command-line sketch follows below).
MBP OS X Yosemite: don’t ask for networks
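The last two steps can also be done from the command line with the built-in networksetup tool. Here is a minimal sketch, assuming the Wi-Fi interface is en0 (yours may differ) and “CoffeeshopWiFi” stands in for a network to forget:

# List the Wi-Fi networks the computer has already registered.
$ networksetup -listpreferredwirelessnetworks en0

# Forget a specific network so the computer stops hopping onto it.
$ networksetup -removepreferredwirelessnetwork en0 "CoffeeshopWiFi"

The automatic-join setting itself lives in the Wi-Fi section of System Preferences > Network, which is what the screenshot above shows.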

After I'd done this, I got a stable connection. But at home, it turned out I had one more complication. My UPC (Ziggo) subscription includes a Wi-Fi router, but I also have an Apple AirPort Express. When both are active, my computer detects two known Wi-Fi networks, so it starts hopping back and forth between the two without telling me. I would end up in the same situation: unable to get on the Internet, for no apparent reason. To resolve this, I turned off the Ziggo Wi-Fi router.

What I think may be happening is the following. The computer detects a Wi-Fi network, requests and obtains an address, and thus can get on the network. But shortly afterwards it detects another network with a slightly (if intermittently) better signal strength, so it hops onto that new one. Wi-Fi being a radio signal, the vacillations of the two (or more) signals cause the computer to keep jumping around. When this is combined with the randomised MAC (physical) address allocation, the Wi-Fi routers may temporarily quarantine the computer before allowing it back in. The user (me!) then experiences the computer simply losing all network connectivity for reasonably long spells, 3 to 5 minutes, then inexplicably regaining it, before looping back into the same game. On and on. This is what I think has been going on with my MBP, and that's why I decided to try forcing a semi-manual network setup, essentially to stop it being too clever.
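If you want to watch this vacillation happening, OS X ships a hidden airport utility that reports the current connection and signal strength, and ifconfig shows the MAC address in use. A quick diagnostic sketch, again assuming en0 is the Wi-Fi interface:

# Report current Wi-Fi details, including signal strength (agrCtlRSSI).
$ /System/Library/PrivateFrameworks/Apple80211.framework/Versions/Current/Resources/airport -I

# Show the MAC address the adapter is currently presenting.
$ ifconfig en0 | grep ether

Running the first command every few seconds while the problem occurs should show the SSID and signal readings flipping around.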

Adobe Slate, an attractive tool for publishing nicely laid out content

It is time to revisit a past blog post where I talked about a missed opportunity for word processors. With Adobe Slate, it is fair to say that iPad users can now easily publish nicely laid out content. I tried it briefly; it's effortless to start with, and all you need is some content. I think this should have always been possible with the word processors that have been on the market all this time.

There is one drawback to Adobe Slate though: it requires an account with Adobe Creative Cloud. I understand the rationale, but this, to me, is an unnecessary barrier to adoption.

Adobe Slate web site.


An open source port scanner that is helping some bad guys

Port scanning is one of those annoying activities the bad guys may use when trying to find back doors into systems. The principle is simple: find out what ports a system has left open and, if you recognise any, try a dictionary-style attack on them. All it takes is a simple bot.
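To see what such a scan looks like from the defender's side, you can point a scanner at a machine you own. A minimal sketch with nmap, where your-server.example.com is a placeholder for your own host:

# Scan the first 1024 TCP ports of a host you own; every "open" port
# is a door that an attacker's bot would probe further.
$ nmap -p 1-1024 your-server.example.com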

Over the last few months, I have noticed multiple port scan attacks against my web sites from a user agent “masscan/1.0”. I dug a little and found this to be coming from an open source tool; the project is on GitHub:

robertdavidgraham/masscan

So it seems that some people have found this tool and are now randomly targeting web sites with it. To what end, I can't tell for sure. It is certainly reprehensible to poke at someone's doors without their consent; everybody knows this.

I've also noticed lots of attempts to run PHP scripts; they seem to be looking for phpMyAdmin. Fortunately, I don't run anything with PHP. If I did, I would harden it significantly and have it permanently monitored for possible attacks.

Most of the attacks on my web sites originate from, in this order: China, Ukraine, Russia, Poland, Romania, and occasionally, the US.

You don't need anything sophisticated to detect these kinds of attacks; your web server log is an obvious place to look. Putting a firewall in place is a no-brainer: just block everything except normal web site HTTP and HTTPS traffic. You can also invest more in tools, but then the question is whether you're not better off just hosting at a well-known provider.
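As a rough illustration of both points (the log path and tooling here are assumptions that will differ per setup):

# Count requests from the masscan user agent in a typical access log.
$ grep -c 'masscan/1.0' /var/log/nginx/access.log

# Rank the offending source IPs, most frequent first.
$ grep 'masscan/1.0' /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head

# A bare-bones Linux iptables policy: allow loopback, replies, SSH,
# HTTP and HTTPS, then set the default policy to drop everything else.
$ sudo iptables -A INPUT -i lo -j ACCEPT
$ sudo iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
$ sudo iptables -A INPUT -p tcp --dport 22 -j ACCEPT
$ sudo iptables -A INPUT -p tcp --dport 80 -j ACCEPT
$ sudo iptables -A INPUT -p tcp --dport 443 -j ACCEPT
$ sudo iptables -P INPUT DROP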

This is just one instance, and there are countless others, where even the dumbest criminals are getting their hands on tools to try and break into systems. Cloud hosting is getting cheaper all the time; soon it will cost nothing to host a program that can wander about the Internet unfettered. It is getting exponentially easier to attack web sites, while at the same time becoming an order of magnitude harder to keep sites secure.

I do see a glimmer of light: container technologies provide a perfect throwaway computing experience. Just start a container, keep it stateless, carry out some tasks and, when done, throw it away. Just like that. This may reduce the exposure in some cases, though it won't be sustainable for providing an ongoing, long-running service.
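A minimal sketch of that throwaway pattern, assuming Docker as the container runtime:

# Run a one-off task in a stateless container; --rm discards the
# container, and any state it accumulated, the moment it exits.
$ docker run --rm alpine sh -c 'echo doing some disposable work; date'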

IT security is a never-ending quest that is best left to dedicated professionals. I only check the web sites I run casually, and at the moment I haven't deployed any sensitive data on them. When I do, I will make sure they are thoroughly hardened and properly manned, most likely via a SaaS provider rather than spending my own time dealing with this.

Nice post: “Architecture in Context – Part 1”, by Charlie Alfred and Gene Hughson

Architecture in Context – Part 1 | Form Follows Function.

It's really important to remind folks about context, and this is a great take on it. The multidimensional aspect could easily be overlooked: namely, the environment and the time continuum bring about constraints that help frame the context, and stakeholders usually have no control over these other dimensions. When some of this is missing, endeavours may become disconnected from reality.

In my opinion, this isn't stated often enough; instead, a lot of debate tends to centre on definitions and the latest fashion.

Handy shortcut to keep Wi-Fi running on OS X Yosemite: restart the DNS resolver

A long time ago, while I was studying, I had a PC running Microsoft Windows 2 (yes indeed, Windows version 2). It came with a program called Write, which I used to type my homework and eventually my graduation assignment. The thing was so unstable, crashing so often, that I learned to press CTRL+S at the end of every line of text I typed, to be sure I didn't lose my work. The habit never left me. It wasn't until about 4 or 5 years ago, long after I had switched to Mac and no longer needed to worry about CTRL+S, that I finally lost the habit of instinctively hitting that key combination every few minutes.

I have an annoying issue with my Mac: it just randomly loses its network connection, and sometimes it won't connect at all for a few minutes. After a couple of updates that promised to have fixed the issue, it's still there. So I made this shortcut, a few very short bash aliases that I placed in my .bash_profile startup script.

# Unload the Yosemite discovery daemon, which handles DNS resolution.
alias down-discoveryd='sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.discoveryd.plist'
# Load it back in.
alias up-discoveryd='sudo launchctl load -w /System/Library/LaunchDaemons/com.apple.discoveryd.plist'
# Bounce the resolver: unload, wait a moment, then reload.
alias restart-network='down-discoveryd && sleep 3 && up-discoveryd'

One line would have been enough, but I wanted it a little prettier, so I made three. I added a small delay for good measure, though I think it could also be omitted. The only command I need to run is the last one, restart-network; I get prompted for the admin password, and the service is restarted. If the network is still not restored, I run it again, and again. After 2 or 3 attempts, I get my network connection back and can continue working.

I find myself using this shortcut very frequently. It has become my new CTRL+S. Unfortunately.

Cloud and Mobile, an incredible if not unprecedented opportunity for comeback kids

It cannot have escaped many software company executives that Mobile and Cloud represent possibly the best opportunity to reinvent a flagging business proposition. Surveys and studies by many household-name analysts, such as McKinsey and Deloitte, show as much. My post comes from another angle: a simple observation at a micro level that can be scaled up to a bigger picture.

Actually, I believe that Mobile without Cloud, or vice versa, would not have had any significant impact. So I contend that whenever Cloud is mentioned, Mobile is an implied companion, and the other way around as well. When the Internet of Things (IoT) finally dawns, Ubiquity will become the more appropriate concept and could encompass Mobile.

I was recently reminded of one pervasive reality of software: dependencies are the cause of a lot of trouble, if not most of it. I was trying to install a new version of Archi, a tool I am used to running but hadn't needed for a while. When it failed to run, I thought I could quickly just rebuild it. I fired up Eclipse and started going through the process of building Archi, then quickly stopped, realising the trap I was slowly being dragged into. What the industry terms legacy is largely due to such dependencies: in many situations, an insurmountable task of rewriting and rewiring code that is large, complex, hard to understand, and better left alone as long as it is working. In the case of Archi, I am not suggesting that any rewrite would be necessary; it is just the dependencies I would have to deal with before I could achieve my goal of getting it up and running.

As a quick comparison, take the recent, much-discussed move by Microsoft, who have turned to open source in a serious way.

Full Stack: open source stack vs. Microsoft’s

In the diagram, I name an arbitrary number of levels, or layers, just for this discussion’s sake.

  • Level 0: this is usually provided by hardware vendors, who make the hardware and then bundle it with software.
  • Level 1: the Operating System level. In the comparison, Microsoft provides this element, Windows for example.
  • Level 2: fundamental building-block elements, usually provided by the OS vendor. This is one area where Microsoft has open sourced the .NET Runtime Engine, for example.
  • Level 3: combines with Level 2 to make it easy to expose system capabilities to users. Level 2 and Level 3 tend to be closely integrated, and in many cases 3rd-party vendors provide the Level 3 elements.
  • Level 4: this is the part that the user will eventually experience.

In this discussion I've intentionally left out Apple, as they are known to control the end-to-end user experience in most cases. Apple do subcontract for parts and components, and embed 3rd-party and open source software, but they control the packaging process. That makes them a less interesting study for this post; I focus on the Microsoft stack vs. the open source stack.

I've simplified reality quite a bit in this diagram; there are usually multiple layers between Level 2 and Level 3. This is where most of the dependency troubles lie. For a developer this could be a tool chain, but it could also be libraries, utilities and other 3rd-party software elements. This is where the trouble with multiple Windows DLL versions typically occurs. Such dependency issues are not confined to Windows; I've seen them with plenty of open source Linux software too. The issue arises when, for example, you are dealing with a Level 3 component and it complains about some Level 2 element that you were not aware of or didn't expect to be dealing with. That's the can of worms.
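You can get a concrete taste of this from the command line: ldd on Linux (or otool -L on OS X) lists the shared libraries a binary is wired to, and any entry marked “not found” is exactly the kind of hidden lower-level dependency described above.

# List a binary's shared-library dependencies.
$ ldd /usr/bin/curl        # Linux
$ otool -L /usr/bin/curl   # OS X equivalent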

By moving important elements to open source, Microsoft would normally have had to do a lot of work at Level 2 and Level 3 to bring their open source proposition up to the level where Unix/Linux is. But with Cloud and Mobile, this work is not necessary: a wide variety of platforms and legacy systems can simply be skipped. This is also perhaps the best opportunity for Microsoft to actually ditch support for a lot of systems that won't need to be directly exposed to the user.

Cloud providers and infrastructure providers offer solutions that combine Level 0 and Level 1. Lots of players have emerged in recent years, providing up to Level 3 either by themselves or by procuring Level 0 and Level 1 from others. Large players like Amazon, Microsoft and Google provide up to Level 3, and in some instances they offer solutions all the way up to the end-user level. Examples abound; I don't want to stretch this discussion too much.

Ditching Level 3 elements and rewiring your Level 2 elements for Mobile and Cloud: that is where the opportunity mainly lies. The necessity of revising and rewiring systems should be a profitable source of business for those who can tap into it.

Unfortunately, companies that cannot figure out exactly what is going on at the core of their business will struggle to even see the opportunity. The situation is akin to old Hollywood blockbuster movies such as Die Hard, where it was important to cut the right cable. The comeback opportunity is about separating what should be dropped from what should be kept, then streamlining the business for Mobile and Cloud. Newcomers, who never even built such Level 2 and Level 3 elements, have the freedom to move swiftly along with the market.