Platform Indifference

In 25 years, I’ve used a lot of technology. In fact, more than I can remember without really sitting down to think about it. It’s curious, then, when the rare and obsolete question of “Mac or PC” comes up.

In these many years, I’ve seen a lot of different operating systems on a lot of different computer systems. Engaging in the “Mac or PC” debate is a myopic exercise – after all, those two are only relevant in the consumer sphere.

There are many operating systems out there that control each and every device imaginable. For some time now, most devices in our homes and offices have had an underlying operating system of some kind. There are so many now that I won’t even try to catalog the range of what I’ve seen (and suspect, without forcing a fatal reveal). Your blender, Echo Dot, Fire TV Stick, computer, internet router – just about everything – probably has an underlying OS that runs the device.

In the early days, there were few of these OSes – probably because they required so much hardware and so many resources to run. Over time, many of these OSes have been optimized to run on very little hardware (or virtual resources), so that they can run very small workloads with more ubiquity across the broad spectrum of devices.

Desktop computing is in a weird and stagnant place. It’s not growing and it’s not really shrinking. So when someone makes the leap from PC to Mac or vice-versa for their desktop computer, it’s frankly no longer noteworthy. It’s akin to asking whether you like vanilla or chocolate – whereas even five years ago, the debate was only ever won with “I have no choice,” and was largely about which form of torture you could best endure.

Fewer and fewer technologies rely solely on the desktop to be relevant. Sure, it’s early days for this transition, but we know that mobile is making huge inroads – most notably in our personal lives. Any software company worth their salt is taking notice and either developing for mobile or at the very least considering its place on their roadmap.

The more interesting and provocative debate today is browser and mobile platforms. These are debates that the average Joe can engage in now more readily. I suspect few are interested, however, as there’s more interesting fodder to debate elsewhere in the consumer distraction dome.

So, the fact that I have switched over to Mac for my personal and work computing is irrelevant to most. Who cares? I could use either platform. I don’t know MacOS as well. Perhaps that’s why I switched – it was time for a new challenge to go along with my new job.

Migrating File Servers to Dropbox, Box, Google Drive or any other Cloud Storage Service

Some of my customers love their Dropbox/Box/Sharefile/Sharepoint/OneDrive as a file server replacement.

Others, not so much; they’ve gone back to their trusty file servers after a failed leap.

Many would like to have it, if it weren’t for the other issues that it introduces in more complex environments.

It’s easy to believe the hype that offices with fewer than 10 employees can be moved to cloud storage easily. Sadly, I’ve got multiple offices of three users that can’t use cloud storage due to their applications and workflows.

So, before you head to the cloud, consider the following points carefully:

  • Make sure your apps are supported with your specific cloud storage app. Many applications cannot work reliably with how Dropbox, OneDrive and Box Sync interact with the local file system. SketchUp, Photoshop, InDesign and countless others can barf everywhere (i.e. conflicted copies, failures to save, corrupted files, etc.) when trying to save to cloud storage.
  • Bandwidth will get gobbled up. Make sure you have a lot of it if you have lots of people in an office using the file sync. An office of 50 creative workers we support has 300/300 Mbps fiber, and users still complain about file open/save times.
  • Get used to delays between when users save and when other users can see the changes. It will never be instant like it is with an On-Premises file server.
  • Security granularity is greatly reduced. Need to allow certain users the ability to Add to a folder, but not delete or modify? Most don’t support this ‘edge case’ and countless others.
  • File Locking is rarely a feature. Last-to-save-wins. Even with good version control, this WILL cause your users problems if they are used to File Servers.
  • For “sync” style solutions, you must have local storage available on the local workstation to sync all the files they will access via the local sync (the quickest and most comfortable way for users to access files). If you have large libraries of files that users “must have access to at all times”, prepare to add storage to workstations and laptops to hold it.
  • Some solutions are starting to offer “streaming” options where files are streamed to your computer from the cloud on demand rather than kept as a local copy. Still in their infancy, some of these are really good and others are still buggy (Google Drive). Large files remain their Achilles heel if you have limited bandwidth.
  • If you’re using any server-side solution to index, scan or otherwise interact with your file server on the backend – these will not likely work with cloud storage applications.
  • Service levels vary incredibly. We have at least one customer who lost untold amounts of data as a result of the provider releasing defective code into production. Their response to the customer was to suggest they ‘keep an eye out for corrupted files and restore them from backup manually as they are found’.
  • You need DR. Never depend on a single cloud vendor to fully protect your data. The bigger they are, the less they care if you have an issue, and they are on the hook for, at most, what you spend on their service in a given period. The 3-2-1 rule – three copies of your data, on two different media, with one off-site – is as relevant as ever, perhaps more so.
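
The bandwidth and sync points above lend themselves to a quick back-of-the-napkin estimate. Here’s a minimal sketch; the 300 Mbps line echoes the office mentioned above, but the file size, user count and even-split assumption are invented for illustration:

```python
# Rough estimate of how long one file takes to sync over a shared office link.
# Assumes the link is split evenly among users actively syncing (a simplification).

def sync_time_seconds(file_size_mb: float, link_mbps: float, active_users: int) -> float:
    """Seconds to move file_size_mb megabytes at an even share of link_mbps."""
    per_user_mbps = link_mbps / max(active_users, 1)
    file_size_megabits = file_size_mb * 8  # megabytes -> megabits
    return file_size_megabits / per_user_mbps

# A hypothetical 500 MB design file on a 300 Mbps line:
print(round(sync_time_seconds(500, 300, 1)))   # ~13 seconds with the link to yourself
print(round(sync_time_seconds(500, 300, 25)))  # ~333 seconds when 25 users are syncing
```

Real sync clients add protocol overhead, throttling and chunked uploads, so treat numbers like these as a lower bound on the waiting your users will actually experience.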

The Data Silos are Back

Ten years ago in AEC technology, we were talking about trying to find ways to get data out of ‘silos’ – that is, storage and dissemination systems that didn’t integrate with each other. Files and data were strewn about in different formats and couldn’t be easily told to play nice.

Over a very short period of time, many firms tackled the issues with silos – some even claim to have solved them. An ever-increasing portfolio of data and systems interact with each other quite well. That was before cloud-based-everything was the new normal.

As we migrate an increasing amount of data into disparate cloud systems in the name of cost, convenience and effectiveness, we’re creating new silos in these clouds that don’t know about each other. We’re back to keeping tabs on where we put everything for a project, rather than looking through a single folder structure on a file server and being able to ‘discover everything’.

Maybe that’s an okay problem to have. After all, AE firms are paid based on what they deliver – not what they can archive.

It does – however – call into question: What is the role now of Archiving and the Project Record?

Some food for thought.

But what does it cost?

You can look at a carton of eggs and do the math – $3 for a carton divided by 12 is a quarter per egg. Better yet, eggs are sold by the dozen, so the math is already done for you. Your effort as a consumer to decide which carton to pick comes down to just a few variables – Grade, Color, Free-Range, “Low Cholesterol” – most of which are meaningless attempts to differentiate the product anyway. In reality, the products are nearly indistinguishable from one to the next.

Buying eggs isn’t a solution to anything. It’s part of a solution with the goal of feeding ourselves. That solution is known as breakfast. What it costs for me to actually make breakfast is as unique to me as it is to anyone else. My kitchen, electricity, gas, pans, method – these all have unique costs associated with them. As a system, the cost is complicated.

So why is it when we are investigating a solution to a complicated problem, our first question is always what does it cost?

Because it’s easy. Asking what it costs lets you decide whether you can do it or not. It justifies rejection based on a simplistic expectation of price. Too much money? NEXT!

What happens then if we don’t let ourselves off the hook that easily? What if we instead dig deeper and ask more meaningful questions?

What is the problem we’re trying to solve?

What is it worth if I solve this problem?

How do we qualify a solution as being viable?

Cost is always past tense.

Caught in the Middle: Subscription Software

Software Subscriptions can feel like a vampire. They lurk on your OpEx ledger, claiming a value proposition for each month, hour or mile you use them. Gone are the opportunities to squeeze extra value beyond the design intent of the software. You can’t stop paying for them in a slow cash-flow month if you still need to use them. That feeling that you “don’t own” anything starts to sink in. The fear that if you leave it will cost more to find something new starts to materialize. Hiding under your desk sounds like a good idea.

Welcome to the new age of Subscription software. It’s everywhere now. It’s not going away either. Here’s why:

Customer loyalty is only skin deep

In hopes we would spend lots of money with them, software companies often gave deep discounts to get us in the door. In many cases, that initial investment would be the last money the customer spent with the software company for years. It took a lot of customers paying a negotiated rate for software to make a profit. Remember that the people who code these solutions routinely command $100k+/year salaries, and executives many times that. Add on the need to have a sales force in the first place (which isn’t cheap either) and you have a steep mountain to climb to profitability as a software company.

Piracy was rampant

It’s easy to pirate software that’s perpetually licensed and use it for as long as you please. Just ask Adobe, Autodesk and Microsoft. It was so bad that, to combat it, they created the Business Software Alliance to go around and threaten to sue companies in high-risk verticals if they didn’t submit to an audit. Many mid-sized and small companies were audited, and most were found to be out of compliance to some degree, though it’s unclear how bad it really was.

Upgrades aren’t always enticing

Many software companies would charge about 2/3 the original cost of the software to ‘upgrade’ to the latest version. Some could justify the cost if the value proposition (new features, typically) meshed with their business. Smaller businesses learned that they could skip versions or never upgrade and be fine. This meant that a new release might only capture a small percentage of existing customers. If it didn’t entice new customers, the new release could be a complete flop. Software companies learned that this was not working consistently. After investing sometimes millions in a new release, new revenues from it might only be a fraction of that.
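
The upgrade economics above are easy to sketch. Here’s a hedged comparison of cumulative spend under the two models; every dollar figure is invented for illustration (the upgrade price mirrors the roughly 2/3-of-original ratio mentioned above):

```python
# Cumulative cost over time: perpetual license with periodic paid upgrades
# vs. a flat subscription. All prices are hypothetical.

def perpetual_cost(years: int, initial: float, upgrade: float, upgrade_every: int) -> float:
    """Initial purchase plus an upgrade fee every `upgrade_every` years."""
    upgrades = (years - 1) // upgrade_every if years > 0 else 0
    return initial + upgrades * upgrade

def subscription_cost(years: int, per_year: float) -> float:
    """Flat annual subscription, no upgrade fees."""
    return years * per_year

# A $3,000 perpetual seat with a $2,000 upgrade every 3 years vs. $900/year:
for years in (1, 3, 6, 10):
    print(years, perpetual_cost(years, 3000, 2000, 3), subscription_cost(years, 900))
# Early on the subscription looks far cheaper; over a decade the two converge –
# and a customer who simply skips the upgrades pays far less, which is exactly
# the revenue problem vendors faced.
```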

Support costs are high

All software depends on other software for its reliability (operating systems, drivers, databases, etc.). When there are at least as many deployments of your software as you have customers, the cost of support goes up. One of the primary reasons is that each deployment has its own set of risks and its own potential for supporting software to have issues. In most cases, you’re left holding the bag as a software company to prove to the customer that the issue isn’t your software. In major cases where a supporting piece of software takes yours down, you typically have to release a fix at your own cost. It can get expensive fast.

Agile Emerged

Some very smart folks came up with a manifesto for developing software better. Known as the Agile Manifesto, it advocated, among other things, releasing new versions frequently, each with incremental bug fixes and features, and little to no upgrade labor on the customer’s part. Software companies that adopted Agile quickly realized they produced a higher-quality product as a result. Since no customer will pay for an upgrade that includes only minor new features or bug fixes, it didn’t take long for early Agile adopters to completely abandon perpetual licensing.

What can be done?

Software companies are trying to improve their businesses just like everyone else to be relevant into the future. Your job now as a Business Manager, CIO, CTO or CEO is to evaluate how you will manage this transition to “Pay as you Go”.

Communicate with software vendors

Find out what your software vendors intend to do in the future. Are they moving to a subscription model? What’s their roadmap? What will it cost? What’s their emerging value proposition? The worst thing you can do is sit idle and wait for things to happen.

Plan on a hybrid environment for a long time

Some of the perpetual software you have in your datacenter now probably has no subscription based replacement yet. Plan on maintaining these applications using as much virtualization as possible – even cloud based virtualization if it pencils out (note: it doesn’t always make sense to virtualize in the cloud).

Hire a Business Technology Consultant

Since your IT consultant isn’t going to be doing much for you anymore, it’s time to find a good Business Technology consultant to help out. A Business Technology consultant is going to help bring new systems in and manage them with you under this new model. A good Business Technology consultant should specialize in your particular line of business. Most of what they are going to be doing is matching technology, workflows and process to your business – not adding equipment to your server room.

Shift your technology spending

Take a look at your budgets and accounting classifications of technology spending. Most of what you will spend on technology in the future is pure OpEx – very little will still qualify as CapEx. Check with your Business Technology and Accounting professionals to make sure your books and budgets are adjusted to correctly accommodate this shift.