How to NOT be a Terrible Software Executive: An Anecdotal Study.

I began my computer programming career roughly 20 years ago with a trial-by-fire in the smoldering ashes of Control Data.  Control Data was once as much a giant of the industry as Microsoft and Google are today, and Minneapolis was a nucleus of the tech world.  When Control Data went under, it gave birth to a number of spin-offs, including the company that I worked for, PLATO Learning.  As a young 18-year-old kid, I learned a lot from the former Control Data engineers.  I ate lunch with them every day.  I picked their brains.  I heard their stories about the dawn of computer science, Star Trek, and … cats.

One thing about Control Data programmers was certain… they were not intimidated by real programming tasks.  They were around when computer programs were written on punch cards. Many had been working on the same product lines for 28 or more years.  The PLATO learning system was a platform that pioneered some of the very first multi-player games, and featured the very first plasma display to ever go to market.

These guys were absolutely not intimidated by challenges.  I began my career in 1995.  The guys liked me because… well… how many kids born in 1977 could say they had programmed a TRS-80?  In 1995, Internet email was just finally becoming standardized, away from proprietary systems like Prodigy and America Online.  SQL servers were a relatively new thing.  Most people paid for dial-up internet on a per-minute basis on 2400 baud modems.  If you were lucky you’d get 9600… or if you were really super lucky… 56k.

The guys I worked with were database programmers.  They knew how to make databases, and when I say “make databases”, I don’t mean that they knew how to make a relational database using MSSQL or Oracle or MySQL… they knew how to MAKE MySQL.  I worked with one guy who had contributed code to every SQL server on the market at the time, with the exception of Oracle.  In a nutshell… a database server is really no big deal, and in fact, I have major reservations regarding many aspects of SQL.

These guys were smart, and if you clearly specified what you wanted, they could engineer code to fulfill virtually any task you would have for them.  It was pretty exciting, because at PLATO, I felt like, in many ways, we were on the verge of some new cutting edge technologies.  The internet was connecting people and educational institutions in such ways that we had a real opportunity and muscle to become a world standard in educational software.  I went to my job every day with enthusiasm, excited about the things that I was designing and excited to work with the people I worked with.

Where is PLATO now?  Well.  They nose-dived, became privately held, and were renamed “Edmentum”… or maybe they were bought out, I don’t really know… I guess in some ways you might consider them failed, a faint shadow of their glory years, but my reports from the few friends I have left there are that they’re making a comeback.  Some blamed their nosedive on 9/11, which caused the US Government to shift focus away from education and onto terrorism and military spending.  Certainly, PLATO’s stock tumbled immediately following 9/11, but it was struggling to keep up before then.

From an engineer’s perspective, though, I have thought about it, and I pin the blame for PLATO’s past failures squarely on the shoulders of the executives.  It was entirely executive decision making that sent PLATO down the wrong path at the wrong time.

I’ve thought about it for many years and came up with this list so that you, as a software executive, can maybe avoid the mistakes that PLATO made.

1) Listen to your engineers.  If your veteran engineers tell you that something is a really bad idea, then it SHOULDN’T be done.

One of the biggest mistakes made at PLATO, in my opinion, was the decision to convert their entire library of educational content to be “web enabled”.

I’m sure many of you are thinking I’m a fool.   After all… wasn’t the world trend in 1995 towards Web apps and web-everything? Wouldn’t it be totally foolish for PLATO to pass on such an opportunity to join the web bandwagon?

Well, I carefully used the word “web” and not the word “internet” in my statement.  The “World Wide Web” and “the internet” are two very, very different things.  The web is really just the part of the internet that refers to content delivered in a web browser with HTML and hyperlinks, riding on top of the HTTP protocol.  There are thousands of protocols that run over the internet, and HTTP is just one of them.  HTTP is just a poorly conceived text-based protocol that involves sending a little bit of text over TCP port 80 to a server, which issues a little text reply followed by whatever file it was that you requested.  It rides on top of TCP via port 80, which is just one port among thousands available to software engineers to pump data.  All of that runs over IP, the “Internet Protocol”, and IP runs on whatever local network protocol you happen to be on, typically Ethernet (802.3) or wireless (802.11) these days, but others existed and were popular in the past, such as IBM’s Token Ring and Apple’s LocalTalk.
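To make the point concrete, here’s roughly what “a little bit of text over TCP port 80” looks like.  This is just an illustrative sketch in Python (not anything from PLATO), speaking HTTP by hand over a plain socket:

```python
import socket

def raw_http_get(host: str, port: int = 80) -> bytes:
    """Fetch "/" from a server by hand: a few lines of text over a TCP socket."""
    request = f"GET / HTTP/1.0\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            chunk = sock.recv(4096)
            if not chunk:  # server closed the connection; response is complete
                break
            chunks.append(chunk)
    return b"".join(chunks)
```

That exchange is the entire foundation of the web; everything a browser adds on top is parsing and rendering.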

I think that if PLATO had moved to a pure form of “internet” content delivery, then they would be the undisputed champion of online education right now. Pure Internet delivery is what I and the other engineers urged them to do… but that is not what they did.

Is HTML in a web browser really the best place to put educational content, including tests, multimedia, courseware, and content management?  Is it the best place for it in this decade?  Was it the best place for it in 1997, when Internet Explorer and Netscape Navigator were fighting to be the dominant browser on the market?  Absolutely, definitely not!  It wasn’t right in 1997 and probably still isn’t right even today with HTML5 and all that nonsense.

There are all kinds of things that run on the internet that don’t use HTML or HTTP; some of them don’t even use TCP, for that matter.  There’s SMTP, which delivers your email; POP3, which gives you access to your mailbox; FTP for transferring files; IPsec for building virtual tunnels between your office branches; Skype for video conferencing; and there’s nothing stopping you from simply opening up a port and pushing bytes across the internet via whatever methodology best suits your application… as long as you don’t rely on a web browser for those things.  Web browsers require you to play within their little sandbox… their own little paradigm.
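Rolling your own protocol really can be that simple.  As a hypothetical illustration (not anything PLATO shipped), here is about the smallest wire protocol you can write, length-prefixed messages that could be pushed over any open TCP port:

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix a message with its 4-byte big-endian length."""
    return struct.pack(">I", len(payload)) + payload

def unframe(stream: bytes) -> list:
    """Split a received byte stream back into the original messages."""
    messages, offset = [], 0
    while offset + 4 <= len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        messages.append(stream[offset:offset + length])
        offset += length
    return messages

# Two messages survive a round trip through the "wire" intact:
wire = frame(b"score=97") + frame(b"lesson=algebra-2")
assert unframe(wire) == [b"score=97", b"lesson=algebra-2"]
```

Twenty lines, no browser, no sandbox, and the application decides everything about how its data moves.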

So as an executive, when your engineers recommend a technical compromise that will help you achieve your business goals, you probably shouldn’t overrule them and make their technology decisions for them.

The conversation went something like this:

PLATO: “We’re going to bring educational software to the Web!”

ME: “You mean, ‘The Internet’, right?”

PLATO: “No, we’re going to deliver it all in a Web Browser!”

ME: “Web browsers do not possess the features and capabilities we need to be able to do this.  I really recommend that you don’t.”

PLATO: “Make it work anyway!  We’re going to do the web because everyone else is doing it!  The web is the future!”

ME: “Uhhh… it won’t work.  I’d recommend a more technologically sound solution.  How about a pure internet delivery mechanism?”

PLATO: “But if the courseware is in the Web browser, then our customers won’t have to download anything, which is a big selling point.  We want our customers to have full access to our content online!”

ME: “Well, you will have to download stuff… lots of stuff, including 80MB worth of plug-ins on every computer (which in 1997 was quite a bit to download).  If you have to install plug-ins to accomplish your task, then you might as well just install binaries outright and have full control over the end-user experience.  Plus there will be all kinds of ways that we won’t be able to stop deviant children (and convicts) from doing things we don’t want them to do in the web browser.  This technology is just not suited for this task.”

PLATO: “Aw well, who cares if it works… as long as we can say that it’s on the web.  We’ll be much cooler being on the ‘web’ than if we actually have a product that works… so… we’re going to the web.  Can you have it working in 2 months?”

Obviously that last comment was an embellishment for entertainment purposes, but the rest of that conversation was more-or-less exactly how I remember it.

All the veteran engineers were in agreement: the web was a terrible delivery system for educational content.  What would have really worked would have been to build a client app that could run the content in a strictly controlled manner, would have full access to the hard drive, and could download and install any required plug-ins without any user intervention… running Flash, Shockwave, and other content in ActiveX controls, or as simple EXEs when needed.  We also would not have been limited by the capabilities of the web browser… we could run full-screen, stay on top (so kids couldn’t switch tasks to other anti-productive things), take advantage of 3D graphics, play sounds, record videos, prevent unauthorized access to protected files, prevent hacking, and prevent use of the browser’s “back” button or secret hotkeys (like Alt+Left).  We could run on computers that were locked down with special anti-tampering software and take full advantage of caching in our own way.  We wouldn’t care what your favorite browser was or whether you had JavaScript and cookies disabled… there would be no browser… we would be the browser.  In fact, I made our early user interface prototypes look and operate just like a web browser to offer a familiar feel to it all.  But they wanted a standard web browser.  In my opinion, this restriction sank the entire product line.

If they had done it my way, it would have just worked and been far, far less expensive to maintain.  Our customers would have been happy, our product would have been stable.  Instead, it was the death of us.

Setting the ball in motion down the wrong path, we were forever plagued with a constant mess of dealing with plug-ins… plug-ins for Netscape… plug-ins for Internet Explorer… plug-ins for Safari.  Our customers got angry about all the downloads, so we started mailing them CDs with the plug-in installers burned onto them, just like the old days… mailing CDs to customers in the age of the internet.  At times entirely outside our control, Adobe would release a plug-in update and it would disrupt the experience for our users… and mind you, our users were 10-year-olds in a computer lab that was supposed to be tightly controlled by a member of the school staff, so it was entirely possible for their classes to be disrupted by a new Flash update.

Our code turned into a complete mess.  Schools didn’t like the idea of enabling cookies in the browser.  They didn’t like enabling JavaScript either.  It was a configuration nightmare, and the already limited power and capabilities of the web browsers of the day were constrained even further by our inability to use them to their full (if limited) potential.

HTTP and TCP were a problem as well.  TCP was never designed for use at the scale of the internet, particularly due to the optional nature of keep-alive packets, which can lead to half-open connections (which conveniently only become a problem after your app goes live on the web).  I’ll be covering that debacle in depth separately.
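Keep-alives really are off by default, and the OS defaults, once enabled, can wait two hours before probing.  A sketch of the kind of defensive socket setup you end up writing (the tuning options shown are the Linux names; other platforms differ):

```python
import socket

def make_keepalive_socket() -> socket.socket:
    """A TCP socket that probes idle peers instead of trusting them forever."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # Linux-specific knobs; without them the default idle period can be hours.
    if hasattr(socket, "TCP_KEEPIDLE"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60)   # seconds idle before first probe
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 10)  # seconds between probes
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)     # failed probes before the OS gives up
    return sock
```

Every long-lived server connection eventually needs something like this, or an application-level heartbeat, to avoid accumulating half-open connections.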

2) If your veteran engineers tell you it can’t be done, then seriously… IT CAN’T BE DONE.

Another point of tension: the database.  SQL wasn’t a totally new thing back then, but it was fairly new.  And anyone who had played around with the early SQL servers knew that they were pretty spendy… both in terms of price and in terms of the CPU power required to run them.

I took one look at Oracle and MSSQL and immediately realized that they were not going to work for us.  Oracle was scalable, but certainly no speed demon.  It scaled well and was trusted by many institutions that had lots of money to spend on it, but I also noted that the institutions that spent $150,000 on their Oracle servers spent a lot of money on the DBAs to maintain them as well, and the higher-ups at PLATO wanted to give away my software for free and then up-sell the clients on the courseware.  I figured the “free” price tag immediately trumped all other requirements… and $150,000 was far from “free”.

Oracle required maintenance.  Lots of it… apparently enough for a full-time DBA.  On top of it all, Oracle could only deliver the speed and scalability you sought if you used it the way it was designed to be used.  Frankly… it was designed for accounting departments, and its design did not suit every purpose.  90% of the programming jobs out there at the time were boring accounting jobs… but ours wasn’t one of them.

None of the engineers wanted to use an off-the-shelf SQL server.  We had been doing our preliminary designs with a 3rd-party relational database that was not SQL, and we planned to build a simple server app that our client app would talk to, to ensure that data integrity was maintained at all times.  We knew that if we were in control of our own file-based database we’d see a performance boost on the order of 10,000:1 (seriously), and we knew that it was the only way we were going to be able to walk the database efficiently with regard to a few tough requirements that affected our table relationships.  The sticking point was a type of relation that no SQL database on the market could handle… in fact, I don’t think there’s a SQL database on the market that can do what we needed today, let alone in 1997.

The database design called for a multi-parent, multi-child tree structure (what graph folks would call a directed acyclic graph).  As opposed to a single-parent, multi-child tree structure (as most trees are), any node in this tree could have any number of children and any number of parents.  Oracle was the only database at the time that could do tree queries… however, Oracle could not do this kind of tree query.  The only way around it was to perform a whole bunch of smaller queries, and Oracle virtually ground to a halt processing hundreds, even thousands, of these tiny queries.  The way our scoring system had to work (based on the marketing requirements) involved rolling up and down the trees to give students scores against multiple alignments tied to teaching standards.  The engineers unanimously agreed that it just wouldn’t work and shouldn’t even be attempted.  We warned them… they overruled us.
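To see why this trips up a relational engine, here’s a hypothetical sketch of the structure in question (all names invented for illustration): because each node can belong to several parents, a roll-up has to walk every ancestor path rather than one clean hierarchy, which is exactly what a single tree query can’t express:

```python
from collections import defaultdict

class CurriculumGraph:
    """Nodes with any number of parents AND children: a DAG, not a tree."""

    def __init__(self):
        self.parents = defaultdict(set)  # child -> set of parent nodes

    def link(self, parent, child):
        self.parents[child].add(parent)

    def rollup_targets(self, node):
        """Every ancestor a score recorded at `node` must roll up into."""
        seen, stack = set(), [node]
        while stack:
            for parent in self.parents[stack.pop()]:
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

# One quiz aligned to two different teaching standards:
g = CurriculumGraph()
g.link("StandardA", "fractions-quiz")
g.link("StandardB", "fractions-quiz")
g.link("MathCurriculum", "StandardA")
g.link("MathCurriculum", "StandardB")
assert g.rollup_targets("fractions-quiz") == {"StandardA", "StandardB", "MathCurriculum"}
```

In a purpose-built store, that traversal is a few pointer hops; in 1997-era SQL it meant a separate query per edge, which is where the thousands of tiny queries came from.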

“If we’re going to be a serious company, we need a serious database,” they said.

“Just because something is overpriced, doesn’t mean that it is serious,” I countered.  “We have the brains to make this work, but no SQL server is going to make this work.”

Years of cussing and all-nighters followed, trying to get SQL to do things that we simply knew it wasn’t cut out for.  I left the company before they ever managed to get Oracle to do the multi-parent, multi-child roll-ups with any kind of respectable performance, and the MSSQL performance was passable only at very low user counts.  As I walked out the door on my last day, I told my boss, “There’s going to be a mass exodus of executives on the way.”  Two months later, several executives within the company were forced out by the board of directors.

3) Don’t be too chicken to create something amazing.

The fundamental question you should ask yourself is: Is Google where they are because they followed industry trends, or because they defined them?  The industry is defined by trend setters, rarely trend followers.  But if you’re going to set a trend, make sure that what you’re doing is too difficult for anyone to easily copy.   Create barriers that are difficult for your competition to break through.

Maybe you’re the kind of person who reads tech magazines to pick up on some of the development techniques that other companies are successfully employing.  You do your best to emulate them, using the same tools, buzzwords, architectures.  Maybe because of this you feel like you’re well informed and a good executive, but I say that when you read the tech magazines and blogs, what you should really be asking yourself is “what are they doing wrong and how can I dominate them?”

What is your secret weapon?  To be a company that mows over the competition, you need to be able to lead, not follow, the industry… so get some guts and build something that is difficult to build for the right reasons.  The more difficult it is to build, the more difficult it will be for your competition to emulate it.  Don’t be scared to write things from scratch.

There’s no way in hell that Google powers their search database from an Oracle server… there’s just no way.  Think about that.  Google built a highly customized storage engine from simple parts and hundreds of thousands of servers (some reports put the number of servers Google owns and maintains in the millions).


4) Just because the rest of the industry does it one way doesn’t mean that you should do it the same way.

So many companies are too quick to pull the trigger on some new web app.  They run their internal operations on HTML/Javascript.  They seem hellbent on using the most thoughtless designs and cumbersome development tools to get the job done.

To that I say, “If you hadn’t started building your app on a horrible foundation to begin with, then you wouldn’t need to be rescued by the horrible hack that is AJAX, nor would you need to be excited about a rehash of something that was ill-suited for the task at hand 15 years ago and is still ill-suited today.”

“HTML5 is the future!” says the industry.

On the contrary, here’s a quote from Mark Zuckerberg, founder of Facebook, explaining why Facebook was stumbling in 2012: “The biggest mistake we made as a company was betting too much on HTML5 as opposed to native … It just wasn’t ready.”

It is 2014 as I write this… is HTML5 ready yet?  Maybe you’ll argue that it is “good enough” for your purposes, but the thing is that real brick-and-mortar technology to meet your requirements has always existed.  Chances are that your company is just too chicken to invest in it.  And if, for some reason, the technology to get your job done does not exist yet, then I say that someone, maybe you, should just go ahead and invent that technology.

I would go as far as inventing a new programming language or compiler if no existing language or compiler suited my purposes.  Software executives are too quick to get cold feet if what they’re tasked with accomplishing actually involves some real work and real innovation.  The pitfalls of sticking too closely to slow-moving, poor standards such as C, JavaScript, and HTML are felt every day in the industry and, in my opinion, cost the industry billions in lost productivity every year.  All of these tools are terrible for their intended purposes, and as technology grows, these tools only become more and more cumbersome… yet somehow they are widely adopted.

On the other hand, one of the most exciting new open-source developments I have come across recently is a fabulous operating system called Cosmos, written entirely in C# plus a brand-new type of assembly language that they call “X#”.  Start with brilliant foundations, namely a brilliant programming language such as C#, and you’ll save yourself time and money in the long run and be at the forefront of innovation.  Cosmos, I predict, will be a real contender for virtually-hosted operating systems in the future, but it may never make it to the desktop or laptop due to hardware driver limitations… the virtual server market is big already, though, and that may be the only place Cosmos needs to exist.

5) Understand what it is that you’re building.

I worked on one particular project recently which involved gathering data from sensors.  The data gathered from the sensors was of limited usefulness, and though I did everything I could, I simply had to concede that the data provided to me could not be used for the purpose they asked me to use it for.

THEM: “But people take on these kinds of tasks all the time, there are lots of studies in universities going on right now around ‘machine learning’, let’s hire someone from the University to look at the numbers.”

ME: “You can do that, but I’m telling you that math is math, and you’re just wasting your money.  I need better data.”

I was, of course, overruled, despite my articulate attempts to convince them that the truth I found in the math was absolute… we hired the University to look at our data, which I had already concluded was a lost cause.

We spent 6 months parsing this data through some algorithms that some PhDs had put together for analyzing data.  We used MATLAB and Weka Explorer to run the data through several fancy-pants algorithms.  Ultimately we wanted to know the answer to one simple question: “Do you see a finger on this surface, or not?”

The executives in charge made the false assumption that general-purpose machine learning algorithms could be applied to this mess of data verbatim, and that with enough iterations and trials the machine would eventually learn what the data should look like when a finger was on the surface.  We ran the algorithms over and over… and they all claimed that they were 99% accurate… however, the results were unusable.  99% accuracy, for starters, was not good enough.  We needed 100% accuracy, and in practice, I can tell you that it wasn’t really even 99% accurate.  It may have been 99% accurate against all the data that was collected, but it wasn’t 99% accurate against all the possible data combinations that the real world would throw at it.

I summarized the problem as being one of “dimensionality”.  Through simple mathematical analysis, we could safely determine that there was no feasible way to use the data this way.  Say, for example, you have one byte that represents a decision.  With just one byte, there are 256 different possible values that could potentially affect the decision being made, and the machine would pretty easily be able to determine which of the 256 numbers corresponded to “yes” and which corresponded to “no”.

But, instead, let’s go from one byte to two.  Now there are 256×256 (65,536) possible inputs that would need to be considered… starting to look a little trickier, but possible, assuming that the data was a perfect representation of the finger states (which it wasn’t).

Two bytes = 16 bits, and each bit you add doubles the complexity of finding the result.  We had up to 28 bytes for each sensor position, and 42 sensor positions that were all unique.  Even with just a single sensor, we were up against 2^224 possible combinations (2 to the 224th power).  If you’re not one who cares much about math, I’ll tell you that that is a 68-digit number, far too large to fit in a 64-bit CPU register.  Google’s calculator only knows the rough answer: 2.6959947×10^67.
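For the curious, any arbitrary-precision integer type will happily produce the exact value; a quick sanity check in Python:

```python
# The search space described above: 28 bytes = 224 bits per sensor reading.
combinations = 2 ** 224  # Python integers are arbitrary precision, so this is exact

assert len(str(combinations)) == 68             # it really is a 68-digit number
assert str(combinations).startswith("2695994")  # matching the 2.6959947e67 estimate
```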

At this level we might as well have been searching for the Higgs boson and building a 17-mile underground research lab.  At best we were able to get the computer to make decisions as to whether it was “probable” that a finger was there, but really, we needed binary accuracy.  It needed to be correct 100% of the time.  What if your computer could only detect a mouse click with 99% accuracy?  You’d probably go crazy, right?  It would miss 1 out of every 100 clicks.

Without first summarizing this data and preprocessing it into something more friendly, there was just no feasible way this thing would ever learn anything of value… and even after the data was processed to be better suited to machine learning, I still figured we could have the computer capture data for an entire year and it still would not learn the proper answer.

The moral of this story is: don’t trust an algorithm to do your job for you unless you absolutely know for certain exactly how that algorithm is processing your data.  Furthermore, if you’re well-informed enough to understand how that algorithm works, then you might as well just modify that algorithm to suit your needs or invent your own custom algorithm inspired by it.  I think that when Universities teach students about these algorithms, what they should really be doing is teaching them the nuts and bolts of how and why someone felt the need to craft such an algorithm in the first place, and therefore how a crafty student could apply new, similar, custom algorithms to future problems.  Such algorithms should be treated as methodological inspiration, not universally useful tools.

Finally, if you don’t understand what it is you’re building, then step aside and trust the people who are working most closely on the problem to do their jobs.

6)  If you’re a software company, then treat the software you create as equity in the company.

Treat your software like equity.  A real estate holding firm treats the properties on its books as assets with value.  Similarly, any software your company produces is potentially valuable equity.

I’ve completed some pretty serious tasks in the name of getting things done right.  I’ve built SQL servers from scratch, designed and implemented scripting languages, custom load balancers, ISAPI plug-ins, video formats, audio formats, audio engines, 3D engines, compilers, reverse engineering tools, web servers, new internet protocols, real-time protocols, custom data marshaling, code generators, documentation generators, virtual disk drives… lots of stuff.

“Why not just use what’s available?” you ask.  Well… first of all, sometimes what’s available doesn’t do what you need it to do, and secondly, once you’ve built something, it is yours to do with as you please, and those special capabilities you put into it give you an edge over the idiots out there who are content with using off-the-shelf garbage.  You have a new weapon in your toolbox that no one else has… it is equity that you own, and it will potentially stay with you and your company for decades if you do it right and make the right decisions.  Everything that you make and don’t give away as open source makes you more powerful than your competitors.  Treat your software with respect, build it to last, and don’t rush poor code to market if you have any intention of keeping it around for the future.

This also means that you should start with stable foundations.  This is another reason that HTML and the Web are terrible foundations for any software project… they change… a lot.  A new update for Chrome may one day break your whole application.  Some plug-ins are discontinued for business reasons, browser features are broken for security reasons… and if you rely too heavily on 3rd parties to provide your tools for you, one day those 3rd parties may unilaterally decide to wipe out all the precious equity that you’ve built up over years of coding.  PLATO lost several market-worthy products because they were built in “ToolBook”, and the company that made ToolBook decided it wasn’t in their business interests to continue supporting it.  Who’s ever heard of ToolBook these days?  If I recall, it was a kind of HyperCard rip-off for Windows.  Does anyone use HyperCard anymore?  No… absolutely they don’t.  C and C++ are still around, however.

Stick with tools and languages that are going to be around.  C# is currently the best option.  It is now cross-platform and can be AOT or JIT compiled with many open-source compilers in addition to the standard Microsoft compiler.  It will now even deploy to iOS devices.  Obviously languages like Java and C will be around for a while as well.  But many languages have appeared in the industry, made headlines, then vanished… too many to count.  Anyone remember PowerBuilder?  Paradox?  ColdFusion?  There are languages in use today that are at high risk of disappearing in the next decade, including PHP, ActionScript (the writing is already on the wall), Ruby, Perl, and Objective-C (which will be replaced by Swift and forgotten like your Macintosh Quadra 950… just because that’s how Apple rolls).  JavaScript has to go sometime, and even if Java is going to be around for a while (I’ll give it 20 years), I’m pretty sure the world will grow sick of it eventually.  You’ll laugh at these predictions now, but you’ll eat my words later.

I’ll bet you that at least 50% of those languages are dead in 10 years.  😉

Well anyway.  I hope you found my ramblings useful.  If you’re looking for a new software executive, make me an attractive offer 😉
