Of all the IT management mistakes you could make, these eight might be the biggest potential threats to your career

The professionals at Aleph Infotinuum Services contend that there is one mistake which is even bigger: pretending that the bureaucratic programming one received from one’s childhood schoolmasters can pass as surefire instructions for qualified business management.

Businesses don’t exist to provide jobs or to satisfy owners (e.g. shareholders). Businesses exist to serve consumers — and the extent to which the employees & contractors working for a particular business succeed at serving consumers determines shareholder dividends as well as the number of available jobs.

Despite what your associates fresh out of MBB (Master of Business Bureaucracy) school might think they’ve learned, business managers don’t exist to grow their employer’s business. Business managers exist to serve consumers — and the attitude they bring to such consumers-come-first duties determines their professional legitimacy (if not their material compensation amid an economic climate of too many MBBs confusing guild credentials with actual qualifications) as well as the long-term growth of their employer (as opposed to MBBs focused on short-term quarterly reports delivered to the bureaucrats pretending in turn to be their managers).

That’s not to say that it’s only ever a one-way street. Consumers who nickel & dime their various vendors sometimes find themselves shut out of future opportunities for meaningful association. Supervisors who nickel & dime the coworkers whom they pretend are beneath them sometimes find themselves without anyone willing to work either hard or smart — and if they insist that pleasing them is the only means toward pleasing the employer’s customers or clients, they are unqualified to call themselves business managers.

There is good news: after you come to terms with the big-picture business management stuff, day-to-day pitfalls such as the ones outlined within the article linked below start to seem more like temporary annoyances than chronic pain points.


Be thankful that some researchers are discovering early vulnerabilities within Smart Contracts

There is no such thing as a real-world building that is burglar-proof, and there is likewise no such thing as a computer-based technology which is 100% protected from hackers & crackers & malware.

Researchers have discovered what they are calling trace vulnerabilities within blockchain-based Smart Contracts (a trace being the electronic version of a paper trail: a record of every invocation of a particular contract). They created three explanatory categories for such potential vulnerabilities:

  • Suicidal — an arbitrary intruder (e.g. a hacker/cracker) could destroy a Smart Contract altogether (note that it is legitimately possible for a contract’s creator or a similar trusted administrator to execute a SUICIDE bytecode instruction on the blockchain’s virtual machine for purposes of voiding that contract)
  • Prodigal — an arbitrary intruder could drain a Smart Contract’s funds, leaking “coin” payments held for services rendered to arbitrary parties
  • Greedy — an arbitrary intruder could lock funds within a Smart Contract indefinitely (i.e. make it impossible to execute a SUICIDE bytecode instruction, or any other release path, for purposes of negating that contract)

As a concept, Smart Contracts remain immature yet promising. They are still difficult to test, or to revise once established within a blockchain’s public ledger. Current trace vulnerabilities appear to crop up after multiple invocations of this or that contract, although this might be due to the relative obscurity of contracts which have not seen much use.
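The three categories above can be illustrated with a toy Python model. This is a sketch only — the researchers analyzed real EVM bytecode, not Python — and every class and method name here is invented for illustration:

```python
# Toy models of the three trace-vulnerability categories. The `kill` methods
# stand in for the EVM's SUICIDE/SELFDESTRUCT instruction; none of this is
# real contract code.

class SuicidalContract:
    """Anyone can void the contract: no owner check guards the kill path."""
    def __init__(self, owner):
        self.owner = owner
        self.alive = True

    def kill(self, caller):
        # BUG: should require `caller == self.owner` before self-destructing
        self.alive = False  # analogous to executing SUICIDE on the VM


class ProdigalContract:
    """Pays out held funds to arbitrary callers, not just participants."""
    def __init__(self, balance):
        self.balance = balance

    def refund(self, caller):
        # BUG: no check that `caller` is a legitimate payee
        payout, self.balance = self.balance, 0
        return payout


class GreedyContract:
    """Accepts deposits but exposes no path that ever releases them."""
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount
    # Note: no withdraw() and no kill() — deposited funds are locked forever.
```

In each case the flaw is a missing or absent code path rather than a crash, which is why such bugs only surface in traces spanning multiple invocations.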

https://arxiv.org/pdf/1802.06038.pdf (PDF)

Back to basics: starting a new instructional design project

Getting answers to these questions will save you valuable time and money.

A lot of hard-earned expertise in instructional design involves juggling information in the analysis and design stages of ADDIE. Although the first three stages (Analysis, Design, Development) appear linear, they often aren’t, and instructional designers have to manage unknown and vague information even before starting a needs assessment.

Here are some tips on information to gather before starting, why it’s necessary, and how it feeds into analysis and design.

  • The “who” includes the project client, SMEs (Subject Matter Experts), and end users. Don’t assume that the person requesting the training is the client. The client pays the bills and has authority over the project. SMEs validate the content with instructional designers and often have valuable information on end users to feed into a needs assessment.
  • The “why” is the business goal or need that creates the training request. The client is best able to answer that question. If there isn’t a business need behind the training request, training may not be required at all.
  • The “what” is the proposed training content. If the content is unstable or not yet known, you’ll need to account for that in your estimate, as unstable content often requires more SME time and effort. You’ll also want to consider the resources to apply to a short-term or temporary solution.
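The who/why/what checklist above can be captured as a simple pre-project intake record. This is a hypothetical sketch — the field and method names are illustrative, not from any standard instructional-design toolkit:

```python
from dataclasses import dataclass, field

@dataclass
class TrainingProjectIntake:
    client: str                       # the "who" that pays and holds authority
    smes: list = field(default_factory=list)  # validate content, know end users
    end_users: str = ""               # audience for the needs assessment
    business_goal: str = ""           # the "why" behind the training request
    content: str = ""                 # the "what": proposed training content
    content_stable: bool = True       # unstable content inflates SME effort

    def open_questions(self):
        """Flag gaps to resolve before estimating the project."""
        gaps = []
        if not self.business_goal:
            gaps.append("no business need identified — training may not be required")
        if not self.content_stable:
            gaps.append("unstable content — budget extra SME time in the estimate")
        return gaps
```

Running `open_questions()` before quoting an estimate makes the expensive unknowns explicit instead of discovering them mid-project.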


System crackers and malware malcontents prefer windows & apples to penguins

If your employer is selecting a desktop operating system for employee use, consider the security bonus of Linux’s relative obscurity.

Many people resist change. That is why there is an entire service industry dedicated to change management.

Change is still necessary, because healthy economies feature plenty of market-based competition among producers for consumer loyalty. Sometimes, competition leads to misguided attempts to differentiate. Sometimes, it transforms the industry itself. Business managers who can ignore the vaporware to focus on the business imperatives will always be in high employment demand.

IT security is, of course, one of those business imperatives. Not many enterprises can avoid having an IT infrastructure, even if it amounts to a single PC connected to the internet. As more things go online (to borrow a term from the emerging Internet of Things), there is a corresponding increase in opportunities for cyberattack.

Linux is a mature kernel, and its various graphical desktops (e.g. GNOME, KDE) provide users with productive experiences which are as intuitive as any from Microsoft or Apple. As a security bonus, fewer crackers & malware distributors target Linux-based systems precisely because there are fewer Linux-based systems out there.

Still, switching is not a simple decision. Better security comes at the price of a greater possibility that your customers and supply chain vendors will be unable or unwilling to view some of the files that you create on your Linux boxes. Open Source projects try to maintain compatibility with proprietary operating systems, but a key to those proprietary vendors’ differentiation is their ongoing attempts to make it difficult to work with anything other than their own products. Fair enough — the bottom line is that you, as a competent manager of your employer’s tiny slice of the eternal infotinuum, must evaluate the risks versus the rewards of business imperatives like security and productivity.

Alephnote: Some of the worst security out there, regardless of the operating system you might choose to run, happens within The Cloud (but it sure is convenient for business managers who emphasize such things).


There never weren’t going to be problems with Open Source software

That doesn’t mean, of course, that Open Source projects are useless — rather, it is an acknowledgement that nothing is perfect.

There was a time when software was, like documentation, a complementary if proprietary add-on to a hardware purchase. Software was free of charge in that its development costs got baked in with the price of the hardware on which the mission-critical software was to run. Those were the early decades of computing, before the term “personal” became a market-expanding prefix and software gained a wider audience for more commodified hardware.

Then the inevitable happened: software also started to become somewhat commodified, as languages and frameworks and libraries and protocols attained their own reputations among developers as de facto standards. There were and are plenty of areas remaining for innovative differentiation and even niche applications/utilities, but the profit margins for software have been approaching zero for businesses that lack a diverse product line of interoperable functionality which extends across vertical or horizontal market boundaries.

Plenty of innovation — during the dawn of personal computing in the 1970s and 1980s as well as during the internet era — has come from developers committed to delivering what is known as freeware or shareware or, more recently, the amalgamated Open Source movement. Disparate Open Source projects hoping to maximize their software’s interoperability with other projects will often turn toward established cross-platform standards as well as middleware products which enable EAI (Enterprise Application Integration) mapping and ETL (Extract/Transform/Load) data compatibility.

Business managers themselves like to extract what they can from this or that Open Source software project, especially if they can get away with not contributing anything back to the codebase (Aleph Infotinuum Services will leave it to the individual to decide whether such tactics lack ethics — AIS comes down on the “no” side with a caveat that Open Source ideals are mostly of the commune-style misguided type whose inherent inefficiencies are easier to hide within virtual marketplaces featuring intangible products).

As with any typical commune, delivering quality products at regularly scheduled intervals proves to be difficult if not impossible. As with any typical commune, the concept of success becomes warped toward feeling rewarded for self-sacrifice. As with any typical commune, things get bureaucratic instead of entrepreneurial. Most Open Source projects lack proper incentives, or at least they did until a band-aid solution came about in the form of bounties for effort (e.g. for creating new features or hunting down existing bugs).

One big problem with that, and with Open Source in general (and if you’re going to be even more general: with any kind of collectivism), is that developers without defined compensation for defined duties tend to put much more work into creating new features than they put into hunting & fixing bugs. Be sure to follow the link below for more Open Source shortcomings.


Dropping these 5 bad habits will help your career

Other sagacious advice includes avoiding chili before a big meeting.

Superstar business managers never tire of seeking ways to help differentiate their employer from other employers. Sometimes, such efforts backfire, thereby calling into question the manager’s judgement. Sometimes, they’re downright fraudulent, thereby calling into question the manager’s ethics.

The golden mean between superstar and fraudster is where most managers, indeed most professionals, ply their trade. They are willing to make judgement calls and accept accountability for those calls. On the occasions when they make a mistake, they try to learn from it and hope that they will get a chance to demonstrate the ways in which they were able to turn crisis into opportunity.

The fact that no one is perfect guarantees that each of us will, at least once in a while, make a mess of things, at least temporarily. Whether it’s procrastination, or pedantry, or positivism to the detriment of freedom to innovate, try to avoid repeating such messes.


Don’t let your organization’s CIO position represent Common Introspection Omissions

Ensure that whichever employee fills the CIO role can demonstrate Copacetic Infotinuum Operations.

The infotinuum is eternal. It is the proverbial Borgesian library of Babel. Fortunately, your employer needs to care only about an infinitesimal nook within such an endless labyrinth of knowledge.

The question that arises is: how to care for such a nook? Today’s typical CIO must be comfortable with more technological striplings & stalwarts than those from even ten years prior. Increasingly, business imperatives go nowhere without an efficient infrastructure of networking and communicative persistence — which makes infotinuum management the circulatory system of any enterprise lifeblood.

Employees comprise the enterprise heart. Contractors and supply chain vendors provide occasional infusions. Consumers oxygenate, and sometimes exsanguinate.

Here are a few things a CIO can do to help his or her employer avoid anemia or, worse, being sacrificed on the figurative altar of consumer fickleness:

  • Stop fearing cyberattackers — from at least as far back as the BBS heyday, wise sysops (known these days as sysadmins) have recruited hackers, at least in a surreptitious manner, to help them hone their security practices
  • Spread the good-for-business word — offer more than just apologia by convincing other teams within your workplace to snatch up some of your own team’s talented personnel
  • Consider seamless business/infotinuum integration to be the only acceptable success — resist attempts to redefine the term, much less the standard (indeed, some who try to redefine success are doing so for purposes of making their mediocre efforts appear more successful)
  • Put the Copacetic in CIO — don’t try to be cool, just remember that those who don’t already consider their career to be cool are still looking for the right career


Epistemological death by streaming video?

What PowerPoint™ did for presentations, streaming video threatens to do for learning.

When new slideshow technology started to replace overhead projectors in education, courses that featured an instructor displaying hundreds of digitized slides became known as death by PowerPoint. Institutions and businesses cut costs, but also forgot to keep in mind their customers’ learning demands.

Marshall McLuhan theorized that any medium contains traces of previous media, and that the medium itself matters more than its content. Streaming video for course delivery shows traces of the previous medium it replaced, namely the dreaded PowerPoint slide deck. Watching streaming videos of class lectures with no interactivity or engagement offers no pedagogical improvement.

Using the latest media, whether it is streaming video or virtual reality, does not by itself guarantee better learning. Inspirational, imaginative, and meaningful learning starts with aligning media strategies with an analysis of the content, users, and objectives.


Measure twice, cut once

History’s framers & carpenters have some important wisdom to share with IT hipsters.

Perhaps your employer has decided that it’s time for a change (to be accurate: your employer’s employees make such decisions as they react to consumer demand). That sounds simple enough, except for the change part.

“Build it and they will come” is, of course, the stuff of fantasy. To put it into IT terms, it’s the stuff of vaporware. That’s why your organization needs a framework to outline the pathway toward the realization of synergetic outcomes. Or something like that.

Seriously, though, all stakeholders must have a say in the planning phase for any significant change to process or infrastructure, and managers must give the planning phase enough breathing room to articulate desires and goals and potential impediments that might even preclude the project under consideration. A multi-year action plan targeting perceived competition is inadvisable, since that implies a bureaucratic mindset instead of what your enterprise must always demonstrate: a consumers-come-first management attitude which leaves the competition struggling to come up with its own within-five-years-this-will-pan-out plans of bureaucratic futility. Let your competitors dig themselves into deeper holes as you concentrate on helping your employer serve consumers.

Think of such planning in terms of staying agile through short, iterative sprints of regular process revision. Don’t jump toward every new idea that gets entered into the backlog. Allow your coworkers that vital breathing room. Give everyone enough time to measure the cost & benefit of this or that strategy pivot, and then give them enough time to measure it again. That might seem like duplicating effort, but in actuality it’s an age-old strategy for making sure that you don’t cut the figurative branch on which your organization sits.

To use metaphors which originate outside the construction industry, companies are more like slow-turning freighters than they are turn-on-a-dime hydroplanes. Keep in mind the opportunity cost of repeated spin-down-spin-up rampages toward this or that trend, and play it cool so consumers want to hang out with you (to be accurate: your employer’s products).


“CHANGE NOW OR ELSE!” scream bureaucrats pretending to be business managers

When implementing new technologies and their corresponding processes, make sure those impacted understand the what/how/when/why of such a shakeup to their professional routine.

If your employer decides to introduce something cutting-edge, or even something about-darned-time, more than a few of your coworkers will balk at the suggestion that what they’ve been doing up until that point has been adequate for only the olden days. Although they won’t admit as much, many people consider a job to be a necessary evil which they must endure for the sake of what they consider to be their real life. Such people, despite making apparent career missteps, might in any case be valuable to the company — at least to the extent that replacing them would be more of a hassle and expense than it would be worth — and it is they who will therefore present the biggest change management challenge.

They don’t know what’s being changed, and they don’t particularly want to know because they prefer to concentrate on their Monday-through-Friday duty to pick up their kid from daycare. They don’t know how the change is going to take place, and they don’t particularly want to know because they wouldn’t have taken the position in the first place had they known that it would sometimes seem like school is back in session. For them the when of the project to implement a new technology is always too soon for comfort, and the why is typically more of a “Why me?” whine.

Don’t get too down on them. They are, for the most part, afraid, and even though such fears are, for the most part, unreasonable, many who so much as perceive a fearful future will do what they can to prevent any of the changes which might present the professional challenges they’d rather avoid. They don’t understand their own genuine influence — often to the point of wondering why their employer’s competitors are eating it alive as they try to ensure that no one need ever keep up with the innovating times.

Indeed, that is how socialism destroys societies. Corporatists want time to stop so they won’t ever face the cost of retooling or retraining, while welfarists want time to stop so they won’t ever face the limitations of their current skill set. Instead, be a business manager whose coworkers help demonstrate to competitors how the not-so-scary changing times always leave the fearful behind.