Too Big (to … whatever)

As predicted, ERP has dropped from view in the world of business and/or tech.
Because reasons. Many of which have to do with the sheer bigness of it all: one would need a business so big to be able to capture all the advantages of ‘seamless’ integration of systems into one, and (not or) would have to transform it so completely that one had better have built the new one from scratch (or schlepped all the parts of the old into the new mold). Which sort-of defeats the purpose. Completely.

Hence, the demise of ERP (as predicted here) has taken its final form. For now; we expect shrinkage of attention and market share to continue.

This, triggered by yet another laggard attempt, i.e., by some government, to implement SAP throughout some Department, which failed. Typical cost (overrun): €196M ballooning to €900M, which only suffices to salvage some parts that work. Typically so, because Reality didn’t seem to want to fit into the ideal mold set out, but just went its own way, the way it was already on before the start. Not even wrecking the project on purpose, just going along and not even noticing the project’s required changes, as daily business had better things to do. As it was called: where (the failure of) makeability [translations ‘manufacturability’ and ‘engineerability’ have also been seen in the wild] met reality.

So, no need to resort to Project Governance mumbo-jumbo. Reality just is too big, and:
Photo11
[Huh? Well, it’s Cyprus Meridien but why ..?]

‘sup, competitor ..?

Oh yes, of course. One couldn’t leave Docker to capture the whole market, as one understands the power of (being a) monopoly, right ..?
That was the first thing that crossed my mind when reading this here piece on D’s competitor Rocket. Still backing D, will the Big G now play them against each other ..?

Anyway:
DSCN6122
[Wouldn’t life without competition be like this, always ..? Resson, FR]

Oh, whatev’ – will succeed

Yes, critique hasn’t been overly enthusiastic about the HoloLens developments. Like in this here story.
Question is, though: Did the first iPod have Shuffle? Was the first iPhone even a serious phone ..? [Or was that the first iPad that had no comms; I forget due to irrelevance. But do notice how there’s now a continuum of screen sizes from smartphone via note and tablet to desktop screenlets and mega-TVs]

My take: it’ll be somewhere on the Glass–to–iPad scale: as a prototype that stays (sic! Glass is still around for very, very effective deployment in some sectors) and/or as the launch of a steep improvement curve.

Which is good. But it may bring about some unforeseen consequences: what happens when Youth gets hooked, and unlearns what Reality is ..? Will we all follow ..?
Yes, if, e.g., walls can be presented hologrammatically to a degree that hologrammar-Ns (you read that here first!) are satisfied with the resemblance to reality, could an ASI take over and confine us in virtual (now for realz) boxes ..?

Dysto here, dysto there, dysto everywhere… Hence:
DSCN0647
[Mockery … Barça]

The Divide(s)

Surely I’m not the only one running up against total resistance (i.e., not a discernible millimetre of movement) when one would want to even discuss disruption in industries hitherto untouched.
Like the financial industry. Really, the mindsets haven’t opened up to the 21st century at all. Despite inroads by trading algorithms (something some even called a ‘financial crisis’ or so) and Bitcoin, i.e., blockchain trust.
But hey, at some point you just want everyone to have their Kodak moment, right ..?

Maybe there are three kinds of people:

  • The kind that embraces the New for what it is and brings; the innovators of course, and the early adopters. Ballpark: 10%.
  • The early and late majority that tag along because everybody is doing it. About 60-70%.
  • The Laggards, the retarded (qua brain openness and movement…), the reactionary. The remaining 30-20%, respectively.

That’s not new. But the percentages may be different for various kinds of innovations and disruptions. The point being: How do you know where you are, if you’re in the third category and/or before the disruption strikes ..?
Yes, I do understand the flip at the other extreme, where Morozov-like second thoughts arise about what We as a global society would want. But that’s way beyond the frightened closure of mind that shrinks and shrinks consciousness (both meanings) ever further. Where you’d want to, metaphorically (hey, keep it real), bang someone’s head against the proverbial beamer display to make ’em see – with the contrary effect of them disliking anything New even further in turn…

But OK. This was just an extensive RT of the LinkedIn link above… So, herewith:
DSCN2513
[Still standing (? as ‘t Schip), but gloom!]

P( Danger(You) > 0.5 ) ⇒ Shutdown( You )

For the Fellow Travelers among you who still believe that AI (AGI or ASI) will bring us joy and an Arcadian, peaceful, creative, work-free life forever after, please do consider this here piece. And see that we’re only at the beginning.
[Oh for AGI/ASI reference, see here.]

Luckily, hopefully, the tide will turn. But there simply is no guarantee it will.

And on this most pleasant note, I’ll leave you with:
DSCN7386
[Málaga – but when the struggle is forbidden and ‘ratio’ quod non might seem to prevail, the Dark may roar and explode out of its confines in utterly destructive ways. As in this previous post…]

Digital Native Schative

A couple of weeks ago, there was this little row (that you may easily have missed) about some recruiter requiring digital nativity (yes.) of candidates (and whether that would be discriminatory since it would exclude ‘old’ folks). As in this here discussion.
Where the point was largely missed that one would indeed not want to hire anyone who would consider themselves qualified on this point…

As

  • Considering yourself such a native, or ‘born digital’ or whatever ridiculous phrase one could use, disqualifies you as you have no clue:
  • Those born in a time when there was already something digital (e.g., stand-alone PCs) will still have grown up in environments with hardly any of those devices, if any at all. Either due to region (PCs were around in the US in the 80s, not so much elsewhere) or class (as if the less moneyed classes had PCs in the US before the 00s). Same or similar for all (sic) other ‘devices’, ‘systems’, and developments that one could consider to fall under the ‘digital’ class, if there were such a thing. If ‘born digital’ is about ‘computers’ having been around: that started in the 60s ..! If it is about pervasive ‘digital’ stuff being around: those kids are still infants (mentally!), 0-20 yrs of age; only some escape this noobness and indeed do understand technology.
  • So, there’s hardly anyone who could actually claim to be born and raised (sic) digitally. Maybe a handful, possibly placed outside their bio family by the authorities, as the digital overload would count as child abuse (compared to their peers, playing outside).
  • And, all the other kids may have actually learned something of the outside world in which one has to live (or be kept (sic) in a basement all their life…). May; apart from those that didn’t properly learn to ride a bike since they were driven around by tiger moms. Still, the ‘born and raised digital’ would be of no use in the real world, due to knowing nothing of it.
  • The ‘digital’ has in the meantime exploded. Is it about mobile, about social, about devices, about apps, actual applications, programming, security, business deployment, assembler, design (of ‘web’ sites (huh, whaddoyoumean ..!?), apps, devices, brands, or ..?), privacy, economics, …, …? No one can cover them all; some may cover a few, but certainly not more. So anyone claiming to master the world because they were ‘born digital’: I show you the Fool. Ecce homo.
  • So you’d better not hire such worldview-morons.

But then, you could hire me. I was trained to work on mainframes (operations) and early PCs (use, programming down to C and assembler), learned hardcore HTML (3, 4) back in the day and moved on to ‘modern’ applications, and understand the Real World through education and experience (also in the business world), etc., etc.
Your call.

For reference:
DSCN6672
[‘Native’ …? Córdoba]

His Story

Oh yes, just to drop it like that. On history and the importance of knowing it. So that you wouldn’t declare just anything to be a historic, unique event, like the launch of some crappy piece of software (if it’s good, you launched way too late!) that we currently call an App – which might within a decade be a laggard petit bourgeois expression.
Even in historic perspective, a great many Unique events are far from it, by light years. E.g., the Iron Curtain in Europe. Anyone remember Tordesillas ..? And the Great Wall of China, similar? And…, and …? Or, as the n-word has become an absolute no-no for anyone of another colour, what history does it refer to ..? Don’t you use the s-word now, as that stigmatises a people that were so deeply subjected to it that their people’s name became the Anglo-Saxon and, much later, English word for their very fate. Hence, to use the s-word now singles out that populace and degrades them much more than the s practised by their own peoples (sic!), also through the centuries around the world (sic), without much complaint by almost all affected then. False claims for preeminence by attribution ..?
[Not to disclaim the dismal, atrocious treatment many past generations received. Should be remembered – all]

To return: claiming uniqueness of events, or forging history to claim its compensations for ills not received (in person), makes one look truly stupid. It widens the gap between true unique-historicity and one’s own insignificance instead of joining them. So stop making yourself the laughing stock of the masses, you clown (by innate character, not by role – an honourable one).
Or I’ll claim my ancestors were driven out of Africa by the ones still there.
The ones still affected today, however…

This discussion will continue, for a great many centuries IF humanity allows itself that much time. Therefore, now:
DSCN7306
[Defense / retreat ..? Andalucía]

FogAI

… Just to put it out there: What has happened to all the thrilling AI initiatives that flew around one after the other at the start of the year ..?
At that time, I even included some stuff in my Predictions, as so many new things were popping up. But now, … not so much. Because what?

Or have all the ‘leaks’ been thumbplugged and is development still going strong in skunk works towards a renaissance explosion sometime soon ..?

Whatev’; for you:
DSCN2101
[Its back being Mont serrat. Or so. ?]

Signalling healthy process

Yet some more cross-over ideas from the IoT world into the administrative bureaucratic office world: Streams of transactions as signals.
Of the health of the process, of course. To be defined, obviously, as the fit to the surroundings. The fit may be off, either intentionally (wanting to let the world adapt to the process, enforcing (?) change) or unintentionally left blank, i.e., having to cope with exceptions to what was envisaged as the transactions’ content or form.

Now apply yesterday’s first picture of process control.
Now, too, consider what one could do with sampling theory (as a subset of ‘Shannon’, if properly elaborated, possibly skirting ‘classical’ statistics ..?). Taking 2log(n) (presumably log₂(n)) samples (where n is the number of transactions ..?? Just a wild guess) and being able to reconstruct the ‘signal’, then taking its integral (discrete transactions … just summing them up ..?) for the total. Or Fourier-transforming it all and … get your basic theory straight before dreaming of moving on, so don’t start at the other end as ‘accountant’…! And/or treating exceptions (as e.g., found by the sort of analysis that these girls/guys are so good at; that not even being meant as a cynical qualifier) as noise to the signal. Never fully suppressible, but useful to pick up secondary signals, stacked in their variation of frequencies, amplitudes and wavelet transformations. That all tells you something, if you listen. Whether you want perfect, over-HiFi replay [intermission: ugh, I’m getting old, even knowing that HiFi was a thing…], or lively veracity, the actual fullness of music. And take in again the ole’ industrial process control with its recipe / derivative function(s), et al., and be able to better control it all from the ‘dashboard’ in the control room. When all of the routine stuff, the routine 80% of business, is done by … ‘robots’. Humanoid or digital machines, IDC.
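
For the signal-minded, here’s a minimal sketch of what that could look like, in Python (my own illustration; nothing above prescribes it, and the function name, the moving-average window and the 3-sigma threshold are arbitrary assumptions): treat a stream of transaction amounts as a sampled signal, read its spectrum for the dominant routine rhythm, and flag deviations from a smoothed baseline as the ‘noise’, i.e., the exceptions worth listening to separately.

```python
# Sketch only: transactions-as-signal. All thresholds and names are assumptions.
import numpy as np

def transaction_signal_health(amounts, outlier_sigma=3.0):
    """Rough 'process health' probe over a stream of transaction amounts.

    Returns the dominant cycle length (in transactions) and the indices
    flagged as noise/exceptions relative to a moving-average baseline.
    """
    x = np.asarray(amounts, dtype=float)

    # Spectrum of the (mean-centred) stream: one strong peak suggests a stable
    # routine rhythm, the 'routine 80%'; spread-out energy suggests less so.
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(x.size, d=1.0)         # one 'tick' per transaction
    dominant = freqs[1:][np.argmax(spectrum[1:])]   # skip the DC component
    cycle_len = 1.0 / dominant if dominant > 0 else float("inf")

    # Deviations from a simple moving-average baseline are treated as noise,
    # i.e., the exceptions to pick up and listen to as secondary signals.
    window = max(3, int(np.sqrt(x.size)))
    baseline = np.convolve(x, np.ones(window) / window, mode="same")
    residual = x - baseline
    flagged = np.where(np.abs(residual) > outlier_sigma * residual.std())[0]
    return cycle_len, flagged

# Synthetic example: a weekly-ish booking rhythm plus three injected spikes.
rng = np.random.default_rng(0)
stream = 100 + 20 * np.sin(np.arange(500) * 2 * np.pi / 7) + rng.normal(0, 5, 500)
stream[[50, 260, 400]] += 300
print(transaction_signal_health(stream))
```

On that synthetic stream this recovers a cycle of roughly seven transactions and flags the three injected spikes; a real journal would of course need far more care (seasonality, mixed currencies, proper statistics) before anything lands on a control-room dashboard.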

And hey, while we’re at it, why not throw in attempts to include in bookkeeping not only discrete numbers (arbitrarily rounded to hundredths, of random currencies) but Real numbers or even Complex numbers as well ..? The latter, e.g., to indicate VAT surcharges, etc.; leading to tuples-as-single-‘numbers’ in bookkeeping. Maybe somewhat harder to track that all is booked correctly, but also maybe powerful in capturing singular transactions and some processing rules/logic, and controls, in one tuple (‘record’).
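
To make the tuple-as-number idea concrete, a toy sketch (again mine, with an assumed flat 21% VAT rate and made-up names; real bookkeeping would want exact decimals rather than floats): book the net amount as the real part and the VAT as the imaginary part, so that a plain sum over a journal yields both totals in one go.

```python
# Sketch only: 'net + VAT' bookings as complex numbers. Rate and names assumed.
from typing import List, Tuple

VAT_RATE = 0.21  # assumed flat rate for this example

def book(net_amount: float) -> complex:
    """Encode one booking as a single 'tuple-as-number'."""
    return complex(net_amount, net_amount * VAT_RATE)

def journal_totals(journal: List[complex]) -> Tuple[float, float]:
    """Sum a journal of complex bookings into (net total, VAT total)."""
    total = sum(journal)
    return total.real, total.imag

journal = [book(100.0), book(250.0), book(-40.0)]   # a credit note as a negative
net, vat = journal_totals(journal)
print(f"net: {net:.2f}, VAT: {vat:.2f}")            # net: 310.00, VAT: 65.10
```

Whether the imaginary axis should carry VAT, a second currency, or some control flag is exactly the kind of choice left open above; the point is only that one ‘number’ can carry the pair and that the arithmetic still composes.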

Where AI may then be applied to do sanity checks. Not on this author; no AGI or ASI would suffice…

OK, for now:
DSCN1436
[“What a shoe box” but yes that *is* the Bata shoe museum, Toronto]

ICShape

Doing some pondering, digging and backtracking on the issue of IoTA. But, … I already got stuck when considering how to (best?) model the architecture at the lower levels. Would a classical picture, or a somewhat less classical picture, work best to gain an understanding of the risk areas ..? As in:
Industrial control cycle
[Own pic]
Or
open-standards
[Plucked, adapted from the site linked below]
Where the former is from the industrial, process-oriented engineering world, and the latter from the digital networking world.

Yes, I’d really like your advice on how to ‘marry’ both, to be able to optimally visualise where the risks are: the potential noise on the signal, intentional or not, or the wrong signals altogether. What might cause that, how to protect against it, etc.
Yes, taking into account the work already done here – which is impressive, but somewhat (?) protocols-oriented, not architecture-/risk-oriented. Yet. Something like
SCADASmartGridEfficacy_Page_2_Image_0002
[plucked off a simple search] is what I’m after.

But the other work, too. All of it, to overlay with risk lists on all surfaces at all levels… Then, to see how to protect all of that against the (generic?) risks, and how one would audit whether sufficient (?) protection is in place. Not ‘controls’ – those are the losers’ weak retreats, the “didn’t want a cookie anyway” fig leaves. Taking into account this breakthrough, though.
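
For what it’s worth, one crude strawman for that overlay (entirely my own, not taken from the work linked above; every layer, surface, and risk name below is made up): model each layer from either picture as a record carrying its surfaces and its risk list, and flatten the two views into one table that a visualisation, or an audit checklist, could then hang off.

```python
# Strawman only: overlaying risk lists on architecture layers from two views
# (control-loop vs. network-stack). All names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Layer:
    name: str
    view: str                        # 'control-loop' or 'network-stack'
    surfaces: List[str]              # attack/noise surfaces at this layer
    risks: List[str] = field(default_factory=list)

def overlay(layers: List[Layer]) -> List[Tuple[str, str, str, str]]:
    """Flatten both views into one (view, layer, surface, risk) table."""
    rows = []
    for layer in layers:
        for surface in layer.surfaces:
            for risk in layer.risks or ["<unassessed>"]:
                rows.append((layer.view, layer.name, surface, risk))
    return rows

model = [
    Layer("sensor/actuator", "control-loop", ["field wiring", "firmware"],
          ["spoofed readings (wrong signal)"]),
    Layer("controller", "control-loop", ["logic", "setpoints"],
          ["tampered recipe/derivative function"]),
    Layer("transport", "network-stack", ["protocol", "gateway"],
          ["injected noise", "replay"]),
]
for row in overlay(model):
    print(row)
```
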
But for now, again already, leaving you with:
DSCN2075
[Life in stead of straight angles, Barça]

Maverisk / Étoiles du Nord