Somehow, the many (?) Cassandras’ messages of late about Bitcoin being a hyped-up bubble about to burst, and how it’s (only) the laggards that now want to jump in, drive up prices, but will be the Greater Fools in the game, sound a bit … false bottom.
[Unsure what the exchange rate is, today. Has the bubble burst already, or ..?]
I’d say: how many countries, and moreover how many financial organisations (private or public, or regulatory or inter-/supranational), may have an interest in blowing up Bitcoin, as a demonstration that all that blockchain stuff isn’t any good after all ..? Because the promise of blockchain was, and is, that it will not need any standing governing body (traditionally somehow still geography-bound), avalanched under by politics, gross misunderstanding of the core concepts ruled over, etc. In short, power struggles.
And that, of course, may be a future that some will not allow. And will fight to their deaths – inevitably, not by nature but by cause, and earlier. But still they simply won’t have it, that blockchain/Bitcoin/smart-contract/whatevva shazam. With the only way to get rid of it in an inconspicuous manner being to … inflate it till it goes borsht. Risky, but possibly hugely profitable during the trip ..? Case in point: this one [hope the link is still valid, and it’s in Dutch which may mix semantic levels to be tauto]
[You get a golden bullet-like thing on your pillow. With compliments and filled with chocolate, that is… In Salzburg of course]
Today being the day of the real Sinterklaas (as here), we also learn (not!) to cherish the small gifts. As in this effect, but otherwise. [That sounds weird…]
But the effect is too, all the same. And so ubiquitous that we may even lose sensitivity towards, or against [here we go again], it. And that is a problem. Because darn, how can we even think to train AI in rational business decision-making, when all learning examples, and/or actual practical deployment, will be tainted, rife with such irrational biases? We so commonly sweep them softly under the rug, yet here we have exactly the AI-applicability-wrecking issue that we hear so much about lately. [No you don’t; compared to what would be enough, hardly anything at all…]
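To make the tainted-training-data point concrete, here is a toy sketch (all data and names hypothetical): a trivially “trained” model that only ever sees historical decisions will reproduce whatever irrational bias those decisions encoded, rational inputs notwithstanding.

```python
# Toy illustration of bias laundering: the model learns the historical
# outcome per irrelevant attribute, and the 'rational' score never matters.
from collections import defaultdict

# Hypothetical past decisions: (applicant_score, region, approved).
# Identical scores, yet region B was systematically rejected.
history = [
    (700, "A", True), (700, "B", False),
    (650, "A", True), (650, "B", False),
    (720, "A", True), (720, "B", False),
]

def train(rows):
    """Learn the approval rate per region; region alone already
    predicts the historical outcome perfectly, so the bias is 'learned'."""
    stats = defaultdict(lambda: [0, 0])  # region -> [approvals, total]
    for _score, region, approved in rows:
        stats[region][0] += int(approved)
        stats[region][1] += 1
    return {r: a / t for r, (a, t) in stats.items()}

model = train(history)
print(model)  # {'A': 1.0, 'B': 0.0} -- the bias, reproduced perfectly
```

Any real learner with more capacity does the same thing more subtly; sweeping the biased examples under the rug does not make them disappear from the gradient.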
Another example of knowing your classics, eh? Oh well. Plus:
[Noooo, not those classics again! Salzb’]
If this (link in Dutch) is the state of the profession, then we’re all doomed. Luckily, the players in this sham [that’s putting it mildly, 007; ed.] will be deleted from history first. Sorry, not luckily; hopefully. Since the comparisons they make, and the judges’ explanations, are so utterly stupid that one can hardly see them function normally in regular society. Can’t sugarcoat this.
Those that apply the law aren’t above it, I hope. Let appropriate parties get them, before they destroy communities and common sense.
Oh well. And:
[The circus is where such people were put on display, then the delusional got control; Zuid-As Amsterdam]
The ‘terrible’ news (not) that Flash is about to be abandoned by one of its last, if not the last, pillars of support, reminds me of similar ‘developments’ of the past. Like, where did Dynamic HTML go ..? DEC, Sun (Sparc), Compaq, WordPerfect, Norton Utilities, 9-pin matrix printers, bulletin boards, portals. Etc. etc. Yes, yes, I know, some are still around, like OpenVMS is. And in software in particular, there may be many, many more of the lost-ark items – on which I’d like to see more focus. Are they valued enough, for their staying power ..? Isn’t their staying on a bit protracted, in some dark corners of the usage landscape ..?
But more importantly (it is): is there some museum or so out there that preserves them for posterity? I don’t mean just any ‘computer museum’, as they are (all?) of the scattershot type. I mean some museum that captures most of the essentials of the already many eras past in IT. Like What the Dormouse Said does on paper, but then in software, running, and presenting systems as end users would experience them a decade, two decades, -plus, ago. Without smartphones, without fastest Internet, let alone actually working WiFi.
Edited to add, before scheduled posting: This, on a farewell to ‘screen savers’.
So, if you’d have some pointers, please..?
[Edited to add: A chunk of the above, here.]
Thanks in advance through:
[Once (??) was modern; Madrid]
When considering the senses, is it not that Visual (having come to the play rather late in evolution, or has it, but that’s beside the point, is it?) has been tuned to ultrahigh-speed 2D/3D input processing ..? Like, light waves particles (who are you fooling?) happens to be the fastest thing around, qua practical human-scale environmental signals – so far, yeah, yeah… – and has been specialised to be used for detection of danger all around, even qua motion at really high pace (despite the 24-fps frame blinking).
Thus the question arises: What sense would you select, when focusing on shallow processing of the high-speed response type? Visual, indeed. Biologically making it less useful for deep thought and connection, etc.
Now that the world has turned so Visual (socmed with its intelligence-squashing filters, etc.; AR/VR going in the same direction of course), how could we expect anything other than the Shallows ..? Will we not, by negative non-reinforcement, destroy human intelligence and have only consumers left at the will of ANI/ASI ..?
Not that I have the antidote… Or it would be to Read, and Study (with sparse use of visual, like not needing sound bite sentences but some more structured texts), and do deep, very deep thinking without external inference.
But still… Plus:
[An ecosystem that lives off nanosecond trading – no need for human involvement so they’re cut out brutally; NY]
Yet another proof class busted: Voice being (allegedly) so perfectly synthesizable that it loses its value as proof (of identity). Because beyond reasonable doubt isn’t beyond anymore, and anyone venturing to bring voice-based evidence will not be able to prove (beyond…) that the sound heard wasn’t tampered with, i.e., generated. Under the precept of “whoever posits, proves”, the mere remark that no madam Judge, we honestly did not doctor this evidence, is insufficient; and there can be no requirement of positive disproof for dismissal from the defense, as that side is not the one doing the positing. What about entrapment, et al.?
So, technological progress brings us closer to chaos. “Things don’t move so fast”-believers must be disbarred for their demonstrated gross incapacity — things have moved fast and will do so, ever faster. Or what ..?
Well, or Privacy. Must the above ‘innovator’ be sanctioned severely for violating the privacy of original-content-sound producers ..? Their (end) products are sold/leased to generate false identities or doctored proof, either for or against the subject at hand, whichever party would profit thereof. Like an equipment maker whose products are targeted at burglars, or worse, e.g., guns. Wouldn’t these be seriously curfewed, handcuffed ..?
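For what a technical counter could look like: if recording devices sealed audio cryptographically at capture time, tampering would at least be detectable. A minimal sketch, assuming a tamper-resistant key inside the recorder (the key name and byte strings below are hypothetical):

```python
# Minimal provenance sketch: an HMAC tag binds the exact audio bytes to a
# device-held key; any later edit of the audio invalidates the tag.
import hashlib
import hmac

DEVICE_KEY = b"secret-key-inside-the-recorder"  # assumption: never leaves the device

def seal(audio: bytes) -> str:
    """Return an authentication tag over the exact recorded bytes."""
    return hmac.new(DEVICE_KEY, audio, hashlib.sha256).hexdigest()

def verify(audio: bytes, tag: str) -> bool:
    """Constant-time check that the audio still matches its tag."""
    return hmac.compare_digest(seal(audio), tag)

original = b"...waveform bytes..."
tag = seal(original)
print(verify(original, tag))           # True: untouched recording
print(verify(original + b"x", tag))    # False: any edit breaks the seal
```

This only shifts the trust question to the key and the device, of course; it proves the bytes are what the recorder sealed, not that what the recorder heard was genuine.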
[Edited to add, after drafting this five days ago: Already, Bruce is onto this, too. Thanks. (Not my perspective, but still)]
[Apparently so secure(d), ‘stormed’ and taken practically overnight (read the story of); Casa Loma, Toronto]
If the recent rumours were, are, valid that some patches were retracted (because they accidentally disabled other exploits not yet outed in the stash), this would bring a new (?) tension to the surface or rather, possibly explain some deviant comms of the past:
Where some infosec researchers had been blocked from presenting their 0-day vulns / exploit-PoCs, this may not have been for protection of the general public or so, but to keep useful vulnerabilities available for the TLAs of a (variety of?) country(-ies).
Pitting the Ethical researchers against the bad and the ugly…
No “Oh-oh don’t give the bad guys valuable info and allow even more time to the s/w vendors to plug the holes” but “Dammit there go our secret backdoors!”
Makes much more sense, to see the pres(entation) blocking in this light. And it makes huge bug bounties from these TLAs to soon-to-be-a-bit-less-ethical researchers more possible and probable. Not as yet better known, though. Thoughts?
[Takes off tinfoil movie-plot security scenario hat]
[All looks happy, but is looked upon from above …; Riga]
Not in any economic sense you may have thought, given the attention oft given to, e.g., the 1% or 99% (We Are-; Occupy-style) where now the 90% might be the disappeared middle class in the US that extended from the bottom 10% – that was around even in the best of times – all the way to the top — excepting the 0.01% that was in charge all the time …
Here, it’s about a quote slash truism:
90% of everything is crap
Have ever truer things been said. This, of course you knew since prep school, being Sturgeon’s Law.
Just putting it there. See the link for a ‘proof’. Or look around you; physically (co-workers), mentally (in your head, and feel free to assume the others’ heads are not necessarily better…), qua your pay check, your significant other [hey here I can testify I’m lucky with a not-90% specimen par excellence; no she’s not reading this], etc.
Leaving you with:
[In the 10%, definitely. Even when it rains, this one. Baltimore]
As we’re nearing the end of the year (Western calendar; others not spoiling the party, learning point), we draw towards the ‘people being stupid with fireworks’ scenes that are oh so similar to ‘people managing systems’ situations. The former focusing on the most beautiful display and/or the loudest Bang; the latter the same, if you think of it.
The former, with latent recognition of ‘safety’, also re bystanders and collateral injuries, possibly grave or life-, liberty- and happiness-threatening. The latter, with a desperate few considering ‘security’ and ‘privacy’, and even fewer thinking of collateral damage and implicit injuries and infractions to life, liberty and happiness. If you think that’s overrated: have your ID stolen.
The former has the Darwin Awards, for those that improve the gene pool by taking themselves out of it.
The latter, none such yet.
That’s where I aim:
Shouldn’t we instate the CyberDarwin Awards (acknowledging #ditchcyber), for the most egregious (i.e., outrageous, glaring, flagrant) mindlessness in information security in the widest sense, flying in the face of basic common decent thinking?
So that by their occurrence, the candidates volunteer to be taken out of the connected environment which, being their oxygen, improves what’s left (the most).
I have no idea how to pull this off. There should be some sort of portal where candidates may be proposed and results displayed for common laughter; but who will build and maintain such a thing before it can become a success, advertisers flock in droves to sponsor ads, and I take over again to reap all the financial benefits… #helpappreciated
[This has zero relevance. Toronto]
Those were the days, when knowledge elicitation specialists had a hard time extracting the rules needed as feed for systems programming (sic; where the rules were turned into data, onto which data was let loose, or the other way around; quite the Turing tape…), based on known and half-known, half-understood use cases avant la lettre.
Now are the days of Watson-class [aren’t Navy ship classes named after the first ship of the class ..?] total(itarian) big-data processing, slurping up the rules into neural-net abstract systems somewhere out there in clouds of sorts. Yes, these won out in the end; maybe not in the neuron-simulation way but more like the expert-system production rules, and especially axioms, of old. And they take account of everything, from the mundane all the way to the deeply-buried and extremely-outlying exceptions. Everything.
Which wasn’t what experts were able to produce.
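For those who never met one: the elicited rules fed forward-chaining production systems like this minimal sketch (rule content invented for illustration; real systems had thousands of such condition-conclusion pairs, plus certainty factors and conflict resolution).

```python
# A tiny forward-chaining production system: fire rules whose conditions
# are satisfied by the known facts, until no new facts can be derived.
rules = [
    # (conditions that must all hold, conclusion to assert)
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles"}, "recommend_isolation"),
]

def forward_chain(facts, rules):
    """Repeatedly apply rules to the fact base until a fixed point."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"has_fever", "has_rash"}, rules)
print(sorted(derived))
```

The elicitation bottleneck was exactly here: every `conditions` set had to be dragged out of a human expert, one interview at a time; the deeply-buried outlying exceptions mostly never made it in.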
But, let’s check the wiki and reassure ourselves we have all that (functionality) covered in “the ‘new’ type of systems”, then mourn over the depth of research that was done in the Golden Years gone by. How much was achieved! How far back do we have to look to see the origins, in post-WWII earliest developments of ‘computers’, to see how much was already achieved with so unimaginably little! (esp. so little computing power and science-so-far)
Yes we do need to ensure many more science museums tell the story of early Lisp and page swapping. Explain the hardships endured by the pioneers, explorers of the unknown, of the Here Be Dragons of science (hard-core), of Mind. Maybe similar to the Dormouse. But certainly, we must lament the passing of the glory of past (human) performance.
[Is it old, or (still) new ..? Whatever, it’s prime quality. Spui, Amsterdam]