In line with various posts (like this one from this morning, and another one from earlier this week – Thursday's; yes, I know, it had been scheduled two months ago),
an in-between remark of sorts: The biggest bottleneck in AIland nowadays is the translation of some unknown business need into AI/ML sprint objectives.
See? One can have a lot of data, and more or less randomly figure out the patterns (‘correlations’ of various kinds; note the scare quotes) OR go for outlier detection. Then what? Is there a business purpose for turning the found patterns (while the system keeps on trucking, learning – using the same stats [!] engine) into some system module or so? Is the business case sufficient, and do all involved know enough about one another’s territories to get it together? How much work is needed to turn the stats-engine part into something efficient, compiled, and still adaptable through continuous learning? This wrangling with the system’s logic at (?) meta-levels may render the system less effective, i.e., the business case less worthwhile; what’s the minimum, and can we achieve that?
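To make the two lab-stage starting points concrete – ‘correlation’ hunting versus outlier detection – here is a minimal sketch on fabricated toy data (illustrative only; the column names and thresholds are my assumptions, not anything from an actual project):

```python
# Sketch of the two typical lab-pilot starting points:
# (a) mining pairwise 'correlations' in a pile of data, and
# (b) crude z-score outlier detection.
# Toy data is fabricated; nothing here is a production pipeline.
import numpy as np

rng = np.random.default_rng(42)

# Two loosely related columns plus noise (assumed, for illustration).
x = rng.normal(size=200)
y = 0.8 * x + rng.normal(scale=0.5, size=200)
data = np.column_stack([x, y])

# (a) Pattern hunting: pairwise Pearson correlations across columns.
corr = np.corrcoef(data, rowvar=False)

# (b) Outlier detection: flag rows more than 3 sigma from the column mean.
z = np.abs((data - data.mean(axis=0)) / data.std(axis=0))
outliers = np.where((z > 3).any(axis=1))[0]

print(corr[0, 1])    # strong positive correlation, by construction
print(outliers.size)  # at most a handful, by chance
```

Either output is trivially obtained in the lab; the paragraph’s point is that nothing here says what business decision the correlation or the flagged rows should feed, nor what it costs to harden this throwaway script into a module that keeps learning in production.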
Or will the point application remain just that, never even turned into a fully developed little app ..? [Where has the original ‘applet’ gone ..?]
So, what would happen if, after a lab pilot, a full business case were developed ..? I could go on and on about how a demand pull should drive development, not some shy little supply-push’let by a breaking-voiced nerd… [no offense intended]
Oh well, this:
[A design trick looking for an application; Zuid-As Amsterdam]