Honestly, the rabbit-holes a single word can take you down. Like bits of grit in my eye (ear?), I find any unfamiliar coinage impossible to resist – especially when a word that began life as a total fiction somehow sticks (like ‘chortle’, or ‘meme’); and more especially when it attaches itself (well, finds attachment) to a specific industry.
Today’s little bundle of transposable meaning came via a paper on deep learning, where researchers appear to conclude that certain algorithmic models
“will initially memorise the training data, but after a long time will suddenly learn to generalise to unseen data.”
Or begin to grok.
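For the curious, that memorise-then-generalise pattern shows up in training curves: training accuracy saturates early, while validation accuracy lags far behind before suddenly catching up. A toy sketch (my own illustration, not from the paper, with entirely hypothetical numbers) of measuring that gap:

```python
def first_step_above(history, threshold=0.95):
    """Return the first training step at which accuracy crosses `threshold`, or None."""
    for step, acc in enumerate(history):
        if acc >= threshold:
            return step
    return None

def grokking_gap(train_acc, val_acc, threshold=0.95):
    """Steps between memorisation (train saturates) and generalisation (val catches up)."""
    t = first_step_above(train_acc, threshold)
    v = first_step_above(val_acc, threshold)
    if t is None or v is None:
        return None  # one of the curves never crossed the threshold
    return v - t

# Hypothetical curves: the model memorises at step 2 but only
# generalises at step 8 -- a gap of 6 steps.
train = [0.3, 0.7, 0.99, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
val   = [0.1, 0.1, 0.12, 0.15, 0.2, 0.3, 0.5, 0.8, 0.97, 1.0]
print(grokking_gap(train, val))  # -> 6
```

The larger that gap, the more grok-like the run: a long stretch where the model looks like a mere memoriser before it ‘gets it’.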
Intrigued, I had to look it up: when you’re grokking, you’re understanding something thoroughly and intuitively. So the learning machines ‘show’ signs of intuitive thinking and respond accordingly. More worryingly, it also pins my design process, but that’s more likely wishful thinking. My process is more akin to blundering about.
What’s really odd is that the made-up word (from Robert A. Heinlein’s 1961 science fiction novel ‘Stranger in a Strange Land’ – which was also turned into a pop-tastic tune by the English heavy rock combo Iron Maiden; how culture feeds on itself is a marvel) was supposed to be Martian (the protagonist being from Mars).
So via science fiction we find ourselves using language to try to describe actual scientific exploration.
Anyway, I’m quietly hoping the grok here turns out to be a false positive, and that the AI in question only appears to display the attribute in question: i.e. sentience.
Maybe it’s time to call on science’s investigative powers rather than inference – The First Law of Experiment Design: you are not measuring what you think you are measuring.
Second Law of Experiment Design: if you measure enough different stuff, you might figure out what you’re actually measuring. Might turn out to be a load of old grok.