Sometimes We Wonder What Intelligence Is Artificial
“If a machine is expected to be infallible, it cannot also be intelligent.”
- Alan Turing
“Shall we play a game?”
- Joshua, WarGames
A total aside that will become common in this space: we’ve been out camping to celebrate our 28th anniversary. How the heck she’s tolerated me that long is a story in itself; THE COMPANY only stood me for 26.5!
We were short on a bottle for the pre-dinner drink. Thirty minutes to the closest liquor store (relax, we needed groceries, too), and we found a Rebel 100 (Lux Row Distillers, Bardstown, KY). It’s made using a traditional wheated recipe. The correspondent and the lovely bride both have a fondness for the grassiness wheat adds, and there’s a little beyond-bourbon sweetness you don’t normally get in a 100-proof pour. The smokiness might come from the barrel, or it might be that the AQI is well into the 200 range because half of Oregon is on fire. Anyway, cheers.
Your Humble Author (YHA) has a fascination with the stock market, mostly because compound interest provides for us, the willfully underemployed. One particular amusement these days is watching all the little Whos in Whoville screaming, “But we have AI!” while being dragged to the dust-speck boiler of earnings results. First off, a quarter of the time it’s not AI; stop lying to your marketing department. We get that Machine Learning is esoteric and hard to explain, but it also generates results you can audit. Another quarter involves using someone else’s chatbot to produce not-very-meaningful results. That remaining half? The development department was looking for some “me time” and figured that a 60% accuracy rate was better than the dartboard they usually get from upper management, so they did an implementation hoping to get Friday afternoons off. There’s some good, but there’s a lot of not good.
AI has been around for a long time; the Turing Test, after all, is named after a really smart guy who passed away in the 1950s. Some of us would probably say that the current rush on AI is due to the accumulation of data, but that’s a little misleading. There’s always been data sloshing around the world. It’s just that very little of it was in the right place, and it certainly was not in the right structure to be scrutinized. If some government-funded project wanted to study global temperatures, it had to do the work of getting the data from every ledger into one storage spot, in one format, with some serious programmers writing serious code to produce meaningful results.
The PC/Internet revolution of the ’90s started the ball rolling on creating a “place” for everything. After a couple of decades of upper management demanding that everything go up into the cloud, we finally solved the access problem, but not the structure one. If that now bigger government-funded project wanted to take a look today, it would have not only the old temperature data but also access to a billion backyard weather stations, all the articles about the prior art, and a serious number of social media memes vamping on a repetitive Nelly tune. So now it’s down to figuring out how to make sense of all the slosh in that giant container.
A more relevant aside than usual: YHA remembers being at an IBM conference a few years back where a Watson executive told the audience that they were tracking potential flu outbreaks in part by monitoring Twitter for people posting the equivalent of, “I feel like crud today.” They were finding outbreaks about ten days faster than the CDC, which was tracking its standard measures like hospital data and pharmacy orders. So this AI stuff CAN work if you know what you’re doing.
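For flavor, here’s the back-of-the-napkin version of that kind of signal. This is emphatically not Watson’s pipeline; the phrase list, thresholds, and data shapes are all made up for illustration. Count the posts each day that sound like the flu, then flag any day that spikes well above its trailing baseline.

    from collections import Counter

    # Hypothetical input: (post_date, post_text) pairs pulled from some social feed.
    FLU_PHRASES = ("feel like crud", "fever and chills", "body aches", "down with the flu")

    def daily_flu_mentions(posts):
        """Count posts per day that contain any flu-ish phrase (crude keyword match)."""
        counts = Counter()
        for post_date, text in posts:
            lowered = text.lower()
            if any(phrase in lowered for phrase in FLU_PHRASES):
                counts[post_date] += 1
        return counts

    def flag_spikes(counts, window=7, multiplier=3.0):
        """Flag any day whose count runs well above the trailing-window average."""
        days = sorted(counts)
        flagged = []
        for i, day in enumerate(days):
            trailing = [counts[d] for d in days[max(0, i - window):i]]
            baseline = sum(trailing) / len(trailing) if trailing else 0
            if baseline and counts[day] > multiplier * baseline:
                flagged.append((day, counts[day], baseline))
        return flagged

The counting is trivial; the hard part, as always, is deciding which posts count and what a spike actually means.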
Anyway, back when we were all calling it Big Data ten years ago, the structure problem was still a limiting factor in the analytics discussion. We were all about the V’s: Volume, Variety, Velocity. The real winning formula in analytics at that point was addressing the Variety by referencing more and more data types in their natural form, which eliminated the need for costly (in money and time) Extract/Transform/Load (ETL) routines. We were right back to those serious programmers and serious code producing meaningful, if directional, results.
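For anyone who hasn’t had the pleasure, the Transform in ETL is where the money and time go: every source layout gets hand-mapped into one canonical schema before anyone can ask a question. A minimal, entirely made-up sketch with two temperature feeds and one target record:

    # Two hypothetical source layouts describing the same kind of reading.
    station_csv_row = {"stn": "PDX-14", "date": "2021-08-12", "temp_f": "101.3"}
    backyard_json = {"sensor_id": "byw-9921", "ts": "2021-08-12T15:00:00", "celsius": 38.5}

    def transform_station(row):
        """Map the legacy station layout onto the canonical record."""
        return {"source": row["stn"], "day": row["date"],
                "temp_c": (float(row["temp_f"]) - 32) * 5 / 9}

    def transform_backyard(reading):
        """Map the backyard-weather-station layout onto the same canonical record."""
        return {"source": reading["sensor_id"], "day": reading["ts"][:10],
                "temp_c": reading["celsius"]}

    canonical = [transform_station(station_csv_row), transform_backyard(backyard_json)]

Multiply that by a few hundred sources and a few dozen schema changes a year, and “just reference it in its natural form” starts to sound like a winning formula.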
Artificial Intelligence is a fundamentally lazy approach to the data problem. Lest you think that’s a condemnation, any good strategist will tell you that the best strategies are the lazy ones; the hard work is in the tactics. The basic approach of AI is to throw it at all the data and see what patterns it finds. Structure is less of an issue when your algos are seeking repetition independent of prior relevance. So all those companies claiming that AI will solve your problems are likely correct in the long term. Except…
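Hold that “except” for a second. Here’s a toy version of seeking repetition independent of prior relevance, again just an illustration, not anybody’s product: dump raw text in, count which word pairs keep showing up, and let the repeats surface on their own, no schema or labels required.

    from collections import Counter
    import re

    def repeated_pairs(documents, top_n=10):
        """Count adjacent word pairs across raw text and return the most repeated ones."""
        counts = Counter()
        for doc in documents:
            words = re.findall(r"[a-z']+", doc.lower())
            counts.update(zip(words, words[1:]))
        return counts.most_common(top_n)

    # No schema, no labels, no opinion about what matters: whatever repeats, wins.
    docs = ["Feel like crud today",
            "I feel like crud and so does the kid",
            "Crud, the flu is going around again"]
    print(repeated_pairs(docs, top_n=3))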
All you data nerds have been thinking: “Dude, you forgot a V a couple paragraphs ago.” You’d be correct. As the data in the cloud continued to grow, we added Veracity to the equation. The sober geeks out there are still sifting the data before they attack it with AI, knowing that garbage in still produces garbage out. Those less experienced, or less aware that tactics are where the hard work hits, are likely to slow their actual decision processes by getting inconsistent results. If you’d like a practical example, hit up any chatbot or AI search engine with a conspiracy theory that goes against your grain, and discover the chaos it creates.
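What the sober geeks are doing before they let the algorithms loose looks, in miniature, something like this. The fields and thresholds are invented for the sketch; the point is that records get vetted, and the rejects get tallied so somebody can audit the garbage that stayed out.

    from collections import Counter

    def sift(records):
        """Keep only records that pass basic veracity checks; tally what was dropped and why."""
        kept, dropped = [], Counter()
        for rec in records:
            if rec.get("temp_c") is None:
                dropped["missing value"] += 1
            elif not (-90.0 <= rec["temp_c"] <= 60.0):   # outside anything Earth actually produces
                dropped["implausible reading"] += 1
            elif not rec.get("source"):
                dropped["no provenance"] += 1
            else:
                kept.append(rec)
        return kept, dropped

Boring, yes. But it’s the difference between an AI that compounds good decisions and one that confidently compounds garbage.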
So financial results won’t actually come just at the behest of “We have AI!” They’ll come when companies build in the tools and training to sort the data and make the sober decisions. We’ll all be able to see this shake out in front of us, even as the new types of data hitting the cloud create a need to do it all over again in a more complex way. As always when dealing with your own personal compounding, caveat emptor, and happy hunting.