Natural language understanding (NLU) and natural language processing (NLP) - the holy-grail subfields of machine learning (self-supervised and unsupervised deep learning in particular) - are still years behind human-level capability, but unlike a single person's brain they are getting faster and they scale. Chips are getting cheaper, cloud compute for storing and processing data is getting more affordable, and scale and talent now drive the data crunching, using tried-and-tested old algorithms as well as new ones.
The number of data science jobs advertised at dinosaur banks and new fintechs is increasing too. It's how Airbnb, Netflix, Google, etc. got their edge - by working with lots of data since the beginning. Astronomers and physicists were kind of the original data scientists, and it's not uncommon to find them at unicorn tech companies.
Identifying trends - X will happen with a given probability - is kind of cool. And Google, Facebook, Alibaba, Twitter, university researchers, and many others have successfully picked up on what is going on by mining the news and social media chatter.
Traditional equity/bond analysis, meanwhile, is largely stuck in the 1950s, despite the internet and spreadsheet modelling. If you have no access to UBS's latest research on, say, Virgin Media, don't worry. I have seen many of those reports, most of them suck, and a lot of material gets redacted before the final print anyway. That kind of research is not independent.
The good news is, despite the AI hype (beware of snake oil salesmen), new tools are popping up and over time they will get better, just like Google Maps is getting better at figuring out where you are as a blue dot (though I doubt Siri will ever improve):
Data-science companies like Arkera and New York-based Sigmoidal say they can solve this problem using machines that learn as they go to dredge through tens of thousands of news articles, government statements and social media accounts in languages as varied as Spanish, Arabic and Chinese.
“The system can give an edge over traditional analysts working for financial institutions,” said Bardonski, who said typical reports will include charts on sentiment, key word statistics and short written summaries. “Instead of getting 100,000 news articles, clients can get all the insights on one page.”
Note: beware of snake oil salesmen. The quoted article is from https://www.bloomberg.com/news/articles/2019-08-06/robots-are-solving-banks-very-expensive-research-problem
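To make the "100,000 articles on one page" idea concrete, here is a toy sketch of that kind of summary: naive lexicon-based sentiment plus keyword counts over a handful of made-up headlines. The word lists and headlines are my own assumptions for illustration - nothing to do with what Arkera or Sigmoidal actually run.

```python
from collections import Counter
import re

# Tiny illustrative lexicons -- a real system would learn these or use a
# curated resource; the words here are assumptions for the sketch.
POSITIVE = {"beats", "growth", "upgrade", "strong", "record"}
NEGATIVE = {"miss", "downgrade", "weak", "loss", "probe"}

def tokenize(text):
    """Lowercase a headline and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def summarize(headlines):
    """Collapse many headlines into one-page-style stats:
    a net sentiment score in [-1, 1] and the most frequent keywords."""
    tokens = [t for h in headlines for t in tokenize(h)]
    counts = Counter(tokens)
    pos = sum(counts[w] for w in POSITIVE)
    neg = sum(counts[w] for w in NEGATIVE)
    scored = pos + neg
    sentiment = (pos - neg) / scored if scored else 0.0
    return {"sentiment": sentiment, "top_keywords": counts.most_common(5)}

# Made-up sample headlines, not real news.
headlines = [
    "Acme beats estimates on record cloud growth",
    "Analysts upgrade Acme after strong quarter",
    "Regulator opens probe into Acme accounting",
]
report = summarize(headlines)
```

Real vendors use far richer models (named entities, multilingual pipelines, learned sentiment), but the shape of the output - a score, keyword statistics, a short summary - is the same.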
Note 2: the DE Shaw, AIG, UBS and McKinsey types have been present at AI research conferences, because they, like Google and Facebook, are attempting to implement and profit from computer vision (e.g. how busy Walmart parking lots are) and language-processing techniques (e.g. earnings-call sentiment analysis) to get an edge in a fast-moving world.
This does not mean common folks and amateurs are at a disadvantage - we can be the tortoises and invest for the long term. Plus, how many times has Wall St been wrong? For armchair analysts, one option is discovering tools that are cheap or free. Another, if you are curious enough, is to study financial analysis through e.g. Udemy and learn to code and crunch data with Python or R. The world of open-source software is huge, and there are many tutorials on GitHub, Medium and YouTube. Some companies take what is free, repackage it, then build walls around their products.
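As a taste of the data crunching those Python tutorials start with, here is a minimal moving-average calculation over a price series - the sort of exercise any armchair analyst can run in plain Python, no paid tools required. The closing prices are made-up sample data, not real quotes.

```python
def moving_average(prices, window):
    """Simple moving average; the first window-1 slots are None
    because there is not yet enough history to average over."""
    out = [None] * (window - 1)
    for i in range(window - 1, len(prices)):
        out.append(sum(prices[i - window + 1 : i + 1]) / window)
    return out

# Made-up daily closes for illustration.
closes = [100.0, 101.5, 99.8, 102.3, 103.1, 101.9, 104.2]
sma3 = moving_average(closes, 3)
```

Libraries like pandas do this in one line (`Series.rolling(3).mean()`), but writing it by hand first is exactly the kind of cheap, self-taught practice the tutorials encourage.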