A 3-Part Series: Thriving Across the Worlds of Finance, Business, Data, and Technology
Part 1: Framing the problem - why silos are not viable anymore
Welcome to the Data Score newsletter, your go-to source for insights into the world of data-driven decision-making. Whether you're an insight seeker, a unique data company, a software-as-a-service provider, or an investor, this newsletter is for you. I'm Jason DeRise, a seasoned expert in the field of alternative data insights. As one of the first 10 members of UBS Evidence Lab, I was at the forefront of pioneering new ways to generate actionable insights from data. Before that, I successfully built a sell-side equity research franchise based on proprietary data and non-consensus insights. Through my extensive experience as a purchaser and creator of data, I have gained a unique perspective that allows me to collaborate with end-users to generate meaningful insights.
Finance/Business Professional: “This isn’t what I asked for.”
Data/Technology Professional: “You’re moving the goal posts.”
Finance/Business Professional: “It should be easy to make this improvement.”
Data/Technology Professional: “This will take many more months of work and still might not be possible.”
Finance/Business Professional: “This is not usable as is.”
Data/Technology Professional: “We wasted our time on this.”
Both: “How did this happen?”
These one-liners back and forth could be from any data project that has broken down. While I’m not quoting any specific conversation, I suspect many readers of The Data Score are filling in the blanks with their own past difficult experiences, moments when stakeholders across finance, business, data, and technology realized they were not actually aligned… unfortunately, only after much time and hard work had already been spent. This scenario speaks to the universal nature of this occurrence across business, finance, data, and technology professionals trying to make an impact.
In this multi-part entry in The Data Score newsletter, we’re going to explore how to break down the silos between data, technology, and business to more easily get to the good part of a data insights practice: actionable, accurate decisions.
Part 1: Framing the problem: why silos are not viable anymore.
Original silos, where the problems to be solved were mostly self-contained.
Problem complexity is accelerating.
Part 2: What goes wrong when the worlds collide?
Talking past each other
Smartest person in the room syndrome
Fear of failure
Part 3: How to break down the silos and align around outcomes
Build empathy.
Create proactive rituals.
Top-down culture
What I hope to offer here is a benchmark of the ideal for setting up the best possible outcomes when the worlds of business and finance come together with the worlds of technology and data.
I’ve seen the good, the bad, and the ugly in my experience working across all these worlds. But I want to be super clear on this: I don’t have all the answers. I’m human, so I make mistakes too. I can’t promise that I’ve been perfect at following these ideals in the past. Nor can I promise that in the future there will not be times when communication will break down while I’m involved in the project.
The benchmark I put forward is to stimulate conversation. I’m keen to hear how everyone who reads this newsletter feels. Finance professionals, business leaders, data professionals, and technologists: how do you break down these silos to generate positive outcomes, even in difficult situations?
Part 1: Framing the problem: why silos are not viable anymore
A review of prior silos
Let’s go back 25 years to see the old silos of the financial world, data, and technology.
Data and technology
In the not-so-distant past (roughly 25 years ago), technologists and data experts worked to solve “straightforward” problems with a limited array of outcomes. For example:
Train a computer to defeat the world chess champion (Deep Blue vs. Kasparov in 1997).
Enable real-time e-commerce transactions by connecting product databases to order execution and shipping systems.
Build a search engine for DVD rentals on Netflix.
Generate printable turn-by-turn directions between locations entered by the user.
Yet tasks have become more complex and nuanced over time.
Recommendation engines for suggesting products to purchase or movies to watch have continued to develop over the last 10 years.
Waze’s software adjusts travel directions in real time based on the vehicle’s position and the traffic caused by other vehicles, and provides real-time alerts.
In 2016, AlphaGo, developed by DeepMind, defeated the world champion Go player, Lee Sedol.
Technology continues to assume more advanced, high-value tasks previously performed by humans.
Exponential growth
The “easy and obvious” wins have been captured, and ahead are more complex problems to solve with data and technology. To be fair, this is always true with technology. No matter where you are on the exponential curve, the past looks easy, and the future looks impossible. Those “easy and obvious” wins weren’t actually “easy and obvious” in the late 1990s and early 2000s.
So what is different now?
The difference is that the application of data and technology is moving further up the value curve, requiring more expertise from multiple domains to come together to make complex decisions in real time, leveraging more advanced technology.
Borrowing examples from the auto industry, the challenge is no longer as simple as automatically activated headlights in darkness (which at the time they were introduced was pretty cool tech). Now, we are grappling with autonomous vehicles navigating left turns through busy (US) intersections on a green light, with oncoming traffic rushing to get through before the light turns red, and avoiding the bicycles and pedestrians who are ignoring the traffic rules. This complex problem features unpredictable inputs and tremendously high stakes for failure.
Financial markets silos
In the not-so-distant past (20–25 years ago), a senior investment professional was armed with a phone, financial and market data, Excel, brilliant associates, and their own experience to make sense of the available information and what it meant for financial markets and their investments.
As the volumes of information and data grow exponentially, the ability to process and understand the greater meaning and implications of the information increasingly requires expertise in data and technology to unlock the goal of data-driven insights.
The financial market world was quick to adopt technology for:
Executing transactions with greater speed and precision.
Monitoring portfolio returns and risk.
However, there has been generally less investment in technology to improve the ability to make longer-term predictions beyond the success of a trade or provide high-impact advice and recommendations. The human is still driving the proverbial car, with some automated aspects of driving.
Corporate decision-making is still human-led
Similarly, in the corporate world, business intelligence practices helped provide more data and insights on how the business was operating and assessing the performance of decisions. However, decision-making is still mostly done by humans.
The investment decision process, whether it's the buyside, sellside1, or C-suite2 of a corporation, is inherently still human, with some automation to help the process.
But the decision-making ability of software is allowing for more high-value decisions to be automated, which is pushing the worlds of business, finance, data, and technology closer together.
A collision course between worlds in fundamental analytics
The worlds of fundamental analysis3 in financial markets, data, and technology have been on a constant path of integration over this period.
By the mid-2000s, some advanced analytics leveraged data and technology to gain proprietary investment insights.
Early geospatial analysis of retail trade areas used Microsoft Access to calculate distances between locations.
Early transaction data analytics utilized point-of-sale data to assess Consumer Packaged Goods (CPG) pricing and competition.
Initial web scraping projects relied on VBA4 scripts.
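The trade-area distance calculations mentioned above can be sketched in a few lines today. Below is a generic haversine (great-circle) distance function in Python, not a reconstruction of the original Microsoft Access implementation; the coordinates are hypothetical store locations used purely for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical example: distance between a store in Manhattan and one in Newark
d = haversine_km(40.7128, -74.0060, 40.7357, -74.1724)
print(f"{d:.1f} km between the two trade areas")
```

The same idea, applied pairwise across a table of store locations, is all that early retail trade-area analysis required, which is why a desktop database could handle it.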
In addition, quantitative research continued to build momentum in practical use by leveraging large amounts of financial data to explain and predict periods of alpha generation.
However, the majority of the financial markets continued to operate the way they always have, with bright people synthesizing information into an Excel-based financial model.
The evolving nature of financial markets may explain why “Bright People + Excel” endured over data and technology solutions. The market is a never-ending puzzle with no endgame or finite set of outcomes. Just when one question is answered, the answer itself raises a new question, further adjusting the price of the investment asset.
Despite the open-ended, never-ending mystery of the market, the gap is closing rapidly between technology’s capability and the financial market outcomes needed. But it still needs a human in the loop.
And as more focus is placed on data and technology in the broader world, there is also pressure on businesses to have an appropriate technology and data strategy to keep up. Currently, executive boards across industries are scrambling to answer what their AI strategy is.
Vin Vashishta's Substack, courses, and new best-selling book are great resources.
Different worlds of complexity and high-impact skills
This is creating a high-risk environment where business, data, and technology intersect. Crafting an investment recommendation relies on constantly evolving contextual factors across domains.
A personal recommendation for a movie to watch this weekend is somewhat of a fixed outcome. Netflix doesn’t need advanced data science wizardry to figure out that when a new Adam Sandler movie comes out, it's a 99% match with my viewing history and should be recommended to me. This problem leverages a fixed system with all the puzzle pieces needed, including a huge sample size to draw from across many users like me.
By contrast, the financial markets are infinitely open because of how new information is discovered and reflected in the share price of an asset. In the corporate world, decisions lead to reactions by competitors, regulators, and customers, which in turn lead to new decisions.
This poses risks as few individuals expertly span institutional investing5, machine learning models, processing technologies, and data inputs.
A typical data scientist doesn’t have the subject-matter expertise to understand the nuances of the financial markets, such as intuitively grasping why a high-quality company can underperform the market or why a company’s share price can rise even when it missed consensus earnings expectations6.
A typical financial market professional doesn’t understand why machine learning models could be overfit… but can be very motivated to use them in practice by seeing charts that show amazing back-tested alpha7.
However, issues arise when the model is put into production and, because it indicates significant opportunities relative to consensus, investments are made without subjecting the model to rigorous scrutiny. This is when significant losses occur.
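The overfitting trap described here can be illustrated with a toy backtest: generate many pure-noise “signals,” pick the one with the best in-sample fit, and the apparent alpha typically evaporates out of sample. This is a hedged sketch using simulated data, not a real trading model; every variable and number below is illustrative.

```python
import random

random.seed(0)

# Simulated daily "returns" that are pure noise: no real signal exists.
n, n_signals = 250, 200
returns = [random.gauss(0, 0.01) for _ in range(n)]
signals = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n_signals)]

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# "Backtest": pick the signal with the best fit on the first half of history...
half = n // 2
best = max(signals, key=lambda s: abs(corr(s[:half], returns[:half])))

# ...then see how it holds up on the unseen second half.
in_sample = corr(best[:half], returns[:half])
out_sample = corr(best[half:], returns[half:])
print(f"in-sample corr:  {in_sample:+.2f}")  # looks like alpha on the back-test
print(f"out-of-sample:   {out_sample:+.2f}")  # typically much weaker: it was noise
```

Searching 200 noise series for the best historical fit almost guarantees an impressive-looking back-test chart, which is exactly why a model needs rigorous out-of-sample scrutiny before capital is put behind it.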
Building bridges between the worlds of business, data, and technology to increase the overlap is critical to maximizing the return of a data and technology strategy by business decision makers. But first, let’s diagnose where the relationship goes wrong, which we will do in Part 2 next week.
Ahead of next week’s newsletter: Data Score readers, chime in with the challenges you’ve seen when business, finance, tech, and data experts come together to work on a product or project.
Links: Part 1 | Part 2 | Part 3
- Jason DeRise, CFA
Buyside vs Sellside: Buyside typically refers to institutional investors (hedge funds, mutual funds, etc.) who invest large amounts of capital, and Sellside typically refers to investment banking and research, which provide execution and advisory services to institutional investors.
C-suite: Top executives at a company - CEO, CFO, COO, etc.
Fundamental analysis: Assessing investment assets based on underlying economic and financial factors.
VBA: Visual Basic for Applications. A programming language embedded in Microsoft applications like Excel.
Institutional investing: Asset management and investing on behalf of clients by firms like mutual funds, pensions, endowments. (aka the Buyside)
“The Consensus” is the average view of the sell-side for a specific financial measure. Typically it refers to Revenue or Earnings Per Share (EPS), but it can be any financial measure. It is used as a benchmark for what is currently factored into the share price and for assessing whether new results or news are better or worse than expected. However, it is important to know that sometimes there’s an unstated buyside consensus that is the better benchmark for expectations.
Alpha: A term used in finance to describe an investment strategy's ability to beat the market or generate excess returns. A simple way to think about alpha is that it’s a measure of the outperformance of a portfolio compared to a pre-defined benchmark for performance. Investopedia has a lot more detail https://www.investopedia.com/terms/a/alpha.asp