Build vs Buy: 6 Pitfalls to Avoid


Private market investors across the globe are asking themselves a similar question: “How do we transform our internal investment data, which is quite frankly a liability today, into an asset?”

Chief technology officers at these asset managers have countless ways to try to answer the question, depending on a host of factors. Where is the data coming from? How will the data be used? How can we preserve data integrity? How can we keep our data secure?

These questions inevitably lead to this commonplace conundrum for CTOs: Should we build our own solution or buy a data management platform?

Here at Mercatus, we know how we would answer that question, but that’s beside the point. Instead, we’ll help you reach your own answer with some advice for fund managers and GPs who are evaluating a piecemeal approach: stitching together generic database and business intelligence point solutions, such as Snowflake and Tableau, to meet their needs. We find this trend alarming for a number of reasons:


Investment structures are getting more complex, and data integrations have to keep up.

Organizations across the corporate spectrum are dealing with a growing number of structured and unstructured data sources, and private market investors are no different. For an investor dealing in complex investment structures across asset classes and geographies, a handful of power-user licenses for Tableau or Power BI isn’t enough. To relieve the manual data collection processes currently weighing down your team, you’ll also need to select, implement, and maintain data integration tools such as Matillion or Alteryx, which can add tens of thousands of dollars in costs.


Why hire analysts for data modeling when software can do it for you?

When was the last time you could update a portfolio dashboard or monthly operating report from a single source Excel file? The reality is that your team’s analysts likely spend hours every week just joining and transforming disparate datasets to create meaningful reports for your management team. While today’s BI leaders such as Tableau and Power BI offer some data modeling capability, they are not built to consistently and efficiently transform the large volumes of data aggregated across your portfolios, or to connect to diverse data formats. The result: countless hours spent waiting for reports to load, and even more spent building and updating data models whenever the data schema changes.
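To make that pain concrete, here is a minimal sketch of the kind of join-and-transform step an analyst repeats by hand each reporting cycle. The fund names, columns, and derived metric are purely illustrative assumptions, not any particular platform’s data model:

```python
import pandas as pd

# Hypothetical inputs: two "disparate" sources an analyst must merge
# by hand each month before a report can be built.
valuations = pd.DataFrame({
    "fund": ["Fund I", "Fund I", "Fund II"],
    "quarter": ["2023Q1", "2023Q2", "2023Q2"],
    "nav": [120.0, 135.0, 80.0],
})
commitments = pd.DataFrame({
    "fund": ["Fund I", "Fund II"],
    "committed": [100.0, 100.0],
})

# One of many manual steps: join the sources, then derive a metric.
report = valuations.merge(commitments, on="fund")
report["nav_to_committed"] = report["nav"] / report["committed"]
print(report[["fund", "quarter", "nav_to_committed"]])
```

Every schema change upstream (a renamed column, a new fund) forces someone to revisit scripts like this by hand, which is exactly the maintenance burden described above.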


A web of disparate tools can’t provide data quality.

How often do you check a report only to find that an investment multiple or IRR is wildly inaccurate? Yes, someone in accounting fat-fingered a number in an Excel spreadsheet, and now you’re going to look like an idiot in front of your boss or, even worse, an investor. The world has moved past relying on manual, human checks on reports and datasets to ensure data quality you can trust. Unfortunately, a BI tool like Tableau will never give you peace of mind about the quality of your data, because it doesn’t know your data. We’ve also found there is no one-size-fits-all data quality solution: measuring, improving, and ultimately trusting the quality of your data requires an in-depth understanding of specific investment datasets, which generic tools simply aren’t equipped to provide.
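An automated check doesn’t have to be elaborate. Below is a minimal sketch of a range-based validation rule in Python; the field names and plausibility thresholds are our own illustrative assumptions, not anyone’s actual methodology:

```python
def validate_record(record):
    """Return a list of data-quality issues found in one investment record."""
    issues = []
    moic = record.get("moic")  # multiple on invested capital
    irr = record.get("irr")    # internal rate of return, as a decimal
    # Flag values outside an (assumed) plausible range before they
    # ever reach a dashboard or an investor report.
    if moic is None or moic < 0 or moic > 50:
        issues.append(f"{record['id']}: implausible MOIC {moic!r}")
    if irr is None or irr < -1.0 or irr > 10.0:
        issues.append(f"{record['id']}: implausible IRR {irr!r}")
    return issues

records = [
    {"id": "deal-001", "moic": 2.1, "irr": 0.18},
    {"id": "deal-002", "moic": 210.0, "irr": 0.15},  # fat-fingered multiple
]
problems = [issue for r in records for issue in validate_record(r)]
for p in problems:
    print(p)
```

The hard part is not the code but the domain knowledge: knowing what “plausible” means for each fund, asset class, and metric, which is precisely what a generic BI tool lacks.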


A piecemeal data management strategy is a consultant’s dream (and your nightmare).

It’s 11 PM and you’ve just realized you need the IT department, or the consultant you’ve hired, to adjust a report due tomorrow morning. Great. The challenge with stringing together individual point solutions for data warehousing, preparation, and distribution is that these tools, while powerful, aren’t designed for business users. An investment in a robust data strategy should enable, not hamper, your team’s ability to access and deliver the insights that drive critical, time-sensitive investment decisions. A piecemeal data management strategy is a consultant’s dream, because it means you’ll keep coming back for support and keep racking up professional services hours.


Multiple tools will only exacerbate security risks.

In a world where data security and risk mitigation are pillars of any successful data strategy, it is imperative to place robust, reliable frameworks around the permissions that govern a user’s ability to view and edit not only raw data points but also business logic. As soon as you bring multiple tools into your IT infrastructure, enforcing consistent data security across all of them becomes an unwieldy proposition. The alternative, giving your entire organization access to the firm’s data, is enough to make any compliance officer shake in their boots.
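One way to keep permissions manageable is a single, central access model that every tool consults, rather than separate settings scattered across each product. A minimal sketch, with hypothetical roles and actions:

```python
# Hypothetical role-based access model. Role and action names are
# illustrative; a real deployment would map these to your firm's
# actual org structure and data entitlements.
ROLE_PERMISSIONS = {
    "analyst": {"read_data"},
    "manager": {"read_data", "edit_data"},
    "admin":   {"read_data", "edit_data", "edit_logic"},
}

def is_allowed(role, action):
    """Check one central permission table instead of per-tool settings."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("analyst", "edit_logic"))  # analysts cannot change business logic
```

The point of the sketch is the shape of the solution: one authoritative permission table, so adding a tool doesn’t mean re-implementing, and re-auditing, access rules from scratch.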


Don’t let disaster recovery keep you up at night.

We all like to think about the happy state: how will my data drive the performance of my funds to unprecedented heights? And that’s where your focus should be. But the reality of natural disasters, cyber threats, and fragile file-based data transfers means that a centralized data infrastructure must include a strategy to retain data and to recover from corruption and loss in minutes, not days. Your data strategy needs controls and procedures that ensure reliability, stability, and resiliency, so that critical business processes can continue after a breach, a disaster, or a bulk data ingestion error that leaves your dataset compromised.


Still not convinced? Contact us for a demo today.


VIDEO: Misconceptions surrounding data architecture
