From Terminals to Data Infrastructure


SVP of Technology & Product Strategy at FactSet on driving trust through auditability

Interview
Firms are also becoming more sophisticated and want to build their own solutions.

This shift turns financial data vendors from terminal sellers into infrastructure suppliers. The most advanced buy-side and banking firms increasingly want raw data, APIs, and ready-made components they can plug into their own research tools, models, and AI agents, rather than forcing analysts to live inside one desktop application. That is why cloud delivery, embedded charts, and source-linked AI matter as much as the underlying dataset.

  • In practice, building in-house means a hedge fund or bank pipes FactSet data into Snowflake, AWS Data Exchange, or its own applications, then layers internal screens, models, and copilots on top. FactSet supports this with Open FactSet, data feeds, APIs, Cornerstone, and marketplace distribution.
  • The build trend is strongest at the largest firms, but it has limits. Many clients wanted custom platforms, then found that hiring and retaining software and machine-learning engineers was harder than expected, so they moved toward a mix of feeds, APIs, and embedded reporting instead of fully custom stacks.
  • This also explains why the competitive battleground has moved beyond owning a proprietary terminal. FactSet is investing in open delivery and auditable AI so clients can pull trusted data and source-linked answers into their own workflows, while still buying packaged applications where custom engineering is not worth the cost.
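The in-house pattern in the first bullet can be sketched roughly as follows. This is a minimal illustration, not FactSet's actual API: the payload shape, field names, and the idea of flattening records into warehouse-ready rows are all assumptions made for the example.

```python
import json

# Hypothetical vendor payload: in practice this would come from an
# authenticated call to a data feed or API (the schema here is invented).
SAMPLE_PAYLOAD = json.dumps({
    "data": [
        {"ticker": "ACME", "metric": "eps", "period": "2023Q4", "value": 1.42},
        {"ticker": "ACME", "metric": "revenue", "period": "2023Q4", "value": 512.0},
    ]
})

def to_warehouse_rows(payload: str) -> list[tuple]:
    """Flatten vendor JSON into (ticker, metric, period, value) tuples
    suitable for a bulk load into Snowflake or another warehouse."""
    records = json.loads(payload)["data"]
    return [(r["ticker"], r["metric"], r["period"], r["value"]) for r in records]

rows = to_warehouse_rows(SAMPLE_PAYLOAD)
# Internal screens, models, and copilots would then query these normalized rows
# rather than scraping answers out of a desktop terminal.
```

The design point is the separation: the vendor supplies the pipe and the schema, while the firm owns the transformation and everything built on top of it.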

Over the next few years, the winning model is likely to be hybrid. The biggest institutions will keep building internal research and agent workflows on top of external data pipes, while vendors that package the same data into APIs, cloud shares, and auditable AI components will capture more of the value chain than vendors tied mainly to a closed desktop.