Deep Nexus specializes in research and development (R&D) with a focus on sequential and time-series data. Our expertise lies in handling complex datasets, applying statistical models, and integrating automated systems.
- Financial Markets: Analysis of time-series data and development of trading algorithms.
- Healthcare: Analysis of medical datasets to identify solutions that enable better patient outcomes, while screening out cases where use of a predictive model is impractical.
- Other Industries: Our approach can apply to any number of use cases that involve time-series data, from IoT to environmental monitoring.
Deep Nexus applies the scientific method in the development and integration of its processes:
Financial markets are highly random, but not completely so: statistical methods can identify recurring patterns, and analyses of phase-space dynamics and Shannon entropy support this conclusion.
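As a minimal illustration of the entropy side of that analysis (not Deep Nexus's proprietary tooling), the sketch below symbolizes a return series into quantile bins and compares the Shannon entropy of consecutive symbol pairs against the maximum entropy of a purely random sequence. The mildly autoregressive series is a synthetic stand-in for real market data; a measurable entropy gap indicates structure.

```python
# Illustrative sketch only: estimate the Shannon entropy of a symbolized
# return series and compare it against the maximum for a random sequence.
import numpy as np

rng = np.random.default_rng(42)

def shannon_entropy(symbols: np.ndarray, n_symbols: int) -> float:
    """Shannon entropy (bits) of a discrete symbol sequence."""
    counts = np.bincount(symbols, minlength=n_symbols)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Synthetic returns with a mild autoregressive component, standing in for
# real market data that would come from a data provider.
n, phi = 50_000, 0.3
noise = rng.standard_normal(n)
returns = np.empty(n)
returns[0] = noise[0]
for t in range(1, n):
    returns[t] = phi * returns[t - 1] + noise[t]

# Symbolize into quantile bins, then measure entropy of consecutive pairs.
n_bins = 4
edges = np.quantile(returns, np.linspace(0, 1, n_bins + 1)[1:-1])
symbols = np.digitize(returns, edges)        # values in 0..3
pairs = symbols[:-1] * n_bins + symbols[1:]  # values in 0..15

h_pairs = shannon_entropy(pairs, n_bins * n_bins)
h_max = np.log2(n_bins * n_bins)             # 4 bits for 16 pair symbols
print(f"pair entropy: {h_pairs:.3f} bits vs. max {h_max:.3f} bits")
# Entropy below the maximum indicates structure; for IID data the gap vanishes.
```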
Nevertheless, these patterns can be difficult to extract due to non-stationarity and noise, both of which create problems for most inferential statistical methods. For example, to predict an asset's future state, one can build a dictionary of patterns from market data and calculate their transition probabilities; the results (see the sketch below) illustrate the challenge AI/ML faces in financial markets.
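The following sketch works through that exercise on synthetic up/down moves, which stand in for real returns: each length-k sign pattern becomes a dictionary key, and the counts of what follows each pattern are converted to transition probabilities.

```python
# Minimal sketch of the pattern-dictionary exercise described above,
# using synthetic moves in place of real market data.
from collections import Counter, defaultdict
import numpy as np

rng = np.random.default_rng(0)
moves = np.sign(rng.standard_normal(100_000))  # +1/-1 stand-in for returns

k = 4  # pattern length
transitions: dict[tuple, Counter] = defaultdict(Counter)
for t in range(k, len(moves)):
    pattern = tuple(moves[t - k:t])
    transitions[pattern][moves[t]] += 1

# Probability of an "up" move conditional on each observed pattern.
for pattern, counts in sorted(transitions.items())[:5]:
    total = sum(counts.values())
    p_up = counts[1.0] / total
    print(pattern, f"P(up) = {p_up:.3f}  (n = {total})")
# For genuinely random data every P(up) hovers near 0.5; on real, noisy,
# non-stationary prices the estimates drift over time, which is exactly
# the difficulty for statistical inference described in the text.
```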
Our proprietary production system, known as Quantum, applies a statistically rigorous approach to uncover persistent, market-wide signals that are obscured in noisy, non-stationary price data.
Scaling Quantum primarily involves optimizing computational performance. Resources are directed at refining trade execution systems, rather than chasing elusive patterns in noisy data.
The system typically enters a combination of long and short positions simultaneously across a diversified ensemble of assets. It responds to intraday data in real time to produce exceptional returns with low risk.
While Quantum does utilize Machine Learning (ML), it avoids the pitfalls associated with traditional alpha factor discovery and the limitations of many common ML approaches.
For example, alpha factors are often derived from price, fundamental, and alternative data, and are used to develop signals that predict asset price movements. As other market participants discover these factors and trade on them, their profitability is eventually diminished and the search for new factors becomes never-ending. Because Quantum exploits persistent behaviors in financial markets, its edge cannot be arbitraged away.
In recent years, advancements in Artificial Intelligence (AI) have opened new opportunities. Nevertheless, these models suffer from the aforementioned non-stationarity of financial markets. Quantum’s approach addresses this issue in its data and signal processing layers.
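Quantum's data and signal processing layers are proprietary and not described here. Purely as an illustration of one common, generic mitigation for non-stationarity, the sketch below applies a rolling z-score so that a series is always measured relative to its recent regime rather than to a fixed historical baseline.

```python
# Hedged illustration (not Quantum's actual method): standardize a
# non-stationary series against a rolling window of recent history.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Synthetic price series with a drifting level (non-stationary).
prices = pd.Series(np.cumsum(rng.standard_normal(2_000)) + 100.0)

window = 250
rolling_mean = prices.rolling(window).mean()
rolling_std = prices.rolling(window).std()
zscore = (prices - rolling_mean) / rolling_std  # locally standardized signal

print(zscore.dropna().describe())  # roughly zero-mean, unit-variance
```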
Another challenge is that most financial market optimization methods, including AI, tend to overfit the model to its historical training data. As a result, the model’s performance quickly degrades in live trading.
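A hedged illustration of this failure mode: fit an over-parameterized linear model on pure-noise "returns". The in-sample fit looks attractive even though there is nothing to predict, while the out-of-sample fit collapses, mirroring the degradation seen in live trading.

```python
# Illustrative sketch of overfitting on data that contains no true signal.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.standard_normal(600)  # pure noise: nothing to predict

n_lags = 50  # deliberately too many features for the sample size
X = np.array([returns[t - n_lags:t] for t in range(n_lags, len(returns))])
y = returns[n_lags:]

split = 400  # "historical training data" vs. "live trading"
beta, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

print(f"in-sample R^2:     {r2(y[:split], X[:split] @ beta):+.3f}")
print(f"out-of-sample R^2: {r2(y[split:], X[split:] @ beta):+.3f}")
# In-sample R^2 is comfortably positive despite zero true signal; the
# out-of-sample R^2 is near zero or negative, the hallmark of overfitting.
```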
This problem also applies to most portfolio optimization methods, including the Nobel Prize-winning Modern Portfolio Theory (MPT) and its derivatives. Additionally, MPT assumes that future returns can be accurately predicted, and so it fails to live up to expectations in the real world.
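To see why, note that the unconstrained mean-variance solution is proportional to the inverse covariance matrix times the expected-return vector, so the weights are driven directly by the return forecast. The sketch below, using hypothetical illustrative inputs, shows how a small estimation error in expected returns can reallocate much of the portfolio.

```python
# Hedged sketch of MPT's sensitivity to the expected-return input.
import numpy as np

rng = np.random.default_rng(3)
n_assets = 5

# Illustrative inputs: a random positive-definite covariance matrix and a
# hypothetical expected-return vector.
A = rng.standard_normal((n_assets, n_assets))
sigma = A @ A.T / n_assets + np.eye(n_assets) * 0.05
mu = np.array([0.04, 0.05, 0.06, 0.05, 0.04])

def mv_weights(mu, sigma):
    """Unconstrained mean-variance weights, normalized to sum to 1."""
    raw = np.linalg.solve(sigma, mu)
    return raw / raw.sum()

w_base = mv_weights(mu, sigma)
w_bumped = mv_weights(mu + rng.normal(0, 0.01, n_assets), sigma)

print("base weights:  ", np.round(w_base, 2))
print("bumped weights:", np.round(w_bumped, 2))
# A ~1% perturbation of the return forecast can reallocate large fractions
# of the portfolio, one reason MPT disappoints when returns must be forecast.
```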
Quantum avoids modeling assumptions that depend on predicting future returns. Instead, the modeling focuses on stable statistical relationships between assets.
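Quantum's actual relationships are proprietary; as a minimal stand-in for what a "stable statistical relationship" means in practice, the sketch below checks whether the correlation between two synthetic return series holds up across rolling windows, without forecasting either asset's return.

```python
# Illustrative sketch: test the stability of a cross-asset relationship.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
n = 2_000
common = rng.standard_normal(n)  # shared driver linking the two assets
asset_a = 0.7 * common + 0.3 * rng.standard_normal(n)
asset_b = 0.7 * common + 0.3 * rng.standard_normal(n)

returns = pd.DataFrame({"A": asset_a, "B": asset_b})
rolling_corr = returns["A"].rolling(250).corr(returns["B"])

print(rolling_corr.dropna().agg(["mean", "std", "min", "max"]))
# A high mean and low dispersion across windows indicates a relationship
# stable enough to build on without predicting either asset's return.
```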
To implement Quantum, a portfolio of at least 20 liquid assets is recommended, with the flexibility to scale further as needed. Suitable asset types include equities, futures, and tokenized assets, particularly via perpetual swaps.
The system benefits from periods of high volatility, and leverage can be applied or adjusted based on the system’s trade signals. Trades are typically executed on an intraday basis; a quant prime broker is recommended.