We work hard to maintain a list of standardized fields that are available to our customers. However, keep in mind that users may encounter data in NTerminal that is not yet documented. Please see the Data Access pages for additional details.
The rapid evolution of project development across the digital asset ecosystem outpaces any single company's ability to properly document data as it is created. More often than not, developers make their products available before writing a manual on how to use them. We built NTerminal to keep up with constantly changing data stream formats so that it can capture data and deliver valuable information to our users as quickly as possible.

The ability to consume large amounts of poorly structured data is what sets Big Data frameworks like NTerminal apart from the walled-garden approach to data analytics of the past. Where conventional data platforms rely heavily on API specifications and data reliability from upstream providers, NTerminal can adapt to changes in external dependencies through heuristic-based standardization and error handling. On top of robust data integration, NTerminal also provides powerful tools for users to manipulate incoming data streams and extract valuable information without requiring any coding or development skills.
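To illustrate what heuristic-based standardization with error handling can look like in practice, here is a minimal sketch. The field names, aliases, and the `standardize_trade` function are illustrative assumptions for this example only, not NTerminal's actual schema or pipeline:

```python
from datetime import datetime, timezone

# Hypothetical aliases for a standardized "trade" schema. Upstream feeds
# often name the same field differently, or omit it entirely.
FIELD_ALIASES = {
    "price": ["price", "px", "last", "rate"],
    "size": ["size", "qty", "amount", "volume"],
    "timestamp": ["timestamp", "ts", "time", "created_at"],
}

def standardize_trade(raw: dict) -> dict:
    """Map a raw upstream record onto the standardized schema.

    Missing fields are recorded as errors instead of raising, and
    unrecognized fields are preserved under 'extra' rather than dropped.
    """
    record, errors = {}, []
    for canonical, aliases in FIELD_ALIASES.items():
        for alias in aliases:
            if alias in raw:
                record[canonical] = raw[alias]
                break
        else:
            errors.append(f"missing field: {canonical}")

    # Heuristic timestamp normalization: accept epoch seconds or ISO 8601.
    ts = record.get("timestamp")
    if isinstance(ts, (int, float)):
        record["timestamp"] = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

    # Keep anything we did not recognize so no information is lost.
    known = {a for aliases in FIELD_ALIASES.values() for a in aliases}
    record["extra"] = {k: v for k, v in raw.items() if k not in known}
    record["errors"] = errors
    return record

print(standardize_trade({"px": 64123.5, "qty": 0.25, "ts": 1700000000, "venue": "X"}))
```

The key design choice in this kind of pipeline is that malformed or unexpected input degrades gracefully: records are never silently dropped, and whatever could not be standardized is surfaced alongside the data for downstream inspection.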