When companies begin their search for a data historian or time series database to store large amounts of process data, they often evaluate solutions on features alone. Rather than asking the right questions first, they structure their RFIs and requirement lists around three areas:
- Data connectivity - systems they want to collect data from or serve data to
- Performance benchmarks - capabilities to write and read data quickly
- Reporting tools - dashboards, trending packages, visualizations
However, starting with a list of desired features is a mistake. Before focusing on these items, organizations need to evaluate an enterprise data historian vendor on its commitment to openness, system security, and overall adaptability.
THE FREEDOM OF OPEN
When we describe a data historian as open, do we mean open source? Not at all. Open source speaks only to the availability of the source code for anyone to inspect, modify, or enhance. Openness, by contrast, is a vendor's commitment to providing open, unlicensed methods to move data into and out of its solution. This is best, and most often, demonstrated by supporting data migration over standard, accepted industry protocols. In Canary's case, that means logging data from sources such as OPC and MQTT servers, SQL databases, CSV files, APIs, and other formats.
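Because these are open standards, any client can prepare data for them without vendor tooling. As an illustration only (the file layout, column names, and tag naming below are invented for this sketch, not Canary's actual CSV format), here is how a timestamped CSV export from a process data source might be parsed for ingestion:

```python
import csv
import io
from datetime import datetime

# Hypothetical CSV export of process values; the columns and the
# "Plant1.Line3.Temp" tag name are illustrative assumptions only.
SAMPLE = """timestamp,tag,value
2024-01-01T00:00:00Z,Plant1.Line3.Temp,72.4
2024-01-01T00:00:10Z,Plant1.Line3.Temp,72.6
"""

def parse_rows(text):
    """Yield (datetime, tag, float) tuples ready for historian ingestion."""
    for row in csv.DictReader(io.StringIO(text)):
        # Normalize the trailing "Z" so fromisoformat accepts it on older Pythons.
        ts = datetime.fromisoformat(row["timestamp"].replace("Z", "+00:00"))
        yield ts, row["tag"], float(row["value"])

rows = list(parse_rows(SAMPLE))
print(len(rows))  # 2
```

The point is not the parsing itself but that an open format leaves this work trivial; no SDK stands between the source data and the archive.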
Embracing openness on data ingestion alone is not enough. Platform lock-in is a common problem in our industry: data is held captive in systems and requires complex development work or expensive licensure to free it. It is crucial to focus on how you will share information from your enterprise data historian with third-party applications. Canary believes you should never have to license an SDK or toolkit to query data from the Canary System; you may read or publish data via APIs, MQTT, CSV export, OPC HDA, or JSON over WebSockets with no licensing requirement.
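A read path like JSON over WebSockets is attractive precisely because the messages are plain, inspectable JSON. The sketch below shows the general shape of such an exchange; every field name and the reply structure are assumptions made up for illustration, not the Canary System's actual message schema:

```python
import json

# Hypothetical request a client might frame for a JSON-over-WebSockets
# read session. The "action", "tags", "start", and "end" fields are
# illustrative assumptions, not a documented API.
request = json.dumps({
    "action": "readTagData",
    "tags": ["Plant1.Line3.Temp"],
    "start": "2024-01-01T00:00:00Z",
    "end": "2024-01-01T01:00:00Z",
})

# An imagined server reply carrying timestamp/value pairs per tag:
reply = json.loads("""
{"Plant1.Line3.Temp": [["2024-01-01T00:00:00Z", 72.4],
                       ["2024-01-01T00:00:10Z", 72.6]]}
""")

values = [v for _, v in reply["Plant1.Line3.Temp"]]
print(values)  # [72.4, 72.6]
```

Any language with a JSON library and a WebSocket client can consume data framed this way, which is the practical meaning of "no licensing requirement" on the read side.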
SECURE AND TRACEABLE
All companies want system security, but they rarely start with this principle when evaluating enterprise data historians. Within these database solutions, several areas of vulnerability should be explored; the first few are found in the data logging process. Questions to ask include:
- How are data logger connections initialized?
- Are data logging sessions outbound only?
- Can the historian authenticate and authorize the client?
- What methodology is used to move the data packets?
Once data is archived in the historian, evaluate the security of the database itself. Ensure there are electronic records and audit logs that capture any modification to data archive records and system configurations. Finally, identify what authentication and authorization steps are performed when data is queried from the historian to establish the trustworthiness and permissions of the requesting client, as well as how that data is transported.
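To make the audit-log requirement concrete, the sketch below models a generic append-only audit record. The fields chosen (user, action, target, old and new value) are a common-sense illustration of what such a record should capture, not Canary's actual audit log format:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Generic append-only audit record; the field set is an illustrative
# assumption, not any vendor's actual schema.
@dataclass(frozen=True)  # frozen: entries cannot be altered after creation
class AuditEntry:
    user: str
    action: str       # e.g. "value_overwrite", "config_change"
    target: str       # the tag or setting affected
    old_value: object
    new_value: object
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

audit_log = []

def record(entry):
    """Append-only: entries are added, never edited or removed."""
    audit_log.append(entry)

record(AuditEntry("operator1", "value_overwrite",
                  "Plant1.Line3.Temp", 72.4, 72.5))
print(audit_log[0].action)  # value_overwrite
```

The essential properties to look for in a real historian are the same ones sketched here: entries are immutable, timestamped, attributed to a user, and preserve both the old and new values.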
Within the Canary System, these issues are addressed by following standard best practices while allowing administrators to apply the level of security appropriate for their operation. All data packets, whether being logged to or read from the archive, are encrypted in transit using the latest available version of Transport Layer Security (TLS). Communication is always outbound initiated and, whenever possible, uses a single firewall port. An enhanced audit log provides system administrators with electronic records of configuration adjustments, data value overwrites, and other key changes. Additionally, both logging and reading clients can be required to authenticate and authorize using existing Active Directory or local Windows accounts.
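TLS in transit and outbound-initiated sessions are standard patterns any client can implement and verify. A minimal sketch using Python's standard `ssl` module follows; the hostname and port are placeholders, and nothing here is Canary-specific code:

```python
import ssl

# Client-side TLS setup: certificate verification is on by default,
# and we refuse protocol versions older than TLS 1.2.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Because the client dials out (outbound initiated), the firewall only
# needs one rule: allow outbound TCP to the historian's single port.
# The connection itself would look like this ("historian.example.com"
# is a placeholder, so it is left commented out here):
#
# import socket
# with socket.create_connection(("historian.example.com", 443)) as raw:
#     with context.wrap_socket(raw,
#                              server_hostname="historian.example.com") as tls:
#         tls.sendall(b"...")  # data packets encrypted in transit

print(context.verify_mode == ssl.CERT_REQUIRED)  # True
```

The design choice worth noting is the direction of the connection: when the logger dials out, no inbound port needs to be opened at the data source, which shrinks the attack surface considerably.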
BUILT TO ADAPT
Successful organizations understand that the only constant is change. This is true not only of our processes but of our network architectures, information technologies, analytic requirements, and third-party partnerships as well. When properly deployed, a data historian becomes a central data hub, touching many different parts of your organization and serving many of your teams and applications. Failing to evaluate an enterprise data historian on its ability to adapt to your long-term needs will leave you frustrated and short-changed on your investment. When evaluating for adaptability, question the vendor's business model, its development practices around avoiding technical debt, its architectural flexibility, and its deployment capabilities from the edge to the cloud.
Canary works hard to ensure our solution does not limit your productivity. We do not want cost to keep you from connecting to valuable data sources, so we have decided not to license our data collection software. The Canary development team constantly re-evaluates our solution, monitoring available technologies and deploying them when appropriate, always working to maintain backward compatibility without carrying forward unnecessary technical debt. The result is a solution that stands the test of time while providing a fresh user experience. To further reinforce adaptability, Canary components and services can be installed throughout your operation and are designed to run on the edge, at site, in corporate data centers, or in private and public clouds.
Once a solution passes the open, secure, and adaptable filter, you are ready to shift your focus to questions of performance, reporting tools, and data connectivity. By treating these items as secondary, you ensure that your investment will serve you well for decades, grow as you need, and provide best-in-class security that goes beyond your IT requirements. Remember: features can always be added or improved, but the fundamentals of how a system has been designed are nearly always the limiting factor to your success.