In late August 2012, the Infocomm Development Authority (IDA) unveiled the latest edition of its Infocomm Technology Roadmap (ITR), charting technology trends expected to figure strongly over the next three to five years.
Computerworld Singapore is using the ITR as the foundation for its year-ahead feature, in which heavyweights of the enterprise IT space share their perspectives on the Roadmap, along with the industry developments and customer demands they foresee in the specific themes these technology giants operate in.
In the eighth part of a regular feature, Fortinet talks about its 2013 product and services roadmap, industry developments, customer demands and case study scenarios. The spokesperson is George Chang, Regional Director, Southeast Asia & Hong Kong, Fortinet.
The combined topics of Big Data and network security often produce two divergent schools of thought among IT professionals: categorical denial that Big Data should be treated any differently from existing network infrastructure, and the opposite response of over-engineering the solution given the actual (or perceived) value of the data involved.
Without oversimplifying, the discussion can be divided into two discrete sections: securing Big Data itself, and using Big Data as a tool for enhanced network and application security. Big Data compounds routine security challenges in existing data networks, both in the context of specific individual projects and in the overall state of industry solutions. At the same time, Big Data introduces, for the first time, the opportunity to solve particular network and data security challenges that were previously unrealistic because of limits in computational power and correlation intelligence.
Access remains the most important consideration in securing any data and its connecting networks. High-performance firewalls and network security devices must be able to handle the increased throughput, connection counts and application traffic. Traditional secure access is often sufficient when analytics and data storage are co-located; where compute resources are remote from the data store, higher-performance security devices are required. Encryption at the scale required by Big Data remains challenging.
While secure access is critical, organisations should consider a number of components when securing huge volumes of data. Performance should be a prominent one: for Big Data to meet its expectations, security cannot be the bottleneck. Policy creation and enforcement are also critical, as many people in an organisation will need access to different aspects of the data: some may add to it, others may analyse it, and still others need to move the data and the resulting analysis.
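The policy model described here, where different users hold different rights over the same data set, can be sketched as a simple role-to-permission mapping. This is an illustrative sketch only; the role names and permissions below are assumptions for the example, not part of any Fortinet product.

```python
# Illustrative role-based access policy for a Big Data store.
# Roles and permission names are hypothetical examples.
ROLE_PERMISSIONS = {
    "ingester": {"append"},           # may add new records to the data set
    "analyst":  {"read", "analyse"},  # may query the data and run analytics
    "operator": {"read", "move"},     # may relocate data and analysis results
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("ingester", "append"))  # an ingester may add data
print(is_allowed("analyst", "move"))     # an analyst may not move it
```

A real enforcement point would sit in the access path (e.g. a gateway or the data platform's own authorisation layer) rather than in application code, but the mapping of roles to distinct capabilities is the same idea.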
Privacy of the data must also be addressed, and properly defined governance is essential to preventing incidents. While the data sets associated with Big Data are challenging to move, the valuable results and other analytic outputs are typically much smaller; data leak protection technologies should be employed to ensure this information is not leaked to unauthorised parties. Internal intrusion detection and data integrity checks must be used to detect advanced targeted attacks that have bypassed traditional protection mechanisms, e.g., anomaly detection in the collection and aggregation layers. Packet data, flow data, sessions and transactions should all be inspected.
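Anomaly detection in a collection layer can be as simple as flagging metrics that deviate sharply from their recent baseline. The sketch below uses a z-score threshold on a hypothetical ingestion-volume metric; production systems would use far richer models, and the sample values are invented for illustration.

```python
# Minimal anomaly-detection sketch for a collection-layer metric,
# e.g. records ingested per minute. Flags values whose z-score
# exceeds a threshold. Sample data is hypothetical.
from statistics import mean, stdev

def find_anomalies(samples, threshold=2.0):
    """Return samples more than `threshold` std deviations from the mean."""
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return []
    return [x for x in samples if abs(x - mu) / sigma > threshold]

volumes = [1020, 980, 1005, 995, 1010, 5400]  # final value is a sudden spike
print(find_anomalies(volumes))  # only the spike is flagged
```

The same pattern applies at the aggregation layer: establish a baseline for each monitored signal, then alert on statistically unusual deviations rather than on fixed signatures, which is what lets it catch attacks that bypass traditional protections.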