In this article, we are going to focus on the concept of information integrity, and more specifically, we will discuss the technologies around what is known as File Integrity Monitoring (FIM) and how a data-centric approach to protection can help us maintain it.
Table of Contents:
- What is File Integrity Monitoring or FIM?
- Why is a FIM system important?
- Information integrity: One of the objectives of an ISMS
- The CIA Triad
- Reactive vs. proactive security in the field of FIM systems
- Technologies or types of FIM systems
- 1. File comparison systems
- 2. Audit and Data Discovery Systems
- 3. File Integrity Monitoring with a data-centric security approach
- Data-centric security technologies that can help monitor file integrity
What is File Integrity Monitoring or FIM?
FIM refers to information security processes or controls that validate the integrity of operating system files, software application binaries, or other files, such as office documents, that may contain sensitive information.
The objective is to monitor changes to the organization’s files and information systems to determine whether they have been altered or modified. It is, in effect, a change control audit system that detects who modified a file, when, and how, and can generate alerts so the event can be analyzed further and a decision made on whether any remediation action should be taken.
Why is a FIM system important?
The main use cases where a file integrity monitoring system can be useful are:
- Detect possible attacks on our infrastructure and information: If certain operating system files, application binaries, web server configuration files, or confidential files stored in a repository are found to have been modified, we may be facing a possible attack or security breach in our organization.
- Detect internal threats to our data: If certain users attempt to access, or succeed in accessing, data they should not be able to reach, we may be facing a data leakage problem.
- Comply with data protection regulations: Many data protection regulations include among their compliance requirements the need to monitor the integrity of the organization’s files; PCI-DSS and GDPR are two examples.
Information integrity: One of the objectives of an ISMS
The requirements of an Information Security Management System (ISMS), such as the one proposed by the ISO 27001 standard, consist of a series of measures aimed at protecting information against the different threats that can affect the continuity of an organization’s activities. The objectives of an ISMS are based on preserving what is known as the CIA triad (Confidentiality, Integrity and Availability).
The CIA Triad
The CIA triad is composed of:
- Confidentiality: This refers to the need to ensure that sensitive information is safe from unauthorized access attempts. To this end, measures are established to differentiate the level of access that certain people can have to the information, depending on the type or level of confidentiality of the data. The Zero-Trust model, for example, is based on the rule of “least-privilege access”, giving people only the access permissions they need and no more. Regarding confidentiality, we usually talk about encryption, 2FA, access control lists, etc.
- Integrity: This involves maintaining the consistency, accuracy and reliability of the data during its life cycle, preventing unauthorized persons from modifying the data in transit or at rest. In the field of integrity control, concepts such as digital signatures, non-repudiation measures, cryptographic checksums, and the FIM systems we will discuss later usually appear (a minimal checksum sketch follows this list).
- Availability: This refers to the ability to keep information consistently and easily accessible to authorized persons. Authentication mechanisms, storage, infrastructure and all the supports related to the information must work properly to keep it available and safe from crashes, failures, etc. As far as availability is concerned, we usually talk about preventive measures such as redundancy, failover, system monitoring, backups and continuity plans.
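As an illustration of integrity control through cryptographic checksums, here is a minimal Python sketch, not tied to any particular product: it records a file’s SHA-256 digest at one point in time and recomputes it later to detect alterations. The file name is hypothetical.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage: store the digest when the document is approved...
approved_digest = sha256_of("contract.pdf")

# ...and recompute it later; any mismatch means the content was altered.
if sha256_of("contract.pdf") != approved_digest:
    print("Integrity violation: contract.pdf has been modified.")
```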
Reactive vs. proactive security in the field of FIM systems
Depending on the type of process or FIM technology implemented, it is possible to have reactive or proactive security in relation to data integrity:
- FIM as reactive security: Thanks to FIM, it is possible to see who has modified a system file, configuration file or binary after an attack or malware injection, and, for example, to revert a system to the situation before the attack.
- FIM as proactive security: If we can see which users are trying to access a suspicious number of files without permission, accessing information from suspicious locations, or receiving out-of-the-ordinary permission assignments, we can anticipate a possible data leak or security breach. Once the alert is raised, remediation actions can be taken to prevent a possible attack on our data.
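As a toy illustration of such a proactive rule, the sketch below counts denied access attempts per user in a batch of audit events and raises an alert above a threshold. The event format and threshold are assumptions for the example, not the output of any specific FIM product.

```python
from collections import Counter

# Hypothetical audit events: (user, action, outcome). In practice these
# would come from the monitoring agent's or repository's audit log.
events = [
    ("alice", "read", "allowed"),
    ("mallory", "read", "denied"),
    ("mallory", "copy", "denied"),
    ("mallory", "read", "denied"),
    ("bob", "edit", "allowed"),
]

THRESHOLD = 3  # denied attempts per user before alerting

denied_per_user = Counter(user for user, _, outcome in events if outcome == "denied")
for user, count in denied_per_user.items():
    if count >= THRESHOLD:
        print(f"ALERT: {user} had {count} denied accesses; possible data leak attempt.")
```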
Technologies or types of FIM systems
We will now look at possible techniques and tools that can be useful to an organization when it comes to maintaining data integrity.
1. File comparison systems
These systems check different aspects of a file to create a fingerprint of it. They initially establish a baseline of this fingerprint and compare subsequent changes against it (a simplified sketch of this baseline comparison follows the list below).
Some of these systems check the following aspects of a file:
- Creation, modification and access permissions.
- Security settings.
- Contents of the file.
- Attributes and size.
- Content-based file hash.
- Configuration data.
- Credentials.
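The following is a minimal Python sketch of this baseline-and-compare approach, assuming a local directory as the monitored scope. It captures a few of the aspects listed above (size, permissions, content hash) and reports additions, removals and modifications; it is an illustration, not how any of the vendors mentioned below implement it.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> dict:
    """Capture some of the aspects above: size, permissions, content hash."""
    info = path.stat()
    return {
        "size": info.st_size,
        "mode": oct(info.st_mode),  # permission bits
        "sha256": hashlib.sha256(path.read_bytes()).hexdigest(),
    }

def snapshot(root: str) -> dict:
    """Build a baseline: one fingerprint per file under the monitored root."""
    return {str(p): fingerprint(p) for p in Path(root).rglob("*") if p.is_file()}

def diff(baseline: dict, current: dict) -> list:
    """Report files added, removed, or whose fingerprint changed."""
    changes = []
    for path in baseline.keys() | current.keys():
        if path not in current:
            changes.append(("removed", path))
        elif path not in baseline:
            changes.append(("added", path))
        elif baseline[path] != current[path]:
            changes.append(("modified", path))
    return changes

# Hypothetical usage: take a baseline once, then compare on a schedule.
baseline = snapshot("./monitored")
print(diff(baseline, snapshot("./monitored")))  # empty list if nothing changed
```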
These systems take snapshots of the file regularly, randomly or based on certain rules or events configured by the security team. They are based on monitoring all components of the IT infrastructure such as:
- Servers and network devices.
- PCs and workstations.
- Databases, directories, middleware.
- Cloud Systems.
- Active Directory, etc.
They are usually part of a broader security audit strategy, which allows rollback mechanisms to be established on the data and, ideally, makes it possible to determine who modified any file in the organization, when, and how.
Some of these systems compare the files with a central repository (e.g. the original files or binaries of an application), while others track the incremental changes made by users; the latter is usually more practical, since applications tend to be customized and adapted to their users.
In this segment of applications, we can find manufacturers such as Tripwire, Trustwave or Qualys.
These types of applications are more focused on preventing external attacks on the infrastructure and can even detect change management events such as security patches. They are not optimized for monitoring the integrity of, or access to, confidential information, and can generate excessive noise, alerts or false positives when trying to detect attacks on data confidentiality, since they monitor any change to any file, whether it contains sensitive content or is a simple operating system binary.
They are infrastructure-intensive, installed on servers and computers to monitor all changes, and have an intra-perimeter focus: they cannot control activity on files once those files have left the organization. Nowadays, when the perimeter of organizations has expanded or blurred with remote work, the boom in collaboration, and the cloud, they can only manage the integrity of files that remain inside the organization.
2. Audit and Data Discovery Systems
These solutions are more oriented toward identifying, based on patterns, where sensitive information is located within the organization, monitoring user permissions on file servers, and auditing user access to files.
Going into more detail:
- They identify, within a file server or other repositories, which files contain, for example, personal or financial data. In this sense, they perform an analysis or discovery of files similar to a DLP (Data Leak Prevention) system (a simplified discovery sketch follows this list).
- They verify what permissions users have on files stored in these repositories. These are usually NTFS permissions or similar access control lists (ACLs). Some of these solutions have the ability to directly modify permissions (access the file or folder, modify it, delete it, copy it, move it, etc.) from the tool without having to manage it in AD or the file server.
- They audit access, copy, deletion and other operations on files in these repositories.
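To make the pattern-based discovery idea concrete, here is a deliberately simplified Python sketch that flags text files matching PII-like patterns. The regular expressions and the scanned path are illustrative assumptions; real discovery engines use far more robust detectors (dictionaries, validation checksums, proximity rules, etc.).

```python
import re
from pathlib import Path

# Simplified, illustrative patterns; real products detect many more types.
PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "card-like number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def discover(root: str) -> None:
    """Flag files under `root` whose content matches a sensitive pattern."""
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for label, pattern in PATTERNS.items():
            if pattern.search(text):
                print(f"{path}: contains {label}; candidate for auditing/protection")

discover("./shared")  # hypothetical file-share mount point
```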
In this type of solution we can find manufacturers such as Varonis or Netwrix, for example.
Unlike the above, they typically focus on information regulated by GDPR, PCI, or other regulations and not on operating system binaries or applications.
Like the others, they depend on software installed at the file server and agent level for monitoring and scanning. Although they reduce the “noise” or volume of possible alerts, since they are focused on regulated information, tracing every access to practically every file still generates too much data to analyze; and although the reporting systems can be tuned, they add an overhead to the daily tasks of network administrators.
Both types of solutions focus not on the files themselves but on the repository that contains them (folders on file servers or similar), and all of this is restricted to the perimeter of the corporate network. If a file has been sent to a collaborator, they can do nothing.
3. File Integrity Monitoring with a data-centric security approach
As mentioned, the above solutions are restricted to the perimeter of the network, but what happens if those files have been shared outside it? Consider teleworking users who access these files on their mobile devices, or information that has been uploaded to cloud storage beyond the reach of the previous systems.
For a collaborative, remote work environment such as the one organizations operate in today, file integrity monitoring must be possible wherever the file travels and, in this sense, be linked to the other two facets of the CIA triad: Confidentiality and Availability.
Wherever the data travels, we must be able to guarantee that it is safe from uncontrolled modifications. Only authorized people should be able to alter the content of the documentation, following one of the maxims of the Zero-Trust approach: grant only the minimum privileges required on the information. If someone should have read-only access, they should not be able to modify the document, print it to PDF and alter it later, or extract its content.
It is not uncommon for companies, after a security breach in the network, to discover that, for example, a PDF invoice containing an account number has been altered and that the finance department has made the transfer to the attacker’s account.
Data-centric security technologies that can help monitor file integrity
Data-centric security technologies can help us monitor file integrity in the following ways:
- As it is the files themselves that are protected wherever they travel, we can monitor access and operations performed on them (e.g. blocked access attempts) regardless of where they are. It doesn’t matter if they are on a file server, in a cloud application or on the PC of a user working remotely.
- The control of access permissions over the content is granular. You can define who has access, with what permissions (view only, edit, print, copy and paste, export information, etc.), from where (which locations or IPs), and until when (expiration dates can be set). An illustrative sketch of such a policy follows this list.
- Remediation actions can be taken based on risk and regardless of the location of the file: access to a shared file can be revoked even if it has been shared with a third party who already has it on their computer.
- Monitoring, permissions control and file integrity are not restricted to the perimeter. The focus is not on who copies or moves a file but on who accesses the content, from which locations, and with which permissions (view, edit, etc.), since information nowadays moves and travels quickly anywhere. In addition, monitoring is focused on the really important information, avoiding the levels of “noise” or false positives of the previous solutions and reducing the workload of security and IT managers.
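As a purely illustrative sketch, and not SealPath’s actual API, the kind of granular, data-centric policy described above (who, which permissions, from where, until when) could be modeled like this:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ProtectionPolicy:
    """Illustrative IRM/E-DRM-style policy that travels with the file."""
    users: list                      # who may open the document
    permissions: set                 # e.g. {"view"} or {"view", "edit", "print"}
    allowed_networks: list = field(default_factory=list)  # from where
    expires: Optional[datetime] = None                    # until when

# Hypothetical read-only policy for an invoice shared with finance:
invoice_policy = ProtectionPolicy(
    users=["finance@example.com"],
    permissions={"view"},             # no edit, copy/paste or export
    allowed_networks=["10.0.0.0/8"],  # corporate network only
    expires=datetime(2026, 1, 1),
)
print(invoice_policy)
```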
SealPath, a Data-Centric Solution for Monitoring the Integrity of Your Files
Thanks to data-centric security technologies such as IRM (Information Rights Management) or E-DRM (Enterprise Digital Rights Management), not only is integrity control covered, but also confidentiality, since the information is encrypted and access control is applied, and availability, since offline or disconnected access to the information can be enabled so as not to depend on online access to clouds, repositories, etc.
If you want to know how a data-centric security solution such as SealPath can help you monitor the integrity of your files while maintaining complete control of confidentiality, contact us and we will show you a personalized demo.
In addition to having remote control of file integrity and confidentiality, SealPath integrates with multiple auditing, classification and data discovery solutions or DLP solutions to automate the protection of especially sensitive information on file servers, PCs or cloud solutions such as Office 365, Box, or Google Workspace.