How to Address the Challenges of Data Security in Data Integration

As the amount of data organizations store grows, so does the need for data integration. Integration projects that span a variety of data sources create data security challenges, and the security measures already in place are often insufficient to make integration processes adequately secure. Organizations need to re-examine their approach to data integration to ensure security across all integrations, which may mean changing existing practices as well as implementing a more holistic data integration solution.

Why is Data Security Important?

News stories in recent years about data security lapses have made most people aware of the damage breaches can do to a company's reputation, not to mention the impact on individual privacy. Regulatory changes around data security and privacy now add a major financial impact as well. Companies found to have broken GDPR rules in Europe have faced substantial fines, and other jurisdictions are implementing similar laws: the Data Protection Act in the UK, the CCPA in California, and the CDPA in Virginia.

In the United States, the Gramm-Leach-Bliley Act governs data security in financial institutions, including insurers, and HIPAA protects patient health information. Both impose onerous financial penalties on companies that violate them.

Organizations therefore need to establish a set of procedures and tools to protect digital data from unauthorized access and misuse throughout the data life cycle.

Integration Security Problems

When it comes to data integration, businesses should recognize where integration security problems commonly occur.

1. The data extraction phase  

2. The transform phase of the ETL process

3. The data storage phase

4. The manual coding phase (People involved in the project)

Extracting Data

Most organizations have several disparate database systems. As each one has separate security access controls, security around extracting data is different for each one.

In financial services, some of the information in these databases is tokenized (actual values are replaced with tokens) so that private information is protected. But in order to provide that information to a third party, the tokens must be converted back to the real values, exposing the data during integration. It is also not uncommon for confidential data to be exported to a plain-text format such as CSV, which is completely open, so that it can be integrated with other applications.
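To make the tokenization trade-off concrete, here is a minimal sketch (the vault, token format, and sample value are hypothetical, not any vendor's implementation). The point is that de-tokenization is exactly the step where the sensitive value reappears:

```python
import secrets

class TokenVault:
    """Hypothetical in-memory vault mapping sensitive values to random tokens."""
    def __init__(self):
        self._vault = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)   # token carries no trace of the value
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Restores the original value -- the step that creates risk when
        # data must be handed to a third party during integration.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token.startswith("tok_") and "4111" not in token
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

A real tokenization service would persist the vault securely and restrict de-tokenization far more tightly than this sketch does.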

Transform (ETL)

Data in a data store behind a firewall tends to be considered secure. But when that data needs to be integrated with another data store or application outside the firewall, there is a danger that it could be compromised in transport from source to target.

There are a number of legacy ETL tools that transport and transform data but do not have data governance functionality for the data to be secure. When integrating different databases, part of the transform stage is to determine if data is corrupted or duplicated. This is done in an unsecured data store that is open to a data breach.

Development teams are often tasked to build a complex ETL process from multiple databases to a centralized target. It's only at the end of the project that data security is considered, and by then the overhead costs to add security become an issue.  


Data Storage

Data storage security involves protecting storage resources and the data stored on them – both on-premises and in external data centers and the cloud – from accidental or deliberate damage or destruction and from unauthorized users and uses. While some integration platforms include automatic encryption, many platforms on the market do not, which means organizations must install separate software or an encryption appliance to make sure their data is encrypted.
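As a sketch of what encryption at rest involves at the record level (this illustrates the general technique, not any particular platform's internals), the widely used third-party `cryptography` package provides the Fernet construction:

```python
# Encrypting a record before it is written to storage, using the third-party
# "cryptography" package (pip install cryptography). Illustrative only.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, managed by a KMS, never hard-coded
f = Fernet(key)

record = b'{"policy_id": 42, "ssn": "000-00-0000"}'
ciphertext = f.encrypt(record)   # what actually lands on disk
assert record not in ciphertext  # plaintext is unrecoverable without the key
assert f.decrypt(ciphertext) == record
```

The operational burden the article describes is visible even here: someone has to generate, store, rotate, and distribute that key, which is exactly what a separate encryption product or a platform's built-in encryption takes off the integration team's plate.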

A growing number of enterprises are choosing to store some or all of their data in the cloud. Although some argue that cloud storage is more secure than on-premises storage, the cloud adds complexity to storage environments and often requires IT and development teams to learn new tools and implement new procedures in order to ensure that data is adequately secured.

Manual Coding (People Involved in the Project)

Many companies still turn to custom-built integration solutions. These rarely have the level of data security required. It is difficult to secure data across multiple data silos at every layer. Also, custom integrations result in a spaghetti architecture that leads to more vulnerabilities.

In IT departments where there are dedicated security professionals, there is often a disconnect between developers building integrations and the security teams. Usually, the responsibility is given to developers to do their best to secure data.  

An additional complication is that modern enterprises have various people involved in an integration project; not only staff but also contractors and consultants. This adds another layer of complexity as to who has access to important data.

A newer source of security issues is low-code development. Low-code enables users without a traditional development background to participate in building integrations, which can present a concern for data security.

How Synatic Helps

Data Extraction

Synatic's Relays are on-premises software agents that extract data from a data source behind a firewall and move it to cloud infrastructure, encrypting the data in the process to ensure security.


Data Transformation and Transport

Data extracted from a source database using Synatic is stored in a secure MongoDB database. Because it is a NoSQL database, data can be loaded 'as is' without doing any transformations first.

Any data profiling and transformation within the secure NoSQL database is done under the security rules of the Synatic platform. This means that a data policy applied at a platform or application level is automatically applied to all the integration workflows within it.

For securing data during transport or transformation, Synatic’s Hybrid Integration Platform (HIP) allows different levels of access security, ranging from basic user-level access to more complex security options like JWT (JSON Web Token) and OAuth2 for securely transmitting information.
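A JWT is, at its core, a signed JSON payload. The following stdlib-only sketch shows HS256 signing and verification to make the mechanism concrete; the claim names are illustrative, and production systems should use a maintained library such as PyJWT rather than hand-rolled code:

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Build header.payload.signature with an HMAC-SHA256 signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = header + b"." + body
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

def verify_jwt(token: str, secret: bytes) -> dict:
    """Reject any token whose signature does not match, then return the claims."""
    signing_input, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(secret, signing_input.encode(),
                               hashlib.sha256).digest()).decode()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    body = signing_input.split(".")[1]
    return json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))

secret = b"shared-secret"
tok = sign_jwt({"sub": "integration-user", "scope": "read"}, secret)
assert verify_jwt(tok, secret)["sub"] == "integration-user"
```

Because the receiver recomputes the signature over the exact bytes it was given, any tampering with the payload in transit invalidates the token.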

Synatic, as well as being a multi-tenanted cloud solution, can run in a customer’s cloud on Amazon's AWS, Google’s GCP, or Microsoft’s Azure, all of which provide excellent levels of security. For businesses that prefer keeping their data on-premises, Synatic can also run in a private cloud environment or natively on an in-house server.

Data Storage

Synatic allows businesses to create a data warehouse that serves as a central repository of information, right inside the Synatic platform. All data within Synatic is encrypted at rest. This means that data in Synatic can be used without being moved into an external repository thereby limiting risk and minimizing the likelihood of data breaches. In addition to Synatic’s in-storage capabilities, the HIP has Buffers that allow businesses to store data while moving from one database to another. This improves data security by eliminating the need for the same data to be accessed and loaded repetitively.


Access Control

Synatic’s HIP provides a comprehensive access control mechanism to prevent unauthorized access to data and application functionality. It uses the OAuth authorization protocol to create a 3-tier access portal to the Synatic system at the user, organization, and implementation levels, each with its own level of security.

The Future of Data Integration Security

Data volumes will continue to grow, along with the variety of database types – including NoSQL and graph databases – each created by specific applications. The need to integrate different data sources to provide useful information for decision-making and reporting will therefore grow too. To keep data integrations secure, it is no longer adequate to rely on developers’ skills (and honesty) when building custom integrations. Instead, businesses will need a Nimble, Simple, and Powerful HIP to manage and automate data integration, using established security protocols to safely and securely extract, transport, and transform data. To learn more about how you can ensure secure access to your organization’s data, contact Synatic today.

Jamie Peers
July 20, 2021