Understanding Managed File Transfers

Considerations and Challenges

In many organizations, a large part of the exchange and distribution of information happens through the copying and transfer of data files. As the number of files keeps growing, the result is serious management and security issues. Because these issues are predominantly operational, awareness at the management level is typically low.

Consider the following use cases for Managed File Transfer.

For years, corporations have been moving from bespoke to standard software: traditional locally installed packages or cloud-delivered SaaS packages. Standard software generally does not allow direct access to data but instead provides an application-layer interface. For bulk transfers, service/message-style interfaces perform poorly, so file-based interfaces are common. Files often contain the right information in the wrong format. Rather than scripting file changes and building an unmanageable library of scripts, there is a need for a Managed File Transfer (MFT) product.
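To make the "right information, wrong format" problem concrete, here is a toy Python sketch of the kind of one-off conversion script that accumulates; the file names are hypothetical. Every source/target pair tends to get its own variant of such a script, which is exactly how the unmanageable library grows.

    import csv
    import json

    # Toy illustration: reshape a CSV extract into JSON lines before
    # loading it elsewhere. File names are hypothetical; in practice
    # every source/target pair ends up with its own variant.
    with open("orders.csv", newline="") as src, open("orders.jsonl", "w") as dst:
        for row in csv.DictReader(src):
            dst.write(json.dumps(row) + "\n")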

Recent trends lead to more data being stored in files:

  • Hadoop’s HDFS provides scalable, virtually unlimited capacity to store any kind of file. Partly due to HDFS’s limited support for updating files in place, history is often stored in completely separate files.
  • More and more applications generate or use rich content such as media (photos and videos), which is also commonly stored in files. Data lakes, on premises (predominantly in Hadoop) and in the cloud on storage platforms like AWS S3 and Azure Data Lake Store (ADLS), provide a scalable platform to store data in files (see the sketch after this list).
  • Corporations often use workplace software like SharePoint to formalize desktop processes and information. Linking these processes to transactional systems requires integrating them with workplace and office files, and with that comes the need for MFT. A SharePoint front end on a transactional system also eliminates all kinds of unwanted direct access to business applications.
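As a small illustration of landing files in a cloud data lake, here is a minimal sketch using the boto3 AWS SDK for Python. The bucket name, object key, and file name are hypothetical; an equivalent pattern exists for ADLS via the Azure SDK.

    import boto3  # AWS SDK for Python; assumes credentials are configured locally

    # Minimal sketch: land a daily extract in an S3-backed data lake.
    # The bucket name, key layout, and file name are all illustrative.
    s3 = boto3.client("s3")
    s3.upload_file(
        "orders_2019-06-01.csv",      # local extract to upload
        "corp-data-lake",             # target bucket (hypothetical)
        "raw/orders/2019-06-01.csv",  # object key inside the lake
    )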

Managed File Transfer Challenges

SCP, FTP, and SFTP are commonly used file transfer protocols, but they offer no guarantees of delivery. To ensure delivery, additional scripting is needed. As a result, every file transfer needs its own script, the number of scripts explodes, and the environment becomes unmanageable. Creating a script is easy; maintaining and operating it is not.

To address these issues, organizations sometimes deploy a “file exchange platform” to manage the scripts. Due to the nature of scripting, this platform (often not much more than a server with storage and lots of scripts) soon becomes a management nightmare itself. Most of the scripting is undocumented, it is often unknown what exactly is happening, and no real management is possible. Whenever there is an interruption, fixing scripts involves reverse engineering, long downtimes, and even “hacking”. When file transfers to customers or partners make external transfers necessary, the file exchange platform is often duplicated to the DMZ. Now there is not only a management problem, but also a security risk. The end result is a combination of high costs and high risks.
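To illustrate why “ensuring delivery” means scripting, here is a minimal Python sketch of the wrapper logic that typically grows around SCP: hash the file, copy it, verify the checksum on the remote side, and retry with backoff. The host, paths, and the assumption that sha256sum is available on the remote machine are all illustrative. Multiply this by every transfer, then add scheduling, alerting, and restart handling, and the script library quickly becomes unmanageable.

    import hashlib
    import subprocess
    import sys
    import time

    def sha256(path: str) -> str:
        """Hash the local file so delivery can be verified end to end."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def push_with_retry(local: str, host: str, remote: str, attempts: int = 3) -> None:
        """Copy a file with scp, confirm the remote checksum, retry on failure."""
        expected = sha256(local)
        for attempt in range(1, attempts + 1):
            # scp's exit code is the only guarantee the protocol gives us.
            if subprocess.run(["scp", local, f"{host}:{remote}"]).returncode == 0:
                # Re-hash on the remote side and compare
                # (assumes sha256sum exists on the remote machine).
                check = subprocess.run(
                    ["ssh", host, "sha256sum", remote],
                    capture_output=True, text=True,
                )
                if check.returncode == 0 and check.stdout.split()[0] == expected:
                    return  # delivery confirmed
            time.sleep(2 ** attempt)  # back off before the next attempt
        sys.exit(f"delivery of {local} to {host}:{remote} could not be confirmed")

    # Host and paths are hypothetical.
    push_with_retry("orders.jsonl", "partner.example.com", "/incoming/orders.jsonl")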

HVR provides Managed File Transfer functionality. This can be used as a tool in its own right, but also as part of the integrated HVR suite for enterprise data integration. Complex file transfer chains can be configured, scheduled and controlled from a central point on an enterprise-wide level.

HVR supports three types of managed file transfer: file-to-file, database-to-file, and file-to-database. Look for my next blog post, which will go into detail on each of these transfer types.


About Mark

Mark Van de Wiel is the CTO for HVR. He has a strong background in data replication as well as real-time Business Intelligence and analytics.
