Large file transfers are becoming routine, thanks to two seemingly irreversible trends: the rise of rich media content and the growth of virtual workforces, bolstered by many types of outsourcing relationships. As a result, companies are exchanging files more regularly than ever before, and everything from email attachments to e-business messages commonly contains a large payload of data.
Some organizations attempt to manage these exchanges using simple file transfer software or proprietary message bus technologies. Others depend on integration platforms built around legacy networking standards and outdated transport protocols. In both cases, system administrators must spend a great deal of time keeping information flowing in a secure, manageable way.
In this article I'd like to explain how to simplify the setup, processing, and overhead related to the transfer of large files. While business users simply think about moving information from point A to point B, IT pros have to make sure these exchanges are flexible, reliable, auditable, and secure.
The integration space can get a bit murky so let's start by defining some key terms. Managed file transfer (MFT) platforms coordinate the movement of information, data, and transactions from a source system to a destination system. Most of these integration scenarios are characterized by the following:
- Very large files
- A wide variety of message types
- Batched message packages
- Strict security and auditing requirements
- Traditional, legacy, and proprietary file transport protocols
IT shops have deployed many types of applications for moving files, sending messages, and routing transactions. These applications support communication protocols like MQSeries, Systems Network Architecture (SNA), AS1, AS2, AS3, Connect:Direct and, increasingly, File Transfer Protocol (FTP).
Some organizations have deployed one type of infrastructure for transferring files, and another for handling inter-application messages. While it's nice to have best-of-breed software platforms for every activity in the data center, there is a lot of extra overhead associated with maintaining each platform.
On the other hand, it may be difficult to "force fit" certain integration scenarios onto platforms that weren't designed for the purpose at hand. For example, you may be required to use a particular file transfer protocol when you communicate with a partner's enterprise service bus, yet your own ESB doesn't support that protocol or properly handle the related file management activities.
Sorting Through the Options
One reason these integration scenarios have become so complicated is the blurring of boundaries. Traditional file transfers are starting to look a lot like messaging, and messaging is taking on many of the attributes of file transfer. For example, some file transfer applications are handling traffic that has become operational in frequency and duration, characteristics associated with basic messaging. Meanwhile, message brokers are being used to control business processes, incorporate batch information, and carry out a lot of the work traditionally associated with file transfers.
Ideally you should be able to handle all types of file transfers and messaging scenarios with the same software platform, enabling you to use one set of skills, one set of monitoring practices, one set of security policies and consistent auditing constraints for all types of integration activities.
Let's start with the basics: FTP. This is by far the most common protocol. It's free, it's fast, and it's ubiquitous. However, FTP doesn't provide monitoring, security, process control, or support for multiple message formats. It's also quite rigid. Different organizations have different requirements, and there are many ways of packaging transport payloads and managing batches of business messages, important nuances that escape FTP's basic capabilities.
Once you send information across organizational boundaries, you commonly confront additional security and auditing requirements such as authentication, non-repudiation of message exchanges (common with many e-business systems), and counters for checking the integrity of batch- and file-based mechanisms. Here again, FTP isn't much help.
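To make the integrity gap concrete, here is a minimal sketch of the kind of post-transfer check an MFT layer adds on top of a raw transport. The function names are illustrative (not from any vendor's API), and a local file copy stands in for the actual FTP upload so the example is self-contained:

```python
import hashlib
import os
import shutil
import tempfile

def sha256_of(path, chunk_size=64 * 1024):
    """Stream the file through SHA-256 so large files never load fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def transfer_with_check(src, dst):
    """Simulated transfer (a local copy stands in for the FTP STOR/RETR pair)
    followed by the post-transfer integrity check that plain FTP omits."""
    expected = sha256_of(src)
    shutil.copyfile(src, dst)  # stand-in for the real upload
    actual = sha256_of(dst)
    if actual != expected:
        raise IOError("integrity check failed: %s != %s" % (actual, expected))
    return expected

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, "payload.bin")
        dst = os.path.join(d, "payload.copy")
        with open(src, "wb") as f:
            f.write(os.urandom(1024 * 1024))  # 1 MB test payload
        print(transfer_with_check(src, dst))
```

In a real deployment the checksum would travel alongside the file (or in an audit record), giving both parties a verifiable receipt that what arrived is what was sent.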
What if an FTP server had the capabilities of a full-featured integration server, such as built-in support for delivery confirmation, auditing, and direct process integration? What if it could be used as a message-based integration server irrespective of the partner's infrastructure? This would eliminate the need to maintain separate software applications for these purposes.
Today's software vendors are aware of these needs and are enhancing their integration servers to support multi-enterprise integration and managed file transfer. The best integration platforms, also called MFT suites, can augment FTP exchanges using a range of techniques and formats, including message-splitting parsers, multi-threaded message processing, aggregators, and non-XML message processing options. In addition to supporting a wide range of standard and proprietary protocols, these integration environments can extend existing protocols beyond their original intent (file transfer) to include secure, managed transport of messages.
When shopping for an integration server to extend FTP's basic capabilities, make sure it has the following:
- A complete set of adapters for communicating with virtually any type of system in a native protocol or API
- Support for every conceivable e-business message format, such as EDI, SWIFT, HIPAA and HL7
- Integration with a complete line of PKI-based security providers for both transport and message-level security
- A flexible processing pipeline to handle enveloping, splitting, and aggregating
- Open and flexible deployment options, from thin-footprint JVM processes to J2EE application servers
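The "flexible processing pipeline" item above can be sketched in a few lines: each stage (splitter, enveloper, aggregator) is a function, and the platform chains them over the payload. The stage names and the `HDR|`/`|TRL` envelope markers here are hypothetical, chosen only to illustrate the pattern:

```python
def split_batch(payload, delimiter=b"\n"):
    """Splitter stage: break one batched payload into individual messages."""
    return [m for m in payload.split(delimiter) if m]

def envelope(messages, header=b"HDR|", trailer=b"|TRL"):
    """Enveloping stage: wrap each message with routing/control metadata."""
    return [header + m + trailer for m in messages]

def aggregate(messages, delimiter=b"\n"):
    """Aggregator stage: recombine processed messages into one outbound batch."""
    return delimiter.join(messages)

def run_pipeline(payload, stages):
    """Apply each stage in order, feeding its output to the next."""
    for stage in stages:
        payload = stage(payload)
    return payload

if __name__ == "__main__":
    batch = b"msg1\nmsg2\nmsg3"
    out = run_pipeline(batch, [split_batch, envelope, aggregate])
    print(out)  # each message individually enveloped, then re-batched
```

Because the stages compose, the same pipeline machinery serves an EDI batch, a SWIFT message, or a plain file drop; only the stage list changes.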
In contrast to simple file-transfer solutions, this type of integration platform will minimize infrastructure requirements and provide tremendous visibility into file transfer processes.
Summing Up the Benefits
As organizations extend their supply chains, route orders, manage inventory, and carry out a host of other common business tasks, IT managers find themselves wrestling with many integration issues related to transferring files. To improve productivity and reduce development costs, they should strive to consolidate their infrastructure and adopt consistent operational procedures.
In the long run, IT managers will save time and money by acquiring a cohesive integration platform that automates file transfer activities as part of an enterprise process. These activities can be as simple as moving files securely in a controlled fashion between two sites or as complex as executing multi-step messaging scenarios, with complete auditing, notification, and security.
When properly deployed, these integration platforms enable you to consolidate and centralize all external communications. You can always integrate the right information, even when the requirements span multiple locations, multiple organizations and multiple protocols. A good integration platform also handles trading partner management and can work with portals and Web forms to simplify administration.
With a general-purpose integration platform such as this, you can even handle most of the issues related to transferring large files via your service bus. You can easily monitor, manage, and audit the transfer of large files, thanks to innovative technologies like splitting, streaming, and off-bus execution.
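The splitting technique mentioned above can be sketched simply: carve a large file into fixed-size parts so each piece can move (and be retried) independently, then reassemble on the far side. This is a minimal illustration, not any platform's actual implementation, and it streams in small chunks so the whole file is never held in memory:

```python
import os

def split_file(path, part_size, out_dir):
    """Split a large file into numbered parts for independent transfer."""
    parts = []
    with open(path, "rb") as f:
        i = 0
        while True:
            chunk = f.read(part_size)
            if not chunk:
                break
            part = os.path.join(out_dir, "part_%05d" % i)
            with open(part, "wb") as p:
                p.write(chunk)
            parts.append(part)
            i += 1
    return parts

def join_parts(parts, out_path):
    """Reassemble the parts in order, streaming 64 KB at a time."""
    with open(out_path, "wb") as out:
        for part in parts:
            with open(part, "rb") as p:
                for chunk in iter(lambda: p.read(64 * 1024), b""):
                    out.write(chunk)

if __name__ == "__main__":
    import tempfile
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, "big.bin")
        with open(src, "wb") as f:
            f.write(os.urandom(300_000))
        parts = split_file(src, 128 * 1024, d)
        rebuilt = os.path.join(d, "rebuilt.bin")
        join_parts(parts, rebuilt)
        print(len(parts), "parts; reassembled OK")
```

A real MFT suite layers checkpointing, resume, and per-part auditing on top of this basic split-and-reassemble idea.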
iWay Software provides rapid integration solutions that help companies large and small meet new business challenges without making their vital information investments obsolete. http://www.iwaysoftware.com