
PC/Unix Connectivity Through Simple File Transfer

From "Special Report",  Access to Wang, June 1996

There are many instances in business information services where information needs to be shared between larger applications on a host system and spreadsheet and database systems on desktop systems. In some of these situations, there must be immediate, simultaneous access to the same information by all users, which usually means careful selection of development tools, large setup and administration costs, and an increase in skilled labor (database administrators, network technicians, etc.). If you are operating a system that requires this immediate access to information, such a solution is justified.

But what if your needs are less time-critical? What if overnight updates or periodic restores might meet your needs? Do you still need to meet the requirements of an interactive client/server system just to get an easy way to use information from your host applications within a spreadsheet?

This question deserves some consideration because the costs and potential for failure increase geometrically with the complexity of the solution. And if you have a Unix or VAX host, it is often easy to set up batch file transfers that fill most of the needs of larger, more complicated systems.

This article will review cases for such "batch" methods of transferring files between host and desktop systems, contrast them with richer solutions, and describe some of the ways such transfers can be set up.

Batch transfer versus full-time access

Before considering a batch transfer approach to file sharing you should quantify the expected results and decide whether they are worth the time, trouble, and cost. There are no right answers here: you must come to your own conclusions about whether the solution meets the informational needs of your organization within a reasonable amount of cost and labor.

Many considerations should go into this analysis.

If you're not sure what your real requirements are (and who among us is sure of these needs!), consider a pilot program for batch transfers with an evaluation later. As you will see below, there are some simple ways to get involved in batch transfers with little cost or effort.

Simple dial-up file transfer

If you already connect to your host system over a dial-up line, you have several interesting choices for low- to medium-volume file transfers. These connections can support file transfer with little cost or administrative effort.

To perform batch file transfer you will need terminal emulation software with file transfer capabilities, matching protocol support at the host end (XMODEM, ZMODEM, etc.), and a line to connect the systems. The connection can be made using existing telephone lines and modems or by a number of other methods, including low-cost "modem eliminators" (a.k.a. "short-haul" modems) if the two systems are close enough to be connected with a telephone cable. File transfer protocol support can be provided at the host end through software supplied with the operating system (XMODEM, Kermit) or through inexpensive software products (ZMODEM).
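To make the protocol mechanics concrete, here is a rough sketch of how the original XMODEM variant frames data: each 128-byte block is wrapped with a start byte, a block number and its one's complement, and a simple additive checksum. (This illustrates the framing only; it is not a working transfer program.)

```python
def xmodem_packet(block_num: int, data: bytes) -> bytes:
    """Frame one XMODEM block (original 1-byte-checksum variant)."""
    SOH = 0x01                          # start-of-header byte
    data = data.ljust(128, b"\x1a")     # short blocks are padded with SUB (0x1A)
    checksum = sum(data) % 256          # additive checksum over the 128 data bytes
    header = bytes([SOH, block_num % 256, 255 - (block_num % 256)])
    return header + data + bytes([checksum])

pkt = xmodem_packet(1, b"hello")
print(len(pkt))  # 132 bytes: 3-byte header + 128 data bytes + 1 checksum byte
```

The receiver recomputes the checksum and acknowledges (or rejects) each block, which is why these protocols survive noisy telephone lines.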

The downsides of this approach include speed and convenience. Dial-up modems top out at 28,800 bits per second, so very large files will take a considerable amount of time. (If possible, use short-haul modems to boost speeds to 115,200 bps or better.) The transfers will probably have to be performed manually, since few transfer products have scripting capabilities robust enough for production settings. Some protocols offer more features; ZMODEM, for example, can be set to convert file names and formats to those preferred by the receiving system.
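The speed penalty is easy to quantify. An async serial link spends roughly ten line bits per data byte (eight data bits plus start/stop framing), so a back-of-the-envelope estimate looks like this (a sketch only; real throughput also depends on protocol overhead and line quality):

```python
def transfer_minutes(file_bytes: int, line_bps: int, bits_per_byte: int = 10) -> float:
    """Rough transfer-time estimate for an async serial line."""
    bytes_per_second = line_bps / bits_per_byte
    return file_bytes / bytes_per_second / 60

# A 10 MB file at 28,800 bps takes about an hour...
print(round(transfer_minutes(10 * 1024 * 1024, 28800)))   # ~61 minutes
# ...but only about 15 minutes over a 115,200 bps short-haul link.
print(round(transfer_minutes(10 * 1024 * 1024, 115200)))  # ~15 minutes
```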

Still, if you already have modems set up this would be a good way to get out of transferring information by diskette.

Proprietary file transfer within terminal emulation products

Some terminal emulation products go a step beyond support for standard dial-up file transfer, offering their own protocols and supporting tools. Like other proprietary solutions (the Wang VS, the Macintosh, etc.), this approach often means tighter integration but less choice. For example, high-end terminal emulation packages like Walker Richer & Quinn's Reflection series include a proprietary file transfer system that allows easy point-and-click access to a properly set-up host system. Such systems often work independently of the connection method, so the same solution serves whether the user is connected via dial-up lines (relatively slow) or through a local network connection (fast).

Setting up vendor protocols is similar to dial-up protocols: the client system must be matched with a related host file transfer agent and a connection made between these applications. Most products include the host transfer agents, but some require a separate purchase.

Built-in transfer protocols are often easier to use, reducing training and support requirements compared with standard protocols. There is often better support for scripting, too, making automated batch transfers more feasible.

Using FTP services within the organization

Many host systems already have support for FTP (File Transfer Protocol) services, one of the classic members of the TCP/IP application suite. FTP is most often associated with access to files through the Internet, but it can be applied with similar success inside an organization.

FTP uses a connection model similar to that of Telnet, another Internet protocol that shares much of its underlying systems software. The user establishes a connection to another system, selects files to send and receive, and closes the connection when the transfer is complete. Access can be anonymous (anyone can retrieve or, optionally, send files) or through standard authentication (user ID, password) and related access privileges. Classic FTP systems use a simple command syntax, but graphical "browser" interfaces are more typical on desktop systems.
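This open/transfer/close cycle is easy to drive from a script. As a hedged illustration, the sketch below uses Python's standard ftplib module; the host name, account, and file names would be whatever your own systems use:

```python
from ftplib import FTP

def fetch_file(host: str, user: str, password: str,
               remote_name: str, local_name: str) -> None:
    """One FTP cycle: connect, authenticate, retrieve in binary mode, close."""
    ftp = FTP(host)                     # establish the connection
    ftp.login(user, password)           # standard authentication (user ID, password)
    with open(local_name, "wb") as out:
        ftp.retrbinary("RETR " + remote_name, out.write)
    ftp.quit()                          # close the connection when done
```

Anonymous access works the same way: ftplib's login() defaults to the "anonymous" account when no user is given.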

To use FTP, its supporting services must be set up at the host and related software must be available to the client. Host services are usually available by default on Unix and VAX systems, but may have been disabled by system administrators for security reasons. Client systems must be connected via a network (typically TCP/IP) and must have supporting software. Many vendors provide FTP software, and good clients are also available for free. If you use one of the many "integrated" network products (Chameleon, Reflection, Trumpet, Windows 95, etc.) you may already have an FTP agent for your desktop system.

FTP transfers typically run over direct network connections, so transfer rates are much higher than over dial-up lines. Again, some products allow scripted sessions for transfer automation, but most do not. One related service - "trivial" FTP (TFTP) - gets around the limitations on batch transfer by allowing files to be moved between systems without authentication; this is a large security hole and should be avoided if possible.

Conclusion

The timeliness of information access can play a strategic role in the success of your organization, particularly when customer service is involved. At the same time, it is necessary to hold down the costs of providing access to this information. These goals can only be accomplished through careful analysis of the real information needs of the organization and application of appropriate information support resources.

In many cases, the real need is for timely views of information - not shared files. Application of the technology of the World Wide Web can serve these needs across a wide spectrum of platforms, connection approaches, and protocols. More on this in a later article.




Copyright © 1996 Dennis S. Barnes
Reprints of this article are permitted without notification if the source of the information is clearly identified.