Building a Portable Programming Environment

Ian describes the working PC development environment he's built using the MKS Toolkit, a set of UNIX-like utilities for MS-DOS. It's his contention that the environment makes him much more productive when using a DOS-based PC to write programs for platforms ranging from micros to mainframes.


May 01, 1993

Ian is a developer for the Canadian Technology Marketing Group and can be contacted at 11 Holland Ave., Suite 700, Ottawa, Canada K1Y 4S1.


My company does system development for a number of different operating systems, including MS-DOS (local and network), UNIX, VAX/VMS, and MVS/TSO (IBM). We do most of the programming on the clients' computers, and many of them have only the minimum necessary programming tools (editors, compilers, linkers, and so on). Even if more tools were available, however, differences between sites would prevent us from acquiring the familiarity necessary for greatest productivity.

In general, we use DOS-based PCs as terminals to minicomputers and mainframes, doing much of the work on the PCs. This is more productive because coding the programs for large systems is really a text-processing operation, and DOS PCs allow us to use the same set of effective and familiar tools to handle text for a variety of operating systems. We don't use this approach with UNIX, because UNIX already has a good set of tools, and the differences between UNIX systems are not great.

About a year-and-a-half ago I began using the MKS Toolkit, a set of UNIX-like utilities for MS-DOS. The MKS Toolkit makes a DOS PC much more productive when writing programs for MS-DOS computers, minicomputers, and mainframes. The Toolkit also makes the DOS working environment much more like UNIX, so that it's easier to move between the DOS development environment and a UNIX system.

There are good reasons for setting up a UNIX-like programming environment on DOS PCs. First, the UNIX environment, with a shell and utilities, makes it easy to build useful, special-purpose utilities with very little effort, because much of the work is already done. Second, many of the UNIX utilities (awk, grep, sed, and so on) are designed to handle problems that arise frequently when working with text files. Third, MS-DOS PCs are available (or can be brought in) everywhere. Finally, it's easy to connect an MS-DOS PC to practically any other system using common terminal programs.

In this article, I'll describe the working environment I have set up on the PC, then provide examples from two different projects that illustrate how the environment reduces software-development costs. A UNIX system programmer would not consider either of these examples to be in any way remarkable, but the two projects represent cost savings not much less than my annual salary and are obtained using UNIX tools in a non-UNIX environment.

The Environment

An important feature of the environment is online reference manuals. Wherever possible, I select programs that have both online and hardcopy reference manuals. While it's easier to learn a program from a hardcopy manual, it's faster to use the online manual once you've mastered the software. Also, as I move from site to site, I prefer to carry everything on a few floppy disks, leaving the manuals at home.

The foundation of the working environment is the MKS Toolkit: a collection of about 150 UNIX utilities and commands, including a Korn shell (which combines features of the C and Bourne shells). You can run these utilities from the DOS command line like any other DOS program, or you can run the Korn shell and work from a UNIX-like command line. Under the second option, you can still run all of your MS-DOS programs from the command line.
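As a small illustration (my own, not taken from any particular project; put_record is just an invented name), one-liners like these behave the same way under the MKS Korn shell on DOS as they do on a UNIX system:

grep -l "put_record" *.c | sort     # which source files call put_record?
wc -l *.c | sort -n                 # and how large is each file?

The shell expands the wildcards and sets up the pipeline, just as it would on UNIX.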

The MKS Toolkit has an excellent online reference manual that uses man pages similar to those on UNIX systems, except that the man pages are better written and easier to understand. Also, it is easier to browse back and forth in a man page than it is on some UNIX systems. The MKS online man pages are identical to the hardcopy man pages in the 500-page manual. There's also a good tutorial (hardcopy only).

The MKS Toolkit is designed to follow POSIX. Most of the man pages have a Portability section that tells you how closely a utility follows the specifications in POSIX.1, X/Open, BSD UNIX, and System V UNIX. The differences between the MKS Toolkit and UNIX systems are no greater than the differences among several UNIX systems--SCO UNIX, DYNIX (BSD UNIX), SunOS, and Mach.

The biggest difference between UNIX and the MKS Toolkit imitation of UNIX is the absence of multitasking. In MKS, you can't run a process in the background, and you can't run a script that requires simultaneous processes. Except for this, scripts that run in the MKS Toolkit can be run in UNIX with minor changes, and vice versa. MS-DOS running the MKS Korn shell is similar to SCO UNIX running the Korn shell.

Although the MKS Toolkit gives you some useful and powerful tools for manipulating files, it's no substitute for a good file manager. Consequently, my file manager is XTree Gold 2.5, which is also available for some UNIX systems. If you prefer another file manager--and can start it from the DOS command line--it should work with the MKS Toolkit.

I use Norton Utilities partly for some of the utilities, and partly for the NDOS command processor. You still need a DOS command processor with the MKS Toolkit: the MKS Korn shell executes DOS batch files by passing them to a DOS command processor, and programs like XTree and WordPerfect that let you execute a DOS command must also run that command through one. COMMAND.COM has given me trouble (with pipes and redirection) when the default drive is a network drive; switching to NDOS resolves these troubles. The NDOS shell accepts any commands or batch files that work with COMMAND.COM, and has a good online manual covering both MS-DOS and NDOS. Furthermore, using NDOS as the MS-DOS command shell reduces the differences between MS-DOS versions 3, 4, and 5.

When Kermit is installed on the target system, I use it as my terminal emulator and to download or upload text files. Otherwise, I use whatever terminal emulator my client normally uses. If the client had no emulator package at all, I'd use Kermit on the PC and rely on its ASCII text-transfer functions to upload and download files.

Kermit comes with ASCII documentation files that can be converted into man pages for use with the MKS Toolkit, although the files are so long that you need to put an index at the top of each. That takes less than five minutes with the MKS Toolkit's grep and vi. Kermit also has a good programming language for creating scripts; I've used Kermit scripts to execute sequences of commands on a VAX/VMS system.
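The five-minute job is nothing more elaborate than the following sketch (kermit.doc is a stand-in for the real documentation file, and the pattern assumes the section headings begin with a section number):

grep -n "^[0-9][0-9]*\." kermit.doc >headings
vi kermit.doc

followed by :0r headings inside vi to read the numbered heading list into the top of the file; each index line then shows the line number at which that section starts.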

Example Project #1

In the first project I'll describe here, I had to produce 89 reports from an SAS database on an IBM 3090, operating under TSO. The database contained 8000 variables describing various aspects of the performance of several mainframes with hundreds of peripheral devices. I had no previous knowledge of the database, but the online documentation for the database was excellent. I had 50 days to learn the database, identify the required data, and produce the reports and documentation. The completed set of reports contained 2,947 SAS statements in 11,000 lines of code and comments.

Although the online documentation was good, it was cumbersome and therefore seldom used. A hardcopy version would have been no more manageable, both because of its bulk (10,000-plus pages) and because of confusing cross-references (a variable description would say "see also" for several related variables and other information). While the mainframe menu system was set up to let you easily find and view individual descriptions, it still took several minutes to track down the cross-references, and by the time you had finished, you'd forgotten much of what you had learned. I solved this by building a simple hypertext documentation system on the PC to use for the actual programming work, with MKS vi as the display program, used in the same way as I would use UNIX vi.

I first identified the appropriate database files, then downloaded the description of each file from the mainframe to the PC. At the end of each file description was a list of variables, each line giving a code that described the time frames covered by the variable, the variable name, and a brief one-line summary description. I used each file description as input to a short script (see Listing One, page 106) that extracted the variable list from it. The variable list itself became one of the parts of the hypertext system, but was also used as input to the next stage of construction.

The second stage was to download the variable descriptions. The terminal emulator would only download one description at a time, and there were 2000 descriptions to download. Fortunately, the download was implemented as a DOS command that could run inside an MKS Korn shell script (see Listing Two, page 106). Line 23 of the script is two commands--the output of one command becomes part of the command line for the other. The awk command extracts the first field of every line in the file named by $3, the third parameter on the command line that invoked the script. Enclosing awk in backquotes makes the resulting list part of the for command, which executes the loop once for every name in the list, letting $i stand for that name on each iteration. Line 27 does the work; $1/$i is the DOS directory and filename, and '"$2"($i)' is the TSO PDS name and member name, in the form PDSname(member). I made the mistake of downloading everything into one DOS directory. DOS file access is very slow when a directory has hundreds of files, so I used XTree Gold to relocate the files, in groups of 125, to different directories.

The third stage was to make the index file that the hypertext system uses to find files. MKS vi, like UNIX vi, has a tags facility--if you put the cursor on a word that is the first string in a line of a sorted tags file, you can press Ctrl-] and vi will display the file named in the second string of the same line. The downloaded descriptions were in files that had the same name as the variable being described. The script in Listing Three (page 106) builds the index. Line 7 scans all files and subdirectories of c:/mic/doc to produce a list of the files only. This list is piped into the awk program (lines 9 to 15), which extracts the base filename from each complete pathname and produces a line containing that name (the tag), the complete pathname, and a cursor address (actually, a line number) in the file. The new list of lines is piped into sort so that the list will be in the order the tags facility requires.
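A couple of entries from the resulting tags file would look something like this (the fields are separated by tabs; HARDVA is a real file from the database, but the second name and both paths are only illustrative):

HARDVA	c:/mic/doc/1/cat1/hardva.1	1
HARDWR	c:/mic/doc/1/cat1/hardwr.1	1

Each line gives the tag, the file to display, and the line number to position the cursor on.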

Listing Four (page 106) shows the script that actually runs the system. Line 13 executes when the script has no arguments, to bring up a master list of database files. Putting the cursor on any filename and pressing Ctrl-] brings up the list of variables for that file. Line 15 executes when the script is given a database filename as an argument and goes directly to the list of variables for that file. The set ignorecase|1 statement was an attempt to make the system independent of case. It didn't work because the tags facility is case sensitive even when case sensitivity has been turned off for editing.

The final step was to put the line set tags=c:/mic/tags in the initialization file (ex.rc) for vi. This integrated the hypertext system with the editor, allowing me to see documentation while editing, and allowing me to cut and paste from the database documentation to comments in the SAS programs.

The time needed to download the descriptions and build a hypertext system out of 3 Mbytes of text in 2000 files was one day. The system probably took 10 to 20 days off the time I would have needed to complete the project, just by making it easier for me to use the database documentation.

Example Project #2

In another project, we had to produce, on a VAX/VMS system, the programs for about 250 reports as one part of a database. The number of different report formats wasn't great; as many as three dozen reports shared the same layout and differed only in details such as data and headings. Each program required 700 to 2000 lines of code, for a total of about 250,000 lines. The problem was that we had originally estimated a much smaller amount of code and were committed to delivery in three months; with one person on the job, it would have taken a year. I was sent in to assist and discovered that we could cut more than 100 working days off the job, making it possible for two people to deliver the code within three months.

The database language posed some problems in this particular application. A considerable amount of manual calculation was needed for things like the position of data or a heading on a report line, and the code was very repetitive. The language's own macro facility would not eliminate all of this repetition, and could not be used at all if we wanted to compile the code; running the code under an interpreter instead of a compiler would give performance below our contract specification. The language's limitations on the use of procedures added still more repetition.

The cure for these problems was to automate much of the repetitive work and to find a way to write things in detail only once. For example, if the same complex data-set specification appeared at several points in a program, we could name it and then use the name everywhere.

M4 (found in both UNIX and the MKS Toolkit) is an ideal tool for this job. You can use M4 like a C preprocessor or an assembly language macro processor, but you can apply M4 to any programming language. For such a powerful tool, M4 is simple (the documentation is only eight pages long).
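As a minimal sketch of the idea (mine, not code from the project, although the field names are borrowed from Listing Seven), you name a messy condition once and then use the name wherever the condition is needed:

dnl -- illustration only: name a condition once, use it wherever it is needed
define(`_curryear_', `SrvyCensYr = fa2000.ScYrFr and CurrentFlag = "Y"')dnl
find all Answers where QCellRow = 130 and _curryear_()
find all Answers where QCellRow = 140 and _curryear_()

M4 copies its input to its output, replacing each occurrence of _curryear_() with the full condition; define itself expands to nothing, and the trailing dnl removes the newline that would otherwise be left behind.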

We started by getting specifications for all of the reports before doing any programming. We then classified the reports into groups of similar reports, looking for common features. These common features formed building blocks for programs--a general set of M4 macro building blocks for all programs and a separate set of macros for each group of similar programs. We wrote a generating file for each program, which built that program out of the general macro set and a set of group macros.
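Generating any one program is then a one-line job under the shell--something along these lines, where mlnonmet.m4 is the group macro file from Listing Five and the other filenames are invented for the example:

m4 general.m4 mlnonmet.m4 p2029.m4 >p2029.src   # general macros, group macros, generating file

M4 reads its input files in order, so the definitions in the first two files are available when the generating file for the program is expanded.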

The group macros plus one program's generating file would typically contain more lines of code than the generated program, but would take less time to develop. The development time was shorter because the redundancy was eliminated: the program was ready for test sooner (less work) and was less likely to have bugs. (There are no discrepancies when things are defined only once.) The greater number of lines was mainly because M4 is such an ugly language; it takes many comments to make an M4 program understandable (even to yourself!).

After the first program macro was done, the other programs in the group went very quickly; nearly all of the analysis and testing had been done by the time the group macros and the first program macro were completed. Because the macros generate much more code than you actually write, two people could average about 5000 lines of finished code per day.

Listing Five (page 106) shows two examples of the macro building blocks for one group of programs. Listings Six and Seven (page 107) show examples of the code generated by two one-line calls to macros in Listing Five.

The first example is the use of a macro to repeat the same code fragment in different places and to let us specify column positions by relative column number instead of by absolute character position on the line. Lines 147 to 167 of Listing Five define a macro, _colheads_, that is invoked without arguments by the one-line call _colheads_(). The result is shown in Listing Six. The _colheads_ macro invokes the macro _zcolcentr_ 14 times (lines 153 to 166), to center headings over each column of data in the report. The _zcolcentr_ macro (not shown) converts the relative column number (parameter 1) to an absolute position and adjusts for the length of the string (parameter 2), so that the column heading will be centered over the data for that column. The conversion of column number to position is based on constants that were created by executing another macro repeatedly to calculate the position of each column from the position and width of the previous column.
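The _zcolcentr_ macro itself does not appear in the listings. Purely as a sketch of the idea (the _COL3START_ and _COL3WIDTH_ constants are stand-ins I've invented for the generated constants), a centering macro along these lines would produce output in the style of Listing Six:

dnl -- sketch only: center a heading over relative column $1
define(`_COL3START_', `48')dnl -- stand-in constants for this sketch
define(`_COL3WIDTH_', `13')dnl
define(`_zcolcentr_', `dnl
"$2" : column eval(_COL$1START_ + (_COL$1WIDTH_ - len($2)) / 2) $3:')dnl
_zcolcentr_(`3',`JANUARY')

Run through m4, the last line expands to "JANUARY" : column 51 :. Note that _COL$1START_ becomes the single token _COL3START_ after parameter substitution, which is the same trick Listing Eight uses to select a column constant by relative column number.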

The second example shows how we used M4 macros to get around limitations on the use of procedures in the database language. To meet performance specifications, we had to use indexes on searches. The database would do a very fast search if comparing an index key to a constant, but would scan the data set if comparing the index key to a variable. This meant that a variable would have to be passed as a string parameter to a macro if we wanted a fast search. But we could not use macros in compiled code and we needed compiled code to meet performance specifications.

Lines 384 to 398 of Listing Five define a macro, _getanswers_, that generates lines 90 to 96 of Listing Seven. Lines 408 to 427 define a macro, _finddata_, that repeats the first macro 12 times (see lines 415 to 426). For coding and testing, the result is the same as if we had written a procedure--we invoke 12 search operations with a line very similar to a procedure call: _finddata_(` 4831988',`32.5',`= 130',`= 1'). We were able to specify the search values in a way that meant something to the client, yet still have the search values concatenated with the month values into constant strings that allowed a fast search. Some of the programs required several of these calls, each of which generated 96 lines of compilable code from one line of code we wrote.

One of the errors I made in this project was to design excessively complicated macros because I did not fully understand M4. I fell into the trap of writing the macros as if they themselves were procedures. There are no procedures in M4, even though I used M4 macros to make building blocks that looked like procedures in the target language. When M4 expands a macro, the result is not returned to a caller; it is pushed back onto the input and scanned again before being added to the output text.

Listing Eight (page 107) shows a complex macro that was invoked by an outer macro that generated 60 to 110 lines of code from a single one-line call. The macro is invoked from within the outer macro by the call _flagtotal_(`x',`y',`z'), where z is a code that selects lines 258-262, 268-270, or 274-275 of the inner macro.

Listing Nine (page 107) shows the single macro split into three simpler macros with different names. If the selection code were the second parameter of the outer macro and had the value 0, we could invoke the appropriate inner macro with the call _flagtotal_$2_(`x',`y'), which would become _flagtotal_0_(`x',`y') when the outer macro was scanned and replaced. The result of the outer macro would be scanned again before being appended to the output text, and the new name would be recognized as a macro, causing further replacement by lines 9 to 13 of Listing Nine.
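A stripped-down illustration of that rescanning (the macro names and the placeholder text are invented, just to show the mechanism):

dnl -- invented names: build a macro name from parameter 2, then let m4 rescan it
define(`_total_0_', `save the month flag for $1')dnl
define(`_total_1_', `print the zero-suppressed total for $1')dnl
define(`_column_', `_total_$2_(`$1')')dnl
_column_(`JANUARY',`0')
_column_(`JANUARY',`1')

Expanding _column_(`JANUARY',`0') first produces the text _total_0_(`JANUARY'); when that text is rescanned, _total_0_ is recognized as a macro and replaced in turn, so the final output is "save the month flag for JANUARY".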

Either choice of macros generates the same code (see Listing Ten, page 107), but the macros in Listing Nine are far less inscrutable, which makes them the better choice.

Conclusion

The value of UNIX utilities and shell programming lies in the ability to construct useful tools with surprisingly little effort. The tools described here are not likely to be useful on many other projects, but I built them at far less expense than the savings they generated. This is typical--the cost of building tools is low enough that it is worthwhile to build a tool for use on only one project. The MKS Toolkit, by bringing UNIX utilities and shell programming to DOS, makes it possible to achieve similar savings in non-UNIX environments. And the availability of MS-DOS systems makes it possible to take your working environment just about anywhere.



_BUILDING A PORTABLE PROGRAMMING ENVIRONMENT_
by Ian E. Gorman


[LISTING ONE]


  1 :
  2 # Extract variable names from a MICS file description
  3 # Ian E. Gorman, Canadian Technology Marketing Group, Ottawa
  4
  5 # Input:
  6 #   A section of the MICS Data Dictionary describing the file and
  7 #   listing the variables.  For example, Section 4.4, The HARDVA
  8 #   (Hardware Device Activity) File.  The variables list is the last
  9 #   section of this file.
 10 # Output (tab-delimited fields):
 11 #   1   variable name
 12 #   2   flags, in format "XDWMYTEC" (See the MICS data dictionary)
 13 #   3   variable description
 14 #   The "C" flag is not described in MICS, and is the first letter
 15 #   of the most recent subcategory heading in the variable list
 16
 17 awk '
 18
 19     {   # remove the carriage control character
 20         gsub("^."," ")
 21         }
 22
 23     output_flag == 1    {   # beginning of variables list
 24         output_flag = 2
 25         }
 26
 27     /^  *--*  *--*  *--* *$/    {   # heading at top of variables list
 28         output_flag = 1
 29         next
 30         }
 31
 32     output_flag < 1 {   # have not reached variable list yet
 33         next
 34         }
 35
 36     /^ [A-Za-z]/    {   # Sub heading
 37         category = substr($1,1,1)
 38         next
 39         }
 40
 41     /^ *$/  {   # null line
 42         next
 43         }
 44
 45     {   # variable description line
 46         flags = sprintf("%-7.7s%1.1s", $1,category)
 47         gsub(" ",".",flags)
 48         title = ( substr( $0,index( $0,"-")+2 ) )
 49         printf("%-10.10s%-10.10s%s\n",$2,flags,title)
 50         }

 51
 52     '   "$@"    |
 53
 54     sort        # new order, ASCII sequence instead of EBCDIC







[LISTING TWO]


  1 :
  2 # 3270 command to download a PDS
  3 # Ian E. Gorman, Canadian Technology Marketing Group, Ottawa
  4
  5 # Transfers MVS PDS members as individual files to MS-DOS directory
  6 # Marks each received file as unchanged (turns off the archive bit)
  7 # Works only if all members are text
  8 # Requires a DOS file which contains a list of members to transfer
  9 # Member names are in field 1 of the file
 10 # You can get a list of members by using the TSO ISPF dataset list
 11 # (main menu option 3.4), choosing 'X' (print index listing), exiting
 12 # from ISPF, and noting the name of the saved listing file.  Download
 13 # this file, and delete all the lines which do not have member names.
 14
 15 if [ $# -ne 3 ]
 16 then
 17     echo "Usage: $0 DOSdir MVSpds MVSmemberlist"
 18     echo "  Download members in MVSmemberlist from MVSpds to DOSdir"
 19     echo "  Works only for text.  Requires full MVS name for MVSpds"
 20     exit 1
 21 fi
 22
 23 for i in `awk '{print $1}' $3`  # get filenames from a list of members
 24 do
 25     echo
 26     echo MVS="'"$2"($i)'" to DOS="$1/$i"
 27     if  c:/pc3270/receive "$1/$i" "'"$2"($i)'" ascii crlf   # download file
 28     then
 29         chmod a=rwx "$1/$i"         # if success, turn off archive bit
 30     else
 31         echo "\n$0: cannot continue"    # if fail, stop entire run
 32         exit 2
 33     fi
 34 done






[LISTING THREE]


  1 :
  2 # make a 'tags' file for MICS documentation
  3 # Ian E. Gorman, Canadian Technology Marketing Group, Ottawa
  4
  5 cd c:/mic
  6
  7 find c:/mic/doc -type f -print  |
  8
  9 awk -v 'OFS=    '   '
 10     {   basename = $1
 11         sub("^.*\/\$?","",basename)
 12         sub("\.[0-9]*","",basename)
 13         print toupper(basename),$1,"1"
 14         }
 15     '   |
 16
 17 sort '-t    '   >tags






[LISTING FOUR]


  1 :
  2 # crude form of hypertext for MICS documentation
  3 # Ian E. Gorman, Canadian Technology Marketing Group, Ottawa
  4
  5 # Uses the 'tags' feature of 'vi' to establish links from names to files.
  6 # Put cursor on a name, and hit ctrl-] to go to next file.
  7 # Tags file is in c:/mic
  8
  9 cd c:/mic
 10
 11 if [ $# -lt 1 ]
 12 then
 13     vi -r -c 'set ignorecase|1' index
 14 else
 15     vi -r -c 'set ignorecase|1' doc/1/cat1/$1.1
 16 fi






[LISTING FIVE]


  1 dnl mlnonmet.m4 -- m4 definitions for monthly ledgers, metals and nonmetals
  2 dnl Ian E. Gorman, Canadian Technology Marketing Group
***
***
144 dnl
145 dnl ------------------------------------------------------------------------
146 dnl
147 define(`_colheads_',`dnl -- column heads
148 dnl -- parameters: none
149 dnl When calling this macro, do not terminate the call with "dnl".
150 dnl The last line of this macro is left unterminated so that the line
151 dnl     can be terminated either with CR or backslash-CR in the macro call.
152 dnl
153 _zcolcentr_(`1',`RSN',`newline') \
154 _zcolcentr_(`2',`Company Name') \
155 _zcolcentr_(`3',`JANUARY') \
156 _zcolcentr_(`4',`FEBRUARY') \
157 _zcolcentr_(`5',`MARCH') \
158 _zcolcentr_(`6',`APRIL') \
159 _zcolcentr_(`7',`MAY') \
160 _zcolcentr_(`8',`JUNE') \
161 _zcolcentr_(`9',`JULY') \
162 _zcolcentr_(`10',`AUGUST') \
163 _zcolcentr_(`11',`SEPTEMBER') \
164 _zcolcentr_(`12',`OCTOBER') \
165 _zcolcentr_(`13',`NOVEMBER') \
166 _zcolcentr_(`14',`DECEMBER')dnl
167 ')dnl -- _colheads_()
168 dnl
169 dnl ------------------------------------------------------------------------
170 dnl
***

***
384 define(`_getanswers_',`dnl --   find Answers data
385 dnl -- parameters:
386 dnl 1   Qid and QvrsYr, e.g. ` 4501988'
387 dnl 2   QCellSect, e.g. `28.5'
388 dnl 3   QSrvyCensMn, e.g. ` 3'
389 dnl    Uses macros __ROWCONDITION__, __COLCONDITION__ defined by _finddata_
390 dnl
391 find all _SetAns_ \
392     where QCellSect   = "$1$3$2" and \
393           CurrentFlag = "Y" and \
394           QCellRow    __ROWCONDITION__ and \
395           QCellCol    __COLCONDITION__ and \
396           SrvyCensYr  = fa2000.ScYrFr -> _REPTNAME_()SetAns
397 ListData ()'dnl
398 )dnl -- _getanswers_()
399 dnl
400 dnl ------------------------------------------------------------------------
401 dnl
402 dnl -- parameters:
403 dnl 1   Qid and QvrsYr, e.g. ` 4501988'
404 dnl 2   QCellSect, e.g. `28.5'
405 dnl 3   condition for QCellRow, e.g. `in (80,90,100)'
406 dnl 4   condition for QCellCol, e.g. `= 2'
407 dnl
408 define(`_finddata_',`dnl -- find Answers data
409 dnl -- parameters: described in the comments above (lines 402 to 406)
410 dnl usually preceded by "find number" and followed by the name of new set
411 dnl
412 define(`__ROWCONDITION__',$3)dnl -- avoids the need to nest quotes on arguments
413 define(`__COLCONDITION__',$4)dnl
414 % Find data separately for each month, to allow use of an indexed field
415 _getanswers_(substr(x$1,1),$2,` 1')
416 _getanswers_(substr(x$1,1),$2,` 2')
417 _getanswers_(substr(x$1,1),$2,` 3')
418 _getanswers_(substr(x$1,1),$2,` 4')
419 _getanswers_(substr(x$1,1),$2,` 5')
420 _getanswers_(substr(x$1,1),$2,` 6')
421 _getanswers_(substr(x$1,1),$2,` 7')
422 _getanswers_(substr(x$1,1),$2,` 8')
423 _getanswers_(substr(x$1,1),$2,` 9')
424 _getanswers_(substr(x$1,1),$2,`10')
425 _getanswers_(substr(x$1,1),$2,`11')
426 _getanswers_(substr(x$1,1),$2,`12')'dnl
427 )dnl -- _finddata_()
428 dnl
429 dnl ------------------------------------------------------------------------
430 dnl






[LISTING SIX]

       Code That is Generated by the Macro Call  _colheads_()
Headings are Centered over the Corresponding Data in Each Column of Report

218     "RSN"                                  : column 6 newline: \
219     "Company Name"                         : column 26 : \
220     "JANUARY"                              : column 54 : \
221     "FEBRUARY"                             : column 68 : \
222     "MARCH"                                : column 85 : \
223     "APRIL"                                : column 100 : \
224     "MAY"                                  : column 116 : \
225     "JUNE"                                 : column 130 : \
226     "JULY"                                 : column 145 : \
227     "AUGUST"                               : column 159 : \
228     "SEPTEMBER"                            : column 173 : \
229     "OCTOBER"                              : column 189 : \
230     "NOVEMBER"                             : column 203 : \
231     "DECEMBER"                             : column 218 : \






[LISTING SEVEN]

       Code That is Generated by the Macro Call
                           _finddata_(` 4831988',`32.5',`= 130',`= 1')
       Lines 104 to 166 are nine more repetitions of similar code
   The repetitions differ only in the "where" lines (compare 91 to 98)

 89 % Find data separately for each month, to allow use of an indexed field
 90 find all Answers  \
 91     where QCellSect   = " 4831988 132.5" and \
 92           CurrentFlag = "Y" and \
 93           QCellRow    = 130 and \
 94           QCellCol    = 1 and \
 95           SrvyCensYr  = fa2000.ScYrFr -> p2029SetAns
 96 ListData ()
 97 find all Answers  \
 98     where QCellSect   = " 4831988 232.5" and \
 99           CurrentFlag = "Y" and \
100           QCellRow    = 130 and \
101           QCellCol    = 1 and \
102           SrvyCensYr  = fa2000.ScYrFr -> p2029SetAns
103 ListData ()
***
***
167 find all Answers  \
168     where QCellSect   = " 48319881232.5" and \
169           CurrentFlag = "Y" and \
170           QCellRow    = 130 and \
171           QCellCol    = 1 and \
172           SrvyCensYr  = fa2000.ScYrFr -> p2029SetAns
173 ListData ()







[LISTING EIGHT]

      Example of Macro That Is Too Complicated -- Compare to Listing 9
     Lines longer than 80 characters have been wrapped at 80 characters

  1 dnl mlnonmet.m4 -- m4 definitions for monthly ledgers, metals and nonmetals
  2 dnl Ian E. Gorman, Canadian Technology Marketing Group
******
243 dnl
244 dnl ------------------------------------------------------------------------
245 dnl
246 define(`_flagtotal_',`dnl --    include a set of flag totals in a report column
247 dnl -- parameters:
248 dnl 1   Relative column number on the page
249 dnl 2   Column ID (month)
250 dnl 3   selection:  0 ==> save flag
251 dnl             1 ==> print zero-suppressed total, print flag
252 dnl             2 ==> print total without zero suppress
253 dnl Parameter 3 selects one of three ifelse statements for expansion
254 dnl This macro generates code that both prints the total and saves the
255 dnl total in a local variable for later use.
256 dnl
257 ifelse($3,`0',`dnl -- save flag
258     (let lValueTypes = \
259          { $concat($tochar(lValueTypes,decr($2)), \
260                  $tochar(ValueTypeFlag,1), \
261                  $substring(lValueTypes,incr($2),80) ) \
262            where SrvyCensMn = $2, lValueTypes } ) : noprint :dnl
263 ')dnl -- close _ifelse_($3,`0', ... )
264 dnl
265 ifelse($3,`1',`dnl -- print zero-suppress total and one flag from lValueTypes
266 dnl note that required column offset depends on length of mask
267 dnl note that new program line must be a continuation of previous program line
268     $tonumber((let lCumulTot_$2 = $total(NumericValue where SrvyCensMn=$2)),0) \
269                                         :column incr(_COL$1START_) mask "ZZZ,ZZZ,ZZ9": \
270         $tochar($substring(lValueTypes,$2,1),1) : column eval(_COL$1START_+13) :dnl
271 ')dnl -- close _ifelse_($3,`1', ... )
272 dnl
273 ifelse($3,`2',`dnl -- print total without zero-suppress
274     $tonumber((let lCumulTot_$2 = $total(NumericValue where SrvyCensMn=$2)),0) \
275                                         :column incr(_COL$1START_) mask "ZZZ,ZZZ,ZZ9":dnl
276 ')dnl -- close ifelse($3,`2', ... )
277 dnl
278 ')dnl -- close _flagtotal_()
279 dnl
280 dnl ------------------------------------------------------------------------
281 dnl





[LISTING NINE]

   Simplification of the Macro in Listing 8, by Splitting into Three Macros
     Lines longer than 80 characters have been wrapped at 80 characters

  1 dnl
  2 dnl ------------------------------------------------------------------------
  3 dnl
  4 define(`_flagtotal_0_',`dnl -- total 0, save a flag for use with total 1
  5 dnl -- parameters:
  6 dnl   1   Relative column number on the page
  7 dnl   2   Column ID (month)
  8 dnl
  9     (let lValueTypes = \
 10          { $concat($tochar(lValueTypes,decr($2)), \
 11                   $tochar(ValueTypeFlag,1), \
 12                  $substring(lValueTypes,incr($2),80) ) \
 13            where SrvyCensMn = $2, lValueTypes } ) : noprint :dnl
 14 ')dnl
 15 dnl
 16 dnl ------------------------------------------------------------------------
 17 dnl
 18 define(`_flagtotal_1_',`dnl -- print zero-suppress total and one flag from lValueTypes
 19 dnl -- parameters:
 20 dnl   1   Relative column number on the page
 21 dnl   2   Column ID (month)
 22 dnl
 23 dnl   note that required column offset depends on length of mask
 24 dnl   note that new program line must be a continuation of previous program line
 25     $tonumber((let lCumulTot_$2 = $total(NumericValue where SrvyCensMn=$2)),0) \
 26                                         :column incr(_COL$1START_) mask "ZZZ,ZZZ,ZZ9": \
 27         $tochar($substring(lValueTypes,$2,1),1) : column eval(_COL$1START_+13) :dnl
 28 ')dnl
 29 dnl
 30 dnl ------------------------------------------------------------------------
 31 dnl
 32 define(`_flagtotal_2_',`dnl -- print total without zero-suppress
 33 dnl -- parameters:
 34 dnl   1   Relative column number on the page
 35 dnl   2   Column ID (month)
 36 dnl
 37     $tonumber((let lCumulTot_$2 = $total(NumericValue where SrvyCensMn=$2)),0) \
 38                                         :column incr(_COL$1START_) mask "ZZZ,ZZZ,ZZ9":dnl
 39 ')dnl
 40 dnl
 41 dnl ------------------------------------------------------------------------
 42 dnl






[LISTING 10 IS CURRENTLY UNAVAILABLE]












Copyright © 1993, Dr. Dobb's Journal
