any pointers to using COBOL for OS/390 TEST(,,SEPARATE) 
Author Message
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)

First off, let me say that I don't expect to ever see the Debug Tool.

The decision to decouple it from the compilers as a separate cost product
under LE means that it will never be bought for my site. COBTEST and
PLITEST were great, I even combined them with IMS BTS to do interactive
source code level debugging of IMS MPPs, alternating between simulated
IMS screens, BTS command line mode screens, and COBTEST/PLITEST screens
with adjustable areas for displaying source, variables being monitored,
and the PLITEST/COBTEST command log. Who needed Exediter, etc. I don't
expect to ever see that again.

I'm only interested in the new LE abend handling functions which are
independent of the Debug Tool.

The SEPARATE option seems like something to look at for IMS Message
Processing Region, to reduce load module bloat in address spaces where
many different OLTP programs are trying to stay around for reuse. CICS
regions are also notorious for going Short On Storage.

To me the simple solution would have been to specify a DDname to be used
for a library containing members written to SYSDEBUG while compiling.
So far I haven't been able to find any mention of one.

The manuals say that the name of the "separate debug data set" is
recorded in the load module, which seems fine for first level
debugging, but is useless once the source, object code, DB2 Database
Request Modules, etc. get migrated to a Unit Test, Integration test,
or Production environment. In the worst case a production abend could
end up trying to use a new debug file for a new version still under
development.

The debugging manual suggests that the debug tool would prompt for
debug dataset name if it can't find it, and also has some suggestions
about overriding defaults when doing batch testing.

Neither of these suggestions seems to be of any use for using a Separate
Debug Library Data Set member to get enhanced information in routine
production batch where the library names are all different from the
ones used for development.

We use separate High Level Indexes for each migration level, to help
to avoid having unauthorized staff promote source and object code.
Our philosophy is to migrate what was tested at lower levels, not to
recompile it and hope that it behaves the same as what was tested.
--



Sun, 12 Dec 2004 09:10:20 GMT  
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)
1) COBTEST was a "cost-added" feature for VS COBOL II since (at least) the
late 80's.  So there is nothing new in this.

2)  I don't understand WHAT use you think you will ever get out of SYM
information (in the same load module or in a separate dataset) *if* you
don't use the Debug Tool?  Can you elaborate?

  ***

If you look at:
 http://publib.boulder.ibm.com/cgi-bin/bookmgr/BOOKS/igypg205/2.4.52?

it says,

"There are two ways that the TEST option can improve your formatted dumps
from Language Environment:

Use the NOSYM suboption of TEST to have line numbers in the dump that
indicate the failing statement, rather than just an offset.

Use the SYM suboption of TEST to have the values of the program variables
listed in the dump.

With NOTEST, the dump will not have program variables and will not have a
line number for the failing statement. "

Is this what you are after?  If so, I assume that you know that using any
value but NOTEST *forces* NOOPT on - and I would NEVER recommend that for
production code.
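
For reference, here is roughly how those suboptions get coded on a CBL
(PROCESS) statement or in the compile PARM - the spellings here are from
memory, so check them against the Programming Guide for your release:

   CBL TEST(NONE,NOSYM)            statement/line numbers in the LE dump
   CBL TEST(NONE,SYM)              statement numbers plus variable values
   CBL TEST(NONE,SYM,SEPARATE)     as above, but the SYM data is written to
                                   SYSDEBUG instead of the object module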

--
Bill Klein
 wmklein <at> ix.netcom.com


Quote:
> First off, let me say that I don't expect to ever see the Debug Tool.

> The decision to decouple it from the compilers as a separate cost product
> under LE means that it will never be bought for my site. COBTEST and
> PLITEST were great, I even combined them with IMS BTS to do interactive
> source code level debugging of IMS MPPs, alternating between simulated
> IMS screens, BTS command line mode screens, and COBTEST/PLITEST screens
> with adjustable areas for displaying source, variables being monitored,
> and the PLITEST/COBTEST command log. Who needed Exediter, etc. I don't
> expect to ever see that again.

> I'm only interested in the new LE abend handling functions which are
> independent of the Debug Tool.

> The SEPARATE option seems like something to look at for IMS Message
> Processing Region, to reduce load module bloat in address spaces where
> many different OLTP programs are trying to stay around for reuse. CICS
> regions are also notorious for going Short On Storage.

> To me the simple solution would have been to specify a DDname to be used
> for a library containing members written to SYSDEBUG while compiling.
> So far I haven't been able to find any mention of one.

> The manuals say that the name of the "separate debug data set" is
> recorded in the load module, which seems fine for first level
> debugging, but is useless once the source, object code, DB2 Database
> Request Modules, etc. get migrated to a Unit Test, Integration test,
> or Production environment. In the worst case a production abend could
> end up trying to use a new debug file for a new version still under
> development.

> The debugging manual suggests that the debug tool would prompt for
> debug dataset name if it can't find it, and also has some suggestions
> about overriding defaults when doing batch testing.

> Neither of these suggestions seems to be of any use for using a Separate
> Debug Library Data Set member to get enhanced information in routine
> production batch where the library names are all different from the
> ones used for development.

> We use separate High Level Indexes for each migration level, to help
> to avoid having unauthorized staff promote source and object code.
> Our philosophy is to migrate what was tested at lower levels, not to
> recompile it and hope that it behaves the same as what was tested.
> --



Sun, 12 Dec 2004 21:00:50 GMT  
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)
Actually, SEPARATE is a sub-option of the TEST compile-time option.  Does your
current compile process use the TEST option for production code?

You can specify exactly what library you want the file to be placed in with the
SYSDEBUG DD statement.

For example:    //SYSDEBUG  DD  DISP=SHR,DSN=DAVIN6.PDPAK.SIDEFILE(ANYPROG)
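
In context, a full compile step might look something like the following -
the program name IGYCRCTL is the COBOL for OS/390 compiler, but the data set
names and the option string here are only illustrative:

   //COB      EXEC PGM=IGYCRCTL,PARM='TEST(NONE,SYM,SEPARATE)'
   //STEPLIB  DD  DISP=SHR,DSN=IGY.SIGYCOMP
   //SYSIN    DD  DISP=SHR,DSN=DAVIN6.SOURCE.COBOL(ANYPROG)
   //SYSLIN   DD  DISP=SHR,DSN=DAVIN6.OBJECT.LIB(ANYPROG)
   //SYSDEBUG DD  DISP=SHR,DSN=DAVIN6.PDPAK.SIDEFILE(ANYPROG)
   //SYSPRINT DD  SYSOUT=*
   //SYSUT1   DD  UNIT=SYSDA,SPACE=(CYL,(1,1))
   //*            ... plus the remaining SYSUTn work files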

And, as you noted, when you use the TEST compile-time option, the compiler
stores the name of the debug file in the object module.  There are several
methods for pointing to the appropriate library when you need to.  One
recommendation is to make certain that your SCM (software change management)
system is updated to move the sidefile to production along with the load module.
However, the ability to access the contents of the sidefile is only applicable to
and used by the Debug Tool.

If you want enhanced abend detection, then you should really be looking at IBM's
Fault Analyzer.

For more information about both of these products, please take a look at a
recent IBM Redbook:  Introduction to the IBM Problem Determination Tools
(SG24-6296).

A copy can be found at
http://publib-b.boulder.ibm.com/Redbooks.nsf/RedbookAbstracts/sg24629...

Larry Kahm



Sun, 12 Dec 2004 23:18:29 GMT  
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)

Quote:

> Actually, SEPARATE is a sub-option of the TEST compile-time option.  Does your
> current compile process use the TEST option for production code?

> You can specify exactly what library you want the file to be placed in with the
> SYSDEBUG DD statement.

> For example:    //SYSDEBUG  DD  DISP=SHR,DSN=DAVIN6.PDPAK.SIDEFILE(ANYPROG)

Found that one a while back. Too bad someone didn't think of using a similar
DDname at run time to specify a concatenation of separate debug file libraries.

Quote:

> And, as you noted, when you use the TEST compile-time option, the compiler
> stores the name of the debug file in the object module.  There are several
> methods for pointing to the appropriate library when you need to.  One
> recommendation is to make certain that your SCM (software change management)
> system is updated to move the sidefile to production along with the load module.
> However, the ability to access the contents of the sidefile is only applicable to
> and used by the Debug Tool.

So SEPARATE is really not of any practical use in production IMS MPPs
and CICS programs, which are the ones where SEPARATE would probably have
the most value.

Quote:

> If you want enhanced abend detection, then you should really be looking at IBM's
> Fault Analyzer.

I'm not so ambitious; I just want to stop having to show COBOL developers
and problem analysts how to use the pseudo-assembler listing and the reported
offsets to find the failing instruction.

For PL/I the GOSTMT option does the same, and getting them to use that
reduced the traffic stream to my (DB Admin) desk.

I was hoping that the formatted variable portion would help to avoid the
BLL nightmare. I've used at least 8 IBM COBOL compilers, from mainframe
DOS and VS2 back in 1977 up to LE COBOL.

Quote:

> For more information about both of these products, please take a look at a
> recent IBM Redbook:  Introduction to the IBM Problem Determination Tools
> (SG24-6296).

> A copy can be found at
> http://publib-b.boulder.ibm.com/Redbooks.nsf/RedbookAbstracts/sg24629...

> Larry Kahm

--


Mon, 13 Dec 2004 09:13:54 GMT  
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)

Quote:

> First off, let me say that I don't expect to ever see the Debug Tool.

> The decision to decouple it from the compilers as a separate cost product
> under LE means that it will never be bought for my site. COBTEST and
> PLITEST were great, I even combined them with IMS BTS to do interactive
> source code level debugging of IMS MPPs, alternating between simulated
> IMS screens, BTS command line mode screens, and COBTEST/PLITEST screens
> with adjustable areas for displaying source, variables being monitored,
> and the PLITEST/COBTEST command log. Who needed Exediter, etc. I don't
> expect to ever see that again.

If your shop has COBTEST and PLITEST, it paid higher license fees for each
compiler to get them.  With Debug Tool, you would just order one of the
compilers as a "full function" version, and the Debug Tool would be
included.  The nice thing is that it supports both COBOL and PL/I (plus
C/C++, if anyone cares).  So it's three for the price of one, and having
learned it for COBOL, you'd know most of what you need to know to use it for
another language.  (I hope IBM folds in Assembler Language support someday, too.
They tell me that others have requested this as well.)

Quote:

> I'm only interested in the new LE abend handling functions which are
> independent of the Debug Tool.

> The SEPARATE option seems like something to look at for IMS Message
> Processing Region, to reduce load module bloat in address spaces where
> many different OLTP programs are trying to stay around for reuse. CICS
> regions are also notorious for going Short On Storage.

> To me the simple solution would have been to specify a DDname to be used
> for a library containing members written to SYSDEBUG while compiling.
> So far I haven't been able to find any mention of one.

In the COBOL for OS/390 & VM V2R2 Programming Guide, that is exactly what is
offered.  DDname SYSDEBUG may define a sequential or partitioned data set
(including PDSE), of LRECL 80 to 1024 bytes, RECFM=FB.
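
A sidefile library matching that description could be allocated once per
migration level with something along these lines (the names, sizes, and the
choice of a PDSE are only illustrative):

   //ALLOC    EXEC PGM=IEFBR14
   //SIDEFILE DD  DSN=DEVL.PDPAK.SIDEFILE,DISP=(NEW,CATLG),
   //             DSNTYPE=LIBRARY,RECFM=FB,LRECL=1024,
   //             SPACE=(CYL,(10,10)),UNIT=SYSDA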

Quote:

> The manuals say that the name of the "separate debug data set" is
> recorded in the load module, which seems fine for first level
> debugging, but is useless once the source, object code, DB2 Database
> Request Modules, etc. get migrated to a Unit Test, Integration test,
> or Production environment. In the worst case a production abend could
> end up trying to use a new debug file for a new version still under
> development.

You could recompile for production, after all testing is complete, and
specify the "production" PDSE as the SYSDEBUG DSName.  If you migrate upward
using a Change Control tool like CA-Endevor, you could specify that the tool
should "generate" (meaning compile and program bind) the COBOL at the lowest
level, and again at the highest level, allowing you to keep "production"
SYMbol files inviolate until a new production load module is created.  The
recompile isn't dangerous in this case (my opinion, of course), because
you're under control of the same procedures at each level, and the compile
would use exactly the same options, etc.  It's a matter of being able to work
around the limitations of the feature.
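
Put another way, the generate step would be identical at every level except
for which sidefile library SYSDEBUG points at - something like this, with the
high-level qualifiers invented to match the one-HLQ-per-level scheme quoted
below:

   //* unit test level
   //SYSDEBUG DD  DISP=SHR,DSN=UNIT.PDPAK.SIDEFILE(ANYPROG)
   //* production level
   //SYSDEBUG DD  DISP=SHR,DSN=PROD.PDPAK.SIDEFILE(ANYPROG)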

Quote:

> The debugging manual suggests that the debug tool would prompt for
> debug dataset name if it can't find it, and also has some suggestions
> about overriding defaults when doing batch testing.

> Neither of these suggestions seems to be of any use for using a Separate
> Debug Library Data Set member to get enhanced information in routine
> production batch where the library names are all different from the
> ones used for development.

> We use separate High Level Indexes for each migration level, to help
> to avoid having unauthorized staff promote source and object code.
> Our philosophy is to migrate what was tested at lower levels, not to
> recompile it and hope that it behaves the same as what was tested.
> --

If I were you, I'd campaign with my management for the "right tools to do the
job".  Be sure you outline the advantages of having the Debug Tool, and of
using the TEST option.  Oftentimes, managers can be brought around to
agreeing with their technicians.  Then, they'll push the request upward for
you to the guy who approves spending the dough.


Tue, 14 Dec 2004 05:11:18 GMT  
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)
Bill,
I tried to reply once already; I hope this doesn't show up twice.

SYM information is useful even without Debug Tool, because having it at the
time of an abend results in the source line number where an abend occurred
appearing in the message file (SYSOUT) and in CEEDUMP.  (This is in addition to
the offset into the program.)

Then, CEEDUMP generation will include a formatted listing of the abending
program's Data Division.  Without this output, I consider CEEDUMP basically
useless to most programmers.  With it, I consider that a SYSUDUMP is not
needed, and shops that have Abend-Aid, DumpMaster, or other such tools, are
close to wasting their money.  Programmers can usually solve their problems
with these two items.
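
For what it's worth, picking that up in a batch job needs nothing more exotic
than the LE message file and CEEDUMP DD statements plus a runtime option -
roughly like this, with the program and library names invented, and the
TERMTHDACT details worth double-checking for your LE level:

   //RUN      EXEC PGM=ANYPROG,PARM='/TERMTHDACT(DUMP)'
   //STEPLIB  DD  DISP=SHR,DSN=PROD.LOAD.LIB
   //SYSOUT   DD  SYSOUT=*      LE message file (failing statement number)
   //CEEDUMP  DD  SYSOUT=*      formatted dump, Data Division included with SYM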
Colin

Quote:

> 1) COBTEST was a "cost-added" feature for VS COBOL II since (at least) the
> late 80's.  So there is nothing new in this.

> 2)  I don't understand WHAT use you think you will ever get out of SYM
> information (in the same load module or in a separate dataset) *if* you
> don't use the Debug Tool?  Can you elaborate?

>   ***

> If you look at:
>  http://publib.boulder.ibm.com/cgi-bin/bookmgr/BOOKS/igypg205/2.4.52?

> it says,

> "There are two ways that the TEST option can improve your formatted dumps
> from Language Environment:

> Use the NOSYM suboption of TEST to have line numbers in the dump that
> indicate the failing statement, rather than just an offset.

> Use the SYM suboption of TEST to have the values of the program variables
> listed in the dump.

> With NOTEST, the dump will not have program variables and will not have a
> line number for the failing statement. "

> Is this what you are after?  If so, I assume that you know that using any
> value but NOTEST *forces* NOOPT on - and I would NEVER recommend that for
> production code.

> --
> Bill Klein
>  wmklein <at> ix.netcom.com


> > First off, let me say that I don't expect to ever see the Debug Tool.

> > The decision to decouple it from the compilers as a separate cost product
> > under LE means that it will never be bought for my site. COBTEST and
> > PLITEST were great, I even combined them with IMS BTS to do interactive
> > source code level debugging of IMS MPPs, alternating between simulated
> > IMS screens, BTS command line mode screens, and COBTEST/PLITEST screens
> > with adjustable areas for displaying source, variables being monitored,
> > and the PLITEST/COBTEST command log. Who needed Exediter, etc. I don't
> > expect to ever see that again.

> > I'm only interested in the new LE abend handling functions which are
> > independent of the Debug Tool.

> > The SEPARATE option seems like something to look at for IMS Message
> > Processing Region, to reduce load module bloat in address spaces where
> > many different OLTP programs are trying to stay around for reuse. CICS
> > regions are also notorious for going Short On Storage.

> > To me the simple solution would have been to specify a DDname to be used
> > for a library containing members written to SYSDEBUG while compiling.
> > So far I haven't been able to find any mention of one.

> > The manuals say that the name of the "separate debug data set" is
> > recorded in the load module, which seems fine for first level
> > debugging, but is useless once the source, object code, DB2 Database
> > Request Modules, etc. get migrated to a Unit Test, Integration test,
> > or Production environment. In the worst case a production abend could
> > end up trying to use a new debug file for a new version still under
> > development.

> > The debugging manual suggests that the debug tool would prompt for
> > debug dataset name if it can't find it, and also has some suggestions
> > about overriding defaults when doing batch testing.

> > Neither of these suggestions seems to be of any use for using a Separate
> > Debug Library Data Set member to get enhanced information in routine
> > production batch where the library names are all different from the
> > ones used for development.

> > We use separate High Level Indexes for each migration level, to help
> > to avoid having unauthorized staff promote source and object code.
> > Our philosophy is to migrate what was tested at lower levels, not to
> > recompile it and hope that it behaves the same as what was tested.
> > --



Tue, 14 Dec 2004 05:18:49 GMT  
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)
Thank you  Colin.

I also wanted to acknowledge a correction to my original note sent to me by
private email.

If you specify *either* NOTEST or TEST(NONE ...) - then you may still use
the OPT compiler option.

--
Bill Klein
 wmklein <at> ix.netcom.com

Quote:
> Bill,
> I tried to reply once already; I hope this doesn't show up twice.

> SYM information is useful even without Debug Tool, because having it at the
> time of an abend results in the source line number where an abend occurred
> appearing in the message file (SYSOUT) and in CEEDUMP.  (This is in addition
> to the offset into the program.)

> Then, CEEDUMP generation will include a formatted listing of the abending
> program's Data Division.  Without this output, I consider CEEDUMP basically
> useless to most programmers.  With it, I consider that a SYSUDUMP is not
> needed, and shops that have Abend-Aid, DumpMaster, or other such tools, are
> close to wasting their money.  Programmers can usually solve their problems
> with these two items.
> Colin


> > 1) COBTEST was a "cost-added" feature for VS COBOL II since (at least) the
> > late 80's.  So there is nothing new in this.

> > 2)  I don't understand WHAT use you think you will ever get out of SYM
> > information (in the same load module or in a separate dataset) *if* you
> > don't use the Debug Tool?  Can you elaborate?

> >   ***

> > If you look at:
> >  http://publib.boulder.ibm.com/cgi-bin/bookmgr/BOOKS/igypg205/2.4.52?

> > it says,

> > "There are two ways that the TEST option can improve your formatted dumps
> > from Language Environment:

> > Use the NOSYM suboption of TEST to have line numbers in the dump that
> > indicate the failing statement, rather than just an offset.

> > Use the SYM suboption of TEST to have the values of the program variables
> > listed in the dump.

> > With NOTEST, the dump will not have program variables and will not have a
> > line number for the failing statement. "

> > Is this what you are after?  If so, I assume that you know that using any
> > value but NOTEST *forces* NOOPT on - and I would NEVER recommend that for
> > production code.

> > --
> > Bill Klein
> >  wmklein <at> ix.netcom.com


> > > First off, let me say that I don't expect to ever see the Debug Tool.

> > > The decision to decouple it from the compilers as a separate cost product
> > > under LE means that it will never be bought for my site. COBTEST and
> > > PLITEST were great, I even combined them with IMS BTS to do interactive
> > > source code level debugging of IMS MPPs, alternating between simulated
> > > IMS screens, BTS command line mode screens, and COBTEST/PLITEST screens
> > > with adjustable areas for displaying source, variables being monitored,
> > > and the PLITEST/COBTEST command log. Who needed Exediter, etc. I don't
> > > expect to ever see that again.

> > > I'm only interested in the new LE abend handling functions which are
> > > independent of the Debug Tool.

> > > The SEPARATE option seems like something to look at for IMS Message
> > > Processing Region, to reduce load module bloat in address spaces where
> > > many different OLTP programs are trying to stay around for reuse. CICS
> > > regions are also notorious for going Short On Storage.

> > > To me the simple solution would have been to specify a DDname to be used
> > > for a library containing members written to SYSDEBUG while compiling.
> > > So far I haven't been able to find any mention of one.

> > > The manuals say that the name of the "separate debug data set" is
> > > recorded in the load module, which seems fine for first level
> > > debugging, but is useless once the source, object code, DB2 Database
> > > Request Modules, etc. get migrated to a Unit Test, Integration test,
> > > or Production environment. In the worst case a production abend could
> > > end up trying to use a new debug file for a new version still under
> > > development.

> > > The debugging manual suggests that the debug tool would prompt for
> > > debug dataset name if it can't find it, and also has some suggestions
> > > about overriding defaults when doing batch testing.

> > > Neither of these suggestions seems to be of any use for using a Separate
> > > Debug Library Data Set member to get enhanced information in routine
> > > production batch where the library names are all different from the
> > > ones used for development.

> > > We use separate High Level Indexes for each migration level, to help
> > > to avoid having unauthorized staff promote source and object code.
> > > Our philosophy is to migrate what was tested at lower levels, not to
> > > recompile it and hope that it behaves the same as what was tested.
> > > --



Tue, 14 Dec 2004 06:14:31 GMT  
 any pointers to using COBOL for OS/390 TEST(,,SEPARATE)
Bill,
SYM data is also useful if a program abends.  It causes the failing statement
number to print in both the message file and in CEEDUMP.  In addition, CEEDUMP
will produce a formatted dump of the Data Division.  (Without that, CEEDUMP
seems almost totally useless to me!)
Quote:

> 1) COBTEST was a "cost-added" feature for VS COBOL II since (at least) the
> late 80's.  So there is nothing new in this.

> 2)  I don't understand WHAT use you think you will ever get out of SYM
> information (in the same load module or in a separate dataset) *if* you
> don't use the Debug Tool?  Can you elaborate?

>   ***

> If you look at:
>  http://publib.boulder.ibm.com/cgi-bin/bookmgr/BOOKS/igypg205/2.4.52?

> it says,

> "There are two ways that the TEST option can improve your formatted dumps
> from Language Environment:

> Use the NOSYM suboption of TEST to have line numbers in the dump that
> indicate the failing statement, rather than just an offset.

> Use the SYM suboption of TEST to have the values of the program variables
> listed in the dump.

> With NOTEST, the dump will not have program variables and will not have a
> line number for the failing statement. "

> Is this what you are after?  If so, I assume that you know that using any
> value but NOTEST *forces* NOOPT on - and I would NEVER recommend that for
> production code.

> --
> Bill Klein
>  wmklein <at> ix.netcom.com


> > First off, let me say that I don't expect to ever see the Debug Tool.

> > The decision to decouple it from the compilers as a separate cost product
> > under LE means that it will never be bought for my site. COBTEST and
> > PLITEST were great, I even combined them with IMS BTS to do interactive
> > source code level debugging of IMS MPPs, alternating between simulated
> > IMS screens, BTS command line mode screens, and COBTEST/PLITEST screens
> > with adjustable areas for displaying source, variables being monitored,
> > and the PLITEST/COBTEST command log. Who needed Exediter, etc. I don't
> > expect to ever see that again.

> > I'm only interested in the new LE abend handling functions which are
> > independent of the Debug Tool.

> > The SEPARATE option seems like something to look at for IMS Message
> > Processing Region, to reduce load module bloat in address spaces where
> > many different OLTP programs are trying to stay around for reuse. CICS
> > regions are also notorious for going Short On Storage.

> > To me the simple solution would have been to specify a DDname to be used
> > for a library containing members written to SYSDEBUG while compiling.
> > So far I haven't been able to find any mention of one.

> > The manuals say that the name of the "separate debug data set" is
> > recorded in the load module, which seems fine for first level
> > debugging, but is useless once the source, object code, DB2 Database
> > Request Modules, etc. get migrated to a Unit Test, Integration test,
> > or Production environment. In the worst case a production abend could
> > end up trying to use a new debug file for a new version still under
> > development.

> > The debugging manual suggests that the debug tool would prompt for
> > debug dataset name if it can't find it, and also has some suggestions
> > about overriding defaults when doing batch testing.

> > Neither of these suggestions seems to be of any use for using a Separate
> > Debug Library Data Set member to get enhanced information in routine
> > production batch where the library names are all different from the
> > ones used for development.

> > We use separate High Level Indexes for each migration level, to help
> > to avoid having unauthorized staff promote source and object code.
> > Our philosophy is to migrate what was tested at lower levels, not to
> > recompile it and hope that it behaves the same as what was tested.
> > --



Tue, 14 Dec 2004 04:45:32 GMT  
 