Thursday, April 4, 2024

Did you know Synon 2E has a Full Screen Mode?

Long-time IBM i users typically work with an expanded WRKACTJOB screen shortly after learning all the command keys that are of use.  As a junior developer (back in the early nineties), I was always in awe of the seniors around me who didn't need to toggle through the command key options whilst managing the jobs running on the system.




As always though, once you had learnt the system, you could press F21 to remove the subfile options and command key text, thus freeing up more room for the critical data on these somewhat limited 5250 displays.

If I remember rightly, the command key used to bring up a window and you could choose between Basic (might have been Intermediate) and Advanced mode.  This has long since been removed from the OS and the command key merely acts as a toggle nowadays.  The reward of course was the ability to see more of your processes and interactive sessions.

Here is my current 'Pro' setup lol.





Synon, aka 2E with its various prefixes over the years (Cool:, AllFusion, Advantage, CA, etc.), also has the ability for a user to see more on selected screens.  Here are a couple that might be of interest.

Action Diagram Expanded vs Normal view.





Most developers could survive with this, as virtually all worth their salt will know every option and command key off the top of their head anyhow.

A lesser-known scenario, and also one where it is a bit harder to remember all the options, is the ability to extend the records shown when viewing a model list.




Personally, I don't have this one switched on but kudos to anyone out there that does!

So how do you set these values?

From the main Services Menu, take option 11 for Edit Model Profile (YEDTMDLPRF) or, as the menu implies, execute the command directly.  Then set the options accordingly.
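
If you prefer the command line, here is a minimal sketch (assuming your library list already points at the model; check the prompt with F4 for any parameters your release expects):

YEDTMDLPRF

Then amend the action diagram and model list display options on the profile screen and press Enter to save.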




You're welcome.
Thanks for reading.
Lee.

Sunday, March 17, 2024

They say a picture is worth a 1000 words!

A thousand?........... In this instance it is more like tens of thousands.

Anyhow, a colleague was talking about tag clouds the other day and it got me thinking.

"What would this blogs labels look like in a tag cloud?"

So......... I popped online and googled for a tag cloud generator and discovered this site that looked pretty good.

https://simplewordcloud.com/ (Simple Word Cloud Generator)

After some Excel wizardry to generate the source data (roughly translated: a screen scrape, removing some spaces, duplicating entries relative to each label's usage count on the blog, copying out the formatted data, etc.), the input was ready.

I excitedly 'cut n pasted' it into the input field on the website and clicked the button.

Results as below.


Thanks for reading.
Lee.

Thursday, March 7, 2024

Understanding CONstant usages in 2e.

One of my colleagues was trying to track down some (potentially) hardcoded string values/literals.  She had already used the SQL queries in one of my previous posts to narrow down the usages inside messages; however, she was also concerned that there might be some underlying hardcoding via CND and, more specifically, via CON values.

As an aside, IMHO there should be next to no CON usages in a well-architected model apart from values like *BLANK, 0.00 and some tolerance towards values like 12 (months), 52 (weeks), 365 (days), 100 (percentage).  I'd even accept 1, 2, 3, 9999999 etc.

I draw the line at meaningful literals that could have been scoped conditions (CND) or database values.

My main reason for this is that there isn't a way (via the 2E tool interface) to check for usages of CON values.  Obviously, we can scan the generated source, but then what about all the genuine field names or comments?

Back to 2E!  When generating the source, the model somehow lets the generator know about them and adds them into the code (inline) or as part of a data structure.  Take a look at the bottom of a source listing if you don't believe me.

Here is a sample function using some constants.  Note the 'Create Flatfile' call is passing in 'FLAT FILE CoNsTANT' as CON.


And here are the relevant code snippets of generated source, i.e. the moves.




Followed by the structure of CON values at the bottom.  Note: Sometimes the CON is generated 'inline' at the point of the MOVE.



Anyhow, in order for the CON values to be consistently applied, they must be in the 2E model data files somewhere, with the theory being we should also be able to query them.  As CON can only be applied via action diagram text, let's start by looking there in the YMSGACTRFP file.



It looks like we have some constants referred to in field ELMTTL ** (Element Title - highlighted RED).  This could be a good start but there are some obvious limitations. Whilst it looks like it covers basic *MOVE and *CONCAT field assignments, when we have hardcoded values going into a function call (PURPLE), the CON value isn't reflected in the ELMTTL field.  

**It would appear that the ELMTTL data is used for the AD layout.  Perhaps I will expand more on the complexities of this data in a future post!!!

However, it looks like 2E maintains a surrogate reference for each unique CON value used in the AD regardless of AD syntax used.  (GREEN)

This is awesome, but how are these linked via the function AD?  

To do this, let's take a gander at a subset of the 2E files in the model library.  The one highlighted below (YCONDTARFP) looks like it could be useful and is most likely a file most devs didn't even realise existed.  Hands up, who has explored the intricacies of the model schema?


What does the data look like in the table?


Bingo, we have our constants.  

Luckily ours are at the bottom of the file :-) for easier blog documentation.  Diving deeper, we have now confirmed that 2E maintains a surrogate reference for each unique CON value and if we refer back to the YMSGACTRFP image above, these are all highlighted in the GREEN section.

So, all that is left for us to do is to identify the CON values we want to analyse and work out which functions they are linked to via the AD code, and some pretty cool impact analysis is before us.

Here is a sample SQL retrieving the many different usages of the term 'constant'.  As CON is only ever associated with field @@SUB3, the query is quite straightforward.

Ensure you have the correct library list first.

SELECT DISTINCT b.@@MSG, c.OBJNME, b.@@SUB3, a.CON, c.OBJOWN
FROM YCONDTARFP a
LEFT JOIN YMSGACTRFP b ON a.@@CON = b.@@SUB3
LEFT JOIN YMDLOBJRFP c ON b.@@MSG = c.@@OBJ
WHERE UPPER(a.CON) LIKE '%CONSTANT%' AND c.CUROBJ = 'Y'

This query returns unique records for each of the functions using CON context with the word 'constant'; the UPPER() function ensures we don't miss any based on case differences.  The rest of the query does a basic join to return some additional fields from YMDLOBJRFP (this is the *ALLOBJ or *A model list).  You can add to this query or tidy up names and formatting however you like.
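
If you want a feel for the scale of the problem rather than the individual usages, a small variation over the same files (a sketch, with the same library list caveat) ranks each function by how many distinct CON values it uses:

SELECT c.OBJNME, COUNT(DISTINCT a.CON) AS CON_COUNT
FROM YCONDTARFP a
LEFT JOIN YMSGACTRFP b ON a.@@CON = b.@@SUB3
LEFT JOIN YMDLOBJRFP c ON b.@@MSG = c.@@OBJ
WHERE c.CUROBJ = 'Y'
GROUP BY c.OBJNME
ORDER BY CON_COUNT DESC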


I hope that this helps someone.

Thanks for reading.
Lee.


Wednesday, March 6, 2024

Exploring 2E messages via SQL (Update)

Last year I made a small post about querying the underlying 2E files for messages.  This use case came up at my current contract today, and a small improvement can be made.

Here is the original post for some more context.

Lee Dare - Software Development Principles, Plex/2e and everything else in between.: Exploring 2E messages via SQL (leedare-plex2e.blogspot.com)

In addition to the above, I recommend improving the SQL syntax by catering for 'Mixed Case'.

SELECT a.@@MSG, a.MSG, a.TYPOPT, b.SECLVL                     
FROM YMSGDTARFP a                                             
LEFT JOIN YMSGSECRFP b ON a.@@MSG = b.@@MSG                   
WHERE UPPER(a.MSG) like '%KD33%' OR                           
UPPER(a.TYPOPT) LIKE '%KD33%' OR UPPER(b.SECLVL) LIKE '%KD33%'


The actual query above is checking for the message name in the 2E model, the default message text and the second level text associated with the message.
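
One further tweak worth considering (a sketch; 'kd33' is just an example search string) is to UPPER() both sides of each comparison, so the search string itself can be typed in any case:

SELECT a.@@MSG, a.MSG, a.TYPOPT, b.SECLVL
FROM YMSGDTARFP a
LEFT JOIN YMSGSECRFP b ON a.@@MSG = b.@@MSG
WHERE UPPER(a.MSG) LIKE UPPER('%kd33%')
OR UPPER(a.TYPOPT) LIKE UPPER('%kd33%')
OR UPPER(b.SECLVL) LIKE UPPER('%kd33%')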

Thanks for reading and as always, interested in any feedback.
Lee.

Saturday, December 2, 2023

It's (ELeM)entary, My Dear Watson

I've used this trick a few times so I thought I should share it on the blog before I get run over by a bus.

Often we (developers) are asked to produce or consume data files for integrations with 3rd party systems.  Even with the emergence of web services, there are still many, many times where the preferred method of interchange is file based; typically these are bulk (high volume) transactional exchanges.  Think payments between banks, etc.

For the systems I've worked on we have solved this with .csv delimited inbound and outbound files.  We have also utilised fixed width files, the majority having old school multi-format layouts with a traditional header, detail and footer structure which has some advantages over .csv.

For now, I will also ignore that we could have also provided XML or JSON payloads as part of a real-time integration. 

As you can see, there are numerous ways to skin this cat. However, this is a 2E and Plex blog so for today, let us concentrate on:-

  • How you might build and consume files in a 2E based application.
  • How we can create a multi-format flat file using pure Synon logic.

Let's fire up a 5250 session and explore an unsung feature in 2E that helps remove the complexity of building these flat files.

What are we going to build? 


An extract file with a header record, 10 detail transactional records and a footer which denotes EOF as well as carrying a total for basic validation.  In the real world this might also include hash totals etc.

An example of what this file data might look like is below.


Back in 2E, first define a file with one Known By (Numeric 9.0), which should suffice for most integrations, and a second field (Has relation) of 'Flatfile Data' - Type TXT and length of 500 (or whatever length works for your environment).  RP4 has better field limits than RPG.

I've called my file FIXFILP.




Now we need to build the data record.  Usually you would start concatenating the data values together to meet the design specification of the receiving system and handle those pesky FILLER57 fields etc. that always appear from somewhere.

This involves dozens (if not hundreds) of lines of code to concatenate the data.  The resulting action diagram is often difficult to understand and cumbersome to maintain.

What if there was an easier way to build up the data, with the ability to easily reorder the fields and cater for changing field lengths?  Well there is, using a neat unsung feature of 2E arrays.

2E keeps track of the last record to be added, changed or read in an array, a sort of cursor I guess.  This is available in action diagram logic, hidden in the ELM (ELeMent) context.

The only 'built in' function that can use the ELM context is a *CVTVAR.  

First create an array with the data fields you would like to appear in the dataset; this can be header, detail 1, detail 2, footer etc.  It doesn't really matter for a flat file process.  To keep it nice and simple, I have made up some generic field names with various data types.







I've keyed these arrays based on the Record Type of HDR, DTL and FTR.  You can do whatever best suits your use case.  All the arrays are set with a 'Number of Elements' of 1 record as I don't need to use them in the traditional sense.  I just need a pointer to an ELM in memory.



All we then do is call a CRTOBJ over the array to populate the data.  Once in the array, we can use a *CVTVAR to populate a flat file field.  2E handles all the different data types and spits out a well-formatted string which you can write to the database/extract file etc.
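
Roughly, the outbound steps look like this (a pseudo action diagram sketch; the array and label names are mine, not from a real model):

CRTOBJ 'HDR array'    - populate the header fields, key value 'HDR'
*CVTVAR               - WRK.'Flatfile Data' = ELM.'HDR array'
CRTOBJ 'FIXFILP'      - write WRK.'Flatfile Data' with the next sequence number

Repeat the same three steps with the DTL array for each transaction, then once more with the FTR array.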



But we are not done.  I've read other blogs that talk about ELM and they do a pretty good job of explaining the outbound example above.  But not many people realise that using the ELM context as input or output is the equivalent of constructing or deconstructing the data.  So yes, this method can be used to unpack flat files also. :-)

As long as you have created a shell array record in the receiving function, you can use ELM to move the data into the array and then standard RTVOBJ functionality to retrieve the record in its deconstructed form.
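
The inbound direction is simply the mirror image (again a sketch with invented names):

CRTOBJ 'DTL array'    - create the shell record, key value 'DTL'
*CVTVAR               - ELM.'DTL array' = WRK.'Flatfile Data'
RTVOBJ 'DTL array'    - read the record back with the individual fields now populated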




Below is an example of a couple of screens that I rushed together, showing the data string and the subsequent fields.



You simply point the flatfile string to the correct array and 2E will handle everything else.

Thanks for reading.
Lee.


Friday, September 29, 2023

This doesn't get old.



More than 30 years after first seeing this, I appreciate it more and more.  It's like a fine wine maturing quietly in a cellar.

Lee.

Friday, June 30, 2023

Exploring 2E messages via SQL

Bitesized post today.

I needed to query the model to see if a particular function was called in second level message text.  I also wanted to see if it was referenced in the message name or the message text just to make sure I tracked down all usages.

The simple solution was to join the two files from the 2E model and do a little check to see if my string (PGM in this instance) was mentioned.

SELECT a.@@MSG, a.MSG, a.TYPOPT, b.SECLVL
FROM YMSGDTARFP a
LEFT JOIN YMSGSECRFP b ON a.@@MSG = b.@@MSG
WHERE a.MSG LIKE '%MYSTRING%'
OR a.TYPOPT LIKE '%MYSTRING%'
OR b.SECLVL LIKE '%MYSTRING%'

For more on the underlying files in 2E, take a look at these posts from many years ago.

https://leedare-plex2e.blogspot.com/search/label/model%20files

Thanks for reading.
Lee.

p.s. Remember to have your library list pointing to the correct model!

Thursday, May 25, 2023

Object Usage via SQL

Continuing the series on using IBM i SQL for some basic work management tasks.  Today I had a large list of objects for which I wanted to know the 'last used' date and the 'days used' count, in order to determine if:-

1. The objects could be removed or flagged as obsolete.
2. The development/testing could be prioritised based on high-volume activity.

Usually, I would use the Work with Object (WRKOBJ) command and work through them one by one, capturing the data in Excel or Notepad manually.  Another option is to build a list of objects on the system using the OUTFILE keyword of the DSPOBJD command and then query these via SQL or Query/400, or simply by using YWRKF, our trusted friend in the 2E world.

Today however, I had a list of around 50 or so objects, and as we have separate development and production machines (the latter of which I have no access to, due to segregation of duties policies that I support, BTW), I felt it was unfair to ask a colleague to send me 50 screen prints, as well as this being prone to user error with so many commands to execute.

I could have documented the steps for the traditional method above, but this isn't really repeatable for my colleagues, nor is it enjoyable.  Therefore, I decided to write an SQL (or four, as it turns out) to get me the data and leave us with a template for the future.  There is a bonus fifth one for us 2Ers.

There is a bump in the road though.  Isn't there always aye!

Even though IBM have made huge strides with the availability of data via SQL in recent releases, a simple view for OBJECT_STATISTICS does not exist.  There is a table function which will get you the data, but it is predicated on obtaining data for an entire library or a subset based on object type.

Here is an example straight from the IBM documentation.
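
From memory it is along these lines ('MYLIB' being a placeholder), so treat the exact form as an approximation:

SELECT * FROM TABLE (QSYS2.OBJECT_STATISTICS('MYLIB', '*ALL')) AS X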


When I applied this to my test library, I didn't want the timestamp, just the date, and I wanted a subset of the fields available.

This is quite a simple modification to the IBM example: we just do a substring to take the first 10 characters and give the field a nice name.  Note also the replacement of the asterisk (*) in the SELECT with the exact fields.

SELECT OBJNAME, DAYS_00001,
SUBSTR(VARCHAR(LAST_00001), 1, 10) AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL'))


Unfortunately, I hit a hurdle when trying to view the resulting data via YWRKF.  The 2E programs raised an error condition and threw a hissy fit as I had some null dates represented as '-' in the returned data and not the ISO format (0001-01-01) it was expecting.

A quick google later and a minor adjustment to the SQL to present null as an empty ISO date, and I was now able to view the data.  FYI, Query/400 was fine with the data, so this is an extra step for loyal 2E users/shops like me/us.  There are also other date conversion routines readily available; I just chose this method for now.

SELECT OBJNAME, DAYS_00001,
IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL'))


This is all well and good, but I also wanted to restrict the SQL result to the objects that were of interest.  I achieved this by adding a WHERE clause to filter down to just those objects.

SELECT OBJNAME, DAYS_00001,
IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
WHERE a.OBJNAME IN
('MYOBJ01', 'MYOBJ02', 'MYOBJ03', 'MYOBJ04', 'MYOBJ05',
 'MYOBJ06', 'MYOBJ07', 'MYOBJ08', 'MYOBJ09', 'MYOBJ10')


Whilst this is great, it is only a view on the screen.  I didn't want my colleague to have to take screen prints or scrape the screen with tedious cut and paste to build an Excel file for me, so the final step was to wrap the query in a CREATE TABLE so the results land in a file.

CREATE TABLE QTEMP/OBJ_USAGE AS(
SELECT OBJNAME, DAYS_00001,
IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10),'0001-01-01') AS LAST_USED
FROM table
(object_statistics('MYLIB', '*ALL')) a WHERE a.OBJNAME in
('MYOBJ01', 'MYOBJ02', 'MYOBJ03', 'MYOBJ04', 'MYOBJ05', 'MYOBJ06', 'MYOBJ07', 'MYOBJ08', 'MYOBJ09', 'MYOBJ10')

) WITH DATA


This was great for my purposes: I sent the SQL to a colleague, and they were able to run it and send me a file back.  Job done.
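
As an aside, if you need the result in an Excel-friendly form, one option (an assumption on my part, not what we actually did; the path and exact parameters will vary by release) is to copy the work file to a CSV in the IFS:

CPYTOIMPF FROMFILE(QTEMP/OBJ_USAGE) TOSTMF('/home/myuser/obj_usage.csv') MBROPT(*REPLACE) RCDDLM(*CRLF)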

Hold Tight!! 

This is a 2E blog (mainly) and I have only mentioned YWRKF.  You can do better than that I hear you cry.

The above list of objects can easily be expanded or substituted with a list from a file which you could build somehow.  What if I linked this to a Model List?  That would be cool, right?

Whilst this isn't an option for my site due to having separate machines, it might work for you.  If you are like us, you may have to send over some data periodically from production and work on the queries from a different angle.  Anyhow, assuming you have the model lists and objects you wish to query on the same machine, you can embed your SQL in RPG, CL or as a SQL source member and run it with RUNSQLSTM etc.

You just need to link the Model list with your library of objects.  See below for one method.

We create an alias of the member we wish to use as our list.

CREATE OR REPLACE ALIAS QTEMP/OBJ_USAGE FOR MYMODEL/YMDLLSTRFP(MYLIST)

Execute the query.  Optionally as above, you can output to a file if you wish.

SELECT a.OBJNAME, a.DAYS_00001, IFNULL(SUBSTR(VARCHAR(a.LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
LEFT JOIN QTEMP/OBJ_USAGE b ON a.OBJNAME = b.IMPNME
WHERE b.OBJTYP = 'FUN'

Lastly, tidy up after ourselves and drop the temporary ALIAS.

DROP ALIAS QTEMP/OBJ_USAGE


As always, I am sure that there are other ways of solving this problem and I would love to hear about them in the comments.  This is my current 'new method' and will likely change as I notice more and more flaws or need to expand the scope.

Thanks for reading.
Lee.

Thursday, April 13, 2023

A few more little SQL's for IBM i

QSYS2 is a library that has lots of system data readily available.  More and more data that has traditionally been the reserve of a particular command or API is now available for query.  These views have been around for years and IBM are adding more with every release.

Each SQL view has a long name for easier SQL readability, as well as a standard IBM short object name for use in tools like Query/400 rather than SQL statements.  (See the table at the end of this post.)

For example, the file for querying data areas on the system is 'DATA_AREA_INFO' in SQL parlance but 'DTAARA_INF' when referred to directly as an object.



Let's take a look at some samples of the kind of data you can mine on your IBM i.


Who is taking up spool file space?

SELECT USER_NAME, SUM(SIZE) AS TOTAL_SPOOL_SPACE  
  FROM QSYS2.OUTPUT_QUEUE_ENTRIES_BASIC           
  GROUP BY USER_NAME                              
  ORDER BY TOTAL_SPOOL_SPACE DESC LIMIT 25        

Do we have very large spool files that can be purged?

SELECT * FROM QSYS2.OUTPUT_QUEUE_ENTRIES_BASIC ORDER BY SIZE DESC
FETCH FIRST 100 ROWS ONLY                                        

Do we have some really old ones?

SELECT * FROM QSYS2.OUTPUT_QUEUE_ENTRIES_BASIC
ORDER BY CREATE_TIMESTAMP                     
FETCH FIRST 100 ROWS ONLY                     

What data areas (*DTAARA) are there in a given library, and what are their values?

SELECT DATA_AREA_NAME, DATA_AREA_VALUE
FROM QSYS2.DATA_AREA_INFO
WHERE DATA_AREA_LIBRARY = 'YOUR_LIB'

What about the current library list?

SELECT DATA_AREA_LIBRARY, DATA_AREA_NAME, DATA_AREA_VALUE
FROM QSYS2.DATA_AREA_INFO                                
WHERE DATA_AREA_LIBRARY                                  
IN (SELECT SCHEMA_NAME FROM QSYS2.LIBRARY_LIST_INFO)     

I'm only interested in the user portion!

SELECT DATA_AREA_LIBRARY, DATA_AREA_NAME, DATA_AREA_VALUE
FROM QSYS2.DATA_AREA_INFO                                
WHERE DATA_AREA_LIBRARY                                  
IN (SELECT SCHEMA_NAME FROM QSYS2.LIBRARY_LIST_INFO      
    WHERE TYPE = 'USER')                                 

Want to know the *JOBDs that use a particular library?

SELECT JOB_DESCRIPTION_LIBRARY, JOB_DESCRIPTION, LIBRARY_LIST
  FROM QSYS2.JOB_DESCRIPTION_INFO                            
  WHERE LIBRARY_LIST LIKE '%YOUR_LIB%'                      

Profiles with *ALLOBJ that are *ENABLED?

SELECT * FROM QSYS2.USER_INFO               
  WHERE SPECIAL_AUTHORITIES LIKE '%*ALLOBJ%'
  AND STATUS = '*ENABLED' ORDER BY PREVIOUS_SIGNON

Stagnant user profiles?  This one helps with your security compliance.

SELECT * FROM QSYS2.USER_INFO                      
WHERE STATUS = '*ENABLED' ORDER BY PREVIOUS_SIGNON 

To explore for yourselves, take a look at the IBM documentation for these SQL services; the above is really the tip of the iceberg when it comes to managing your system.

Or look for the files in the QSYS2 library and google the documentation.


Worst case you have the tried and tested WRKOBJ discovery method.

WRKOBJ QSYS2/*ALL *FILE

Thanks for reading.
Lee.

This is a summary of the files available for SQL.  We've only just scratched the surface with these.  We are only limited by our imagination as to what tasks we can automate, improve etc.

SQL                         Object
ASP_INFO                 ASP_INFO
ASP_JOB_INFO                 ASPJ_INFO
ASP_VARY_INFO                 VARY_INFO
AUTHORITY_COLLECTION         AUTH_COL
AUTHORITY_COLLECTION_DLO AUTH_COLDL
AUTHORITY_COLLECTION_FSOBJ AUTH_COLFS
AUTHORITY_COLLECTION_LIBRARIES AUTH_COLLI
AUTHORITY_COLLECTION_OBJECT AUTH_COLOB
AUTHORIZATION_LIST_INFO         AUTHL_INFO
AUTHORIZATION_LIST_USER_INFO AUTL_USERS
AUTOSTART_JOB_INFO         AUTOJ_INFO
BINDING_DIRECTORY_INFO         BNDDIR_INF
BOUND_MODULE_INFO         MODULE_INF
BOUND_SRVPGM_INFO         SRVPGM_INF
COLLECTION_SERVICES_INFO CS_INFO
COMMAND_INFO                 CMD_INFO
COMMUNICATIONS_ENTRY_INFO COMM_INFO
DATA_AREA_INFO                 DTAARA_INF
DATA_QUEUE_INFO                 DTAQ_INFO
DATABASE_MONITOR_INFO         DBMON_INFO
DB_TRANSACTION_INFO         TRANS_INFO
DRDA_AUTHENTICATION_ENTRY_INFO DRDA_AUTHE
ELECTRONIC_SERVICE_AGENT_INFO ESA_INFO
ENVIRONMENT_VARIABLE_INFO ENV_VARS
EXIT_POINT_INFO                 EXIT_POINT
EXIT_PROGRAM_INFO         EXIT_PGM
FUNCTION_INFO                 FCN_INFO
FUNCTION_USAGE                 FCN_USAGE
HARDWARE_RESOURCE_INFO         HW_INFO
HTTP_SERVER_INFO         HTTP_SRVR
JOB_DESCRIPTION_INFO         JOBD_INFO
JOB_QUEUE_INFO                 JOBQ_INFO
JOURNAL_INFO                 JRNINFO
JOURNAL_INHERIT_RULES         LIB_JRN
JOURNAL_RECEIVER_INFO         JRNRCV_INF
JOURNALED_OBJECTS         JRN_OBJS
LIBRARY_LIST_INFO         LIBLIST
LICENSE_INFO                 LIC_INFO
LOCKING_POLICY_INFO         LOCK_POL
MEDIA_LIBRARY_INFO         MEDIA_INFO
MESSAGE_FILE_DATA         MSGF_DATA
MESSAGE_QUEUE_INFO         MSGQ_INFO
NETSTAT_INFO                 NS_INFO
NETSTAT_INTERFACE_INFO         NS_INTER
NETSTAT_JOB_INFO         NS_JOB
NETSTAT_ROUTE_INFO         NS_ROUTE
NVME_INFO                 NVME_INFO
OBJECTCONNECT_INFO         OBJC_INFO
OBJECT_LOCK_INFO         OBJ_LOCK
OBJECT_OWNERSHIP         OBJ_OWN
OBJECT_PRIVILEGES         OBJ_PRIV
OUTPUT_QUEUE_ENTRIES         OUTQ_INFO
OUTPUT_QUEUE_ENTRIES_BASIC OUTQ_INFOB
OUTPUT_QUEUE_INFO         OUTQ_DTL
PRESTART_JOB_INFO         PREJ_INFO
PROGRAM_EXPORT_IMPORT_INFO EXPIMP_INF
PROGRAM_INFO                 PGM_INFO
QUERY_SUPERVISOR         QRY_SUPER
                        QRY_SUPERI
                        QRY_SUPERT
                        QRY_SUPERX
RECORD_LOCK_INFO         RCD_LOCK
REMOTE_JOURNAL_INFO         RMT_JRNS
REPLY_LIST_INFO                 REPLYLIST
ROUTING_ENTRY_INFO         RTG_INFO
SCHEDULED_JOB_INFO         SCHED_JOB
SECURITY_INFO                 SEC_INFO
SERVER_SBS_CONFIGURATION SERVER_CFG
SERVER_SBS_ROUTING         SRVR_RTG
SERVER_SHARE_INFO         SHARE_INFO
SERVICES_INFO                 SERV_INFO
SOFTWARE_PRODUCT_INFO         SFW_PROD
SUBSYSTEM_INFO                 SBS_INFO
SUBSYSTEM_POOL_INFO         SBS_POOL
SYSLIMITS                 SYSLIMITS
SYSLIMITS_BASIC                 SYSLIMIT_B
SYSPACKAGESTMTSTAT         PKGSTMSTAT
SYSPROGRAMSTMTSTAT         PGMSTMSTAT
SYSTEM_STATUS_INFO         SYS_STATUS
SYSTEM_STATUS_INFO_BASIC SYS_STAT_B
SYSTEM_VALUE_INFO         SYSVALINFO
TELNET_SERVER_ATTRIBUTES TN_ATTR
TIME_PROTOCOL_INFO         TIME_PROTO
TRACKED_JOB_QUEUES         TRACKED_JQ
USER_INDEX_INFO                 USRIDX_INF
USER_INFO                 USER_INFO
USER_INFO_BASIC                 USER_INFOB
USER_SPACE_INFO                 USRSPC_INF
USER_STORAGE                 USER_STG
WATCH_INFO                 WATCH_INFO
WORKLOAD_GROUP_INFO         WLG_INFO

Additional SYS files that might come in handy.  Again, lots and lots of information readily available.

SYSCAT1
SYSCAT2
SYSCHARSET
SYSCHKCST
SYSCHRSET1
SYSCHRSET2
SYSCOLAUTH
SYSCOLUMNS
SYSCOLUMN2
SYSCOL2SSN
SYSCONTROL
SYSCST
SYSCSTAT
SYSCSTCOL
SYSCSTDEP
SYSCTRLDEP
SYSDISKS
SYSFEATURE
SYSFIELDS
SYSFILES
SYSFUNCS
SYSHIST
SYSINDEXES
SYSIXADV
SYSIXADVIX
SYSIXSTAT
SYSJARCONT
SYSJAROBJ
SYSKEYCST
SYSKEYS
SYSLANGS
SYSLIMPO_A
SYSLIMPO_B
SYSLIMTBL
SYSMQTSTAT
SYSPACKAGE
SYSPARMS
SYSPDISK
SYSPERIODS
SYSPGSTAT
SYSPIDISK
SYSPINDEX
SYSPISTAT
SYSPKAUTH
SYSPKSTAT
SYSPMQT
SYSPROCS
SYSPSTAT
SYSREFCST
SYSROUTDEP
SYSROUTINE
SYSROUTINP
SYSRTNAUTH
SYSRTNDEP
SYSSCHAUTH
SYSSCHEMAS
SYSSEQ
SYSSEQAUTH
SYSSEQOBJ
SYSTABAUTH
SYSTABDEP
SYSTABLES
SYSTISTAT
SYSTMPSTG
SYSTRIGCOL
SYSTRIGDEP
SYSTRIGGER
SYSTRIGUPD
SYSTSTAT
SYSTXTCLMN
SYSTXTCNFG
SYSTXTCOLI
SYSTXTDFLT
SYSTXTINDI
SYSTXTINDX
SYSTXTLBLS
SYSTXTSRHS
SYSTXTSRVR
SYSTYPES
SYSUDTAUTH
SYSVARAUTH
SYSVARDEP
SYSVARS
SYSVIEWDEP
SYSVIEWS
SYSXSRAUTH

Monday, September 26, 2022

Reading my Journal

All,

Ever had a situation where data was being erroneously written, or simply disappeared, and you had no idea why?  For anyone with a journaled database on the IBM i this will be an all too familiar issue.  A commitment control boundary may undo your updates, or the data simply doesn't look right as it is being updated elsewhere in your call stack, far away from the code area you are presently maintaining.

When this happens to me I usually like to review the entries in the journal and analyse the database I/O data to get a feel for what is happening.

So how do I go about doing this.....

Firstly I would identify the journal that is being used.  This may change from time to time so it is best to check each and every time you perform these steps.  Do a WRKOBJ for the library and main file you are trying to track entries for.

WRKOBJ OBJ(DATALIB/TARGETFILE)

Take option 8=Display Description and page down three times, or do a DSPFD for the given file and search for the journal name in the output.



Now you know the journal.  Perform the action(s) you wish to monitor for and note the times (start and finish).

You can then extract this data from the journal using the DSPJRN command.  Below is an example of isolating the entries in a particular environment.  I have gone for all files in library DATALIB in this example, but there are plenty of filtering options.  I then chose to output the resultant data into a file so that I could query it more easily.

DSPJRN JRN(TB_JRN/JRN0103000) FILE((DATALIB/*ALL)) FROMTIME(280222 074200) TOTIME(280222 074300) OUTPUT(*OUTFILE) OUTFILE(LDARE/LD_JOURNAL)

The OUTFILE keyword is optional but will help if you want to perform some queries over a saved dataset.

Please note that the date and time parameters are particularly important as this can take a while to extract otherwise.  Just this small sample took a few seconds interactively.  If querying for more, you might want to consider submitting the task.

The resultant data file is formatted as follows:-

 


The main item to look for is the 'Entry Type'.  These codes identify when data is inserted, updated or deleted from the database table as well as if it is rolled back.

The table below lists the main ones that I refer to.  All of these are for Journal Code 'R' (Operation on Specific Record).  For a full list of the items covered by journaling, see this link:

https://www.ibm.com/docs/en/i/7.2?topic=information-journal-code-descriptions

 

Entry Code  Description
BR          Before-image of record updated for rollback operation
DR          Record deleted for rollback operation
PT          Record added to a physical file member. If the file is set up to reuse deleted records, then you may receive either a PT or PX journal entry for the change
PX          Record added directly by RRN (relative record number) to a physical file member. If the file is set up to reuse deleted records, then you may receive either a PT or PX journal entry for the change
UB          Before-image of a record that is updated in the physical file member (this entry is present only if IMAGES(*BOTH) is specified on the STRJRNPF command)
UP          After-image of a record that is updated in the physical file member
UR          After-image of a record that is updated for rollback information
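
Once the entries are in the outfile, a quick SQL will isolate the record-level operations.  A sketch below; the column names (JOCODE, JOENTT, JOSEQN, etc.) come from the standard DSPJRN *TYPE1 outfile format, so verify them on your release:

SELECT JOSEQN, JOENTT, JOLIB, JOOBJ, JOMBR, JOJOB, JOUSER, JOPGM
FROM LDARE/LD_JOURNAL
WHERE JOCODE = 'R'
ORDER BY JOSEQN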

 For a full list of journaling codes you can use this link within the IBM i documentation.

 https://www.ibm.com/docs/en/i/7.2?topic=information-all-journal-entries-by-code-type