Showing posts with label SQL.

Wednesday, March 6, 2024

Exploring 2E messages via SQL (Update)

Last year I wrote a small post about querying the underlying 2E files for messages. The same use case came up at my current contract today, and a small improvement can be made.

Here is the original post for some more context.

Lee Dare - Software Development Principles, Plex/2e and everything else in between.: Exploring 2E messages via SQL (leedare-plex2e.blogspot.com)

In addition to the above, I recommend improving the SQL syntax to cater for mixed case:

SELECT a.@@MSG, a.MSG, a.TYPOPT, b.SECLVL
FROM YMSGDTARFP a
LEFT JOIN YMSGSECRFP b ON a.@@MSG = b.@@MSG
WHERE UPPER(a.MSG) LIKE '%KD33%'
OR UPPER(a.TYPOPT) LIKE '%KD33%'
OR UPPER(b.SECLVL) LIKE '%KD33%'


The query above checks for the string in the message name in the 2E model, in the default message text, and in the second-level text associated with the message.

Thanks for reading and as always, interested in any feedback.
Lee.

Friday, June 30, 2023

Exploring 2E messages via SQL

Bite-sized post today.

I needed to query the model to see if a particular function was called in second level message text.  I also wanted to see if it was referenced in the message name or the message text just to make sure I tracked down all usages.

The simple solution was to join the two files from the 2E model and do a little check to see if my string (PGM in this instance) was mentioned.

SELECT a.@@MSG, a.MSG, a.TYPOPT, b.SECLVL
FROM YMSGDTARFP a
LEFT JOIN YMSGSECRFP b ON a.@@MSG = b.@@MSG
WHERE a.MSG LIKE '%MYSTRING%'
OR a.TYPOPT LIKE '%MYSTRING%'
OR b.SECLVL LIKE '%MYSTRING%'

For more on the underlying files in 2E, take a look at these posts from many years ago.

https://leedare-plex2e.blogspot.com/search/label/model%20files

Thanks for reading.
Lee.

p.s. Remember to have your library list pointing to the correct model!

Thursday, May 25, 2023

Object Usage via SQL

Continuing the series on using IBM i SQL for some basic work management tasks.  Today I had a large list of objects for which I wanted to know the 'last used' date and the 'days used' count, in order to determine whether :-

1. The objects could be removed or flagged as obsolete.
2. The development/testing effort should be prioritised based on high-volume activity.

Usually, I would use the Work with Objects (WRKOBJ) command and work through them one by one, capturing the data manually in Excel or Notepad.  Another option is to build a list of objects on the system using the OUTFILE keyword of the DSPOBJD command and then query that via SQL or Query/400, or simply by using YWRKF, our trusted friend in the 2E world.
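
For reference, the outfile route is one command plus a query. The library and file names below are just placeholders; ODOBNM, ODUCNT and ODUDAT are the object name, days used count and last used date columns of the DSPOBJD model outfile (QADSPOBJ).

DSPOBJD OBJ(MYLIB/*ALL) OBJTYPE(*ALL) OUTPUT(*OUTFILE) OUTFILE(QTEMP/OBJLIST)

SELECT ODOBNM, ODUCNT, ODUDAT
FROM QTEMP/OBJLIST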

Today, however, I had a list of around 50 objects, and as we have separate development and production machines (the latter I have no access to, due to segregation of duties policies that I support, BTW), I felt it was unfair to ask a colleague to send me 50 screen prints; with so many commands to execute it would also be prone to user error.

I could have documented the steps for the traditional method above, but that isn't really repeatable for my colleagues, nor is it enjoyable.  Therefore, I decided to write an SQL statement (or four, as it turns out) to get me the data and leave us with a template for the future.  There is a bonus fifth one for us 2E'ers.

There is a bump in the road though.  Isn't there always aye!

Even though IBM has made huge strides with the availability of data via SQL in recent releases, a simple view for OBJECT_STATISTICS does not exist.  There is a table function that will get you the data, but it is predicated on requesting data for an entire library, or a subset based on object type.

Here is an example straight from the IBM documentation.
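
In its simplest form it is just the table function fed with a library name and an object type filter, along these lines (MYLIB being a placeholder):

SELECT *
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) AS A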


When I applied this to my test library, I didn't want the full timestamp, just a date, and I only wanted a subset of the available fields.

This is quite a simple modification to the IBM example: we just do a substring to take the first 10 characters and give the field a nicer name.  Note also that the asterisk (*) in the SELECT has been replaced with the exact fields.

SELECT OBJNAME, DAYS_00001,
       SUBSTR(VARCHAR(LAST_00001), 1, 10) AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL'))


Unfortunately, I hit a hurdle when trying to view the resulting data via YWRKF.  The 2E programs raised an error condition and threw a hissy fit, as I had some null dates represented as '-' in the returned data rather than the ISO format (0001-01-01) they were expecting.

A quick Google later, and a minor adjustment to the SQL to present null as an empty ISO date, and I was able to view the data.  FYI, Query/400 was fine with the data, so this is an extra step for loyal 2E users/shops like me/us.  There are also other date conversion routines readily available; I just chose this method for now....

SELECT OBJNAME, DAYS_00001,
       IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL'))


This is all well and good, but I also wanted to restrict the result to just the objects of interest.  I achieved this by adding a WHERE clause to filter the list.

SELECT OBJNAME, DAYS_00001,
       IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
WHERE a.OBJNAME IN
('MYOBJ01', 'MYOBJ02', 'MYOBJ03', 'MYOBJ04', 'MYOBJ05',
 'MYOBJ06', 'MYOBJ07', 'MYOBJ08', 'MYOBJ09', 'MYOBJ10')


Whilst this is great, it is only a view on the screen.  I didn't want my colleague to have to take screen prints or scrape the screen with tedious cut and paste to build an Excel file for me.  Instead, I wrapped the query in a CREATE TABLE statement so that the results land in a file which can simply be sent over.

CREATE TABLE QTEMP/OBJ_USAGE AS (
SELECT OBJNAME, DAYS_00001,
       IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
WHERE a.OBJNAME IN
('MYOBJ01', 'MYOBJ02', 'MYOBJ03', 'MYOBJ04', 'MYOBJ05',
 'MYOBJ06', 'MYOBJ07', 'MYOBJ08', 'MYOBJ09', 'MYOBJ10')
) WITH DATA


This was great for my purposes: I sent the SQL to a colleague, they ran it and sent me a file back.  Job done.

Hold Tight!! 

This is a 2E blog (mainly) and so far I have only mentioned YWRKF.  "You can do better than that", I hear you cry.

The above list of objects can easily be expanded, or substituted with a list from a file that you could build somehow.  What if I linked this to a Model List?  That would be cool, right?

Whilst this isn't an option for my site due to having separate machines, it might work for you.  If you are like us, you may have to send over some data periodically from production and work on the queries from a different angle.  Anyhow, assuming you have the model lists and the objects you wish to query on the same machine, you can embed your SQL in RPG or CL, or put it in an SQL source member and run it with RUNSQLSTM, as sketched below.
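
With the statement sitting in a source member (library and member names here are placeholders), that is just:

RUNSQLSTM SRCFILE(MYLIB/QSQLSRC) SRCMBR(OBJUSAGE) COMMIT(*NONE) NAMING(*SYS)

NAMING(*SYS) is used because the statements in this post qualify files with the library/file slash style.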

You just need to link the Model list with your library of objects.  See below for one method.

We create an alias of the member we wish to use as our list.

CREATE OR REPLACE ALIAS QTEMP/OBJ_USAGE FOR MYMODEL/YMDLLSTRFP(MYLIST)

Execute the query.  Optionally, as above, you can output the results to a file if you wish.

SELECT OBJNAME, DAYS_00001,
       IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
LEFT JOIN QTEMP/OBJ_USAGE b ON a.OBJNAME = b.IMPNME
WHERE b.OBJTYP = 'FUN'

Lastly, tidy up after ourselves and drop the temporary ALIAS.

DROP ALIAS QTEMP/OBJ_USAGE


As always, I am sure that there are other ways of solving this problem and I would love to hear about them in the comments.  This is my current 'new method' and will likely change as I notice more and more flaws or need to expand the scope.

Thanks for reading.
Lee.

Thursday, January 20, 2022

SQL is your friend

Today I had a minor task to implement.  The task was to initialise some new data in a file.

Typically, I would write a program as an EXCEXTFUN and do my processing via standard 2E code.  The problem with little data update programs is that unless you manage them well (i.e. remove them, or label them so you can ignore them, etc.), they start to clutter the model.  On the other hand, they are handy to have around in case someone needs to do something similar in the future.

However, unless you are a software house, you probably don't care too much that this 'product field' was initialised 8 years ago by xyz user.

That all said and done, my task was to insert a data record.  There were some fields that needed to be initialised with the business data, and there was also an AUDIT STAMP for the INSERT.  If you don't mind me saying, it was quite a convoluted one at that, with numerous fields.  This AUDIT STAMP required a User, Date, Time, Job Number and Job Name, among other fields.  With 2E all of these are simple to MAP and are available to us via the JOB context.

My task was to implement the data without adding to the setup/fix program maintenance burden, and on this occasion we had a few choices.

  1. Write a document explaining to the implementer how I want them to add a record. Either by a YWRKF hack or perhaps a user screen.
  2. Give them a small data file and ask them to CPYF the records into the target file as *ADD.
  3. Write that 2E program I spoke of above and get them to call it.

Or, as I did in this instance.  

    4. Simply give them a line of SQL I wanted them to execute.

Sweet, job done.  But how do I initialise the 2E-formatted date, or get the current system date and time from the machine?  And where do I get the job details that are so easily available within 2E code via the JOB context?

The answer is as follows: SQL has what it calls global variables, as well as built-in functions, and it just so happens that some of these are exactly what I need to populate my fields in the AUDIT STAMP.

In order to initialise my fields I just substitute my insert data with the following:-

User Profile = USER
Date = DEC(CURDATE())-19000000
Time = DEC(CURTIME())
Job Number = LEFT(JOB_NAME,6)
Job Name = SUBSTRING(JOB_NAME, LOCATE('/',JOB_NAME,8)+1,10)
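
If you want to see what these return before using them in anger, a quick query over SYSIBM/SYSDUMMY1 will echo them back (the expressions are exactly those listed above):

SELECT USER,
       DEC(CURDATE()) - 19000000,
       DEC(CURTIME()),
       LEFT(JOB_NAME, 6),
       SUBSTRING(JOB_NAME, LOCATE('/', JOB_NAME, 8) + 1, 10)
FROM SYSIBM/SYSDUMMY1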

You will note that to get the 2E DTE format we simply subtract the 19000000 (e.g. 20220120 - 19000000 = 1220120).  For the standard TIME and USER fields we just use the variables without any manipulation.

For Job Number we need to extract the data from the JOB_NAME variable.  Usually this is formatted as 123456/LDARE/QPADEV0001, for example.  However, for people with longer names it could be 654321/PLONGNAME/EOD, and the job running the task may not be an interactive screen if it was submitted, for example.

This means that to get the job name data from the JOB_NAME SQL global variable we need to call a little function that locates and substrings all the data after the 2nd '/'.  Given that the job number is always 6 long, I've just arbitrarily started the search from the 8th character.  I could have nested more LOCATE calls to replace this hardcoded value, but as always, time was against me.
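
Putting it all together, a statement along these lines is all the implementer needs to run.  The file, field names and business values below are made up purely for illustration; the audit columns are populated straight from the expressions above.

INSERT INTO MYLIB/MYFILE
       (PRDCDE, PRDSTS,
        AUDUSR, AUDDTE, AUDTIM, AUDJBNBR, AUDJBNAM)
VALUES ('ABC123', 'A',
        USER,
        DEC(CURDATE()) - 19000000,
        DEC(CURTIME()),
        LEFT(JOB_NAME, 6),
        SUBSTRING(JOB_NAME, LOCATE('/', JOB_NAME, 8) + 1, 10))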

Hopefully, if in the future you need to update a table via SQL but were put off by not knowing how to get at some of these global variables, the above will stick in your mind.

Thanks for reading. 

Lee.