The Plex/2e Emporium by Lee Dare
A blog primarily devoted to CA 2e (Synon) and CA Plex. Serving the Plex2E community with knowledge sharing for nearly two decades.
Sunday, March 23, 2025
Google was hiding the site
Thursday, April 4, 2024
Did you know Synon 2E has a Full Screen Mode?

If I remember rightly, the command key used to bring up a window and you could choose between Basic (it might have been Intermediate) and Advanced mode. This has long since been removed from the o/s and the command key merely acts as a toggle nowadays. The reward, of course, was the ability to see more of your processes and interactive sessions.
Here is my current 'Pro' setup lol.

A lesser-known option, and one where it is a bit harder to remember all the settings, is the ability to extend the number of records shown when viewing a model list.
So how do you set these values?
From the main Services Menu take option 11 for Edit model profile (YEDTMDLPRF) or, as the menu implies, execute the command directly. Then set the options accordingly.
Sunday, March 17, 2024
They say a picture is worth a 1000 words!
Anyhow, a colleague was talking about tag clouds the other day and it got me thinking.
"What would this blogs labels look like in a tag cloud?"
So......... I popped online and googled for a tag cloud generator and discovered this site that looked pretty good.
https://simplewordcloud.com/ or you can click here: Simple Word Cloud Generator
I excitedly 'cut n pasted' it into the input field on the website and clicked the button.
Wednesday, March 6, 2024
Understanding CONstant usages in 2e.
As an aside, IMHO there should be next to no CON usages in a well-architected model apart from values like *BLANK, 0.00 and some tolerance towards values like 12 (months), 52 (weeks), 365 (days), 100 (percentage). I'd even accept 1, 2, 3, 9999999, etc.
I draw the line at meaningful literals that could have been scoped conditions (CND) or database values.
My main reason for this is that there isn't a way (via the 2E tool interface) to check for usages of CON values. Obviously, we can scan the generated source, but then what about all the genuine field names or comments?
Back to 2E! When generating the source, 2E somehow lets the generator know about them and adds them into the code (inline) or as part of a data structure. Take a look at the bottom of a source listing if you don't believe me.
Here is a sample function using some constants. Note the 'Create Flatfile' call is passing in 'FLAT FILE CoNsTANT' as CON.
Followed by the structure of CON values at the bottom. Note: Sometimes the CON is generated 'inline' at the point of the MOVE.

It looks like we have some constants referred to in field ELMTTL (Element Title - highlighted RED). This could be a good start, but there are some obvious limitations. Whilst it looks like it covers basic *MOVE and *CONCAT field assignments, when we have hardcoded values going into a function call (PURPLE), the CON value isn't reflected in the ELMTTL field.
However, it looks like 2E maintains a surrogate reference for each unique CON value used in the AD regardless of AD syntax used. (GREEN)
So, all that is left for us to do is to identify the CON values we want to analyse and work out what functions they are linked to via the AD code, and some pretty cool impact analysis is before us.
Here is a sample SQL retrieving the many different usages of the term 'constant'. As CON is only ever associated with field @@SUB3, the query is quite straightforward.
Ensure you have the correct library list first.
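The screenshot of the query isn't reproduced here, so the sketch below only shows the general shape. The action diagram detail file YADDTLRFP and its columns @@FUN and @@OBJ are placeholder names made up for illustration (swap them for the real AD file and surrogate columns in your model library), and the OBJNME and OBJTYP columns on the YMDLOBJRFP side are assumed too; @@SUB3, ELMTTL and the join to YMDLOBJRFP are as described in this post.
-- Sketch only: YADDTLRFP, @@FUN and @@OBJ are placeholders for the AD detail
-- file, its function surrogate and the model object surrogate; OBJNME and
-- OBJTYP are assumed column names. @@SUB3 and ELMTTL are as per the post.
SELECT DISTINCT b.OBJNME, b.OBJTYP, a.@@SUB3, a.ELMTTL
FROM YADDTLRFP a
LEFT JOIN YMDLOBJRFP b ON a.@@FUN = b.@@OBJ
WHERE UPPER(a.ELMTTL) LIKE '%CONSTANT%'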
This query returns a unique record for each of the functions using CON context with the word 'constant'; the UPPER() function ensures we don't miss any based on case differences. The rest of the query does a basic join to return some additional fields from YMDLOBJRFP (this is the *ALLOBJ or *A model list). You can add to this query or tidy up names and formatting however you like.
Thanks for reading.
Lee.
Tuesday, March 5, 2024
Exploring 2E messages via SQL (Update)
Here is the original post for some more context.
Lee Dare - Software Development Principles, Plex/2e and everything else in between.: Exploring 2E messages via SQL (leedare-plex2e.blogspot.com)
In addition to the above, I recommend improving the SQL syntax to cater for mixed case:
SELECT a.@@MSG, a.MSG, a.TYPOPT, b.SECLVL
FROM YMSGDTARFP a
LEFT JOIN YMSGSECRFP b ON a.@@MSG = b.@@MSG
WHERE UPPER(a.MSG) LIKE '%KD33%'
   OR UPPER(a.TYPOPT) LIKE '%KD33%'
   OR UPPER(b.SECLVL) LIKE '%KD33%'
The query above checks for the string in the message name in the 2E model, the default message text and the second level text associated with the message.
Thanks for reading and as always, interested in any feedback.
Saturday, December 2, 2023
It's (ELeM)entary, My Dear Watson
I've used this trick a few times so I thought I should share it on the blog before I get run over by a bus.
Often we (developers) are asked to produce or consume data files for integrations with 3rd party systems. Even with the emergence of web services, there are still many, many times where the preferred method of interchange is file based; typically these are bulk (high-volume) transactional exchanges. Think payments between banks, etc.
For the systems I've worked on we have solved this with .csv delimited inbound and outbound files. We have also utilised fixed-width files, the majority having old-school multi-format layouts with a traditional header, detail and footer structure, which has some advantages over .csv.
For now, I will also ignore that we could have also provided XML or JSON payloads as part of a real-time integration.
As you can see, there are numerous ways to skin this cat. However, this is a 2E and Plex blog, so for today let us concentrate on:
- How you might build and consume flat files in a 2E based application.
- How can we create a multi-format flat file using Synon logic only?
Let's fire up a 5250 session and explore an unsung feature in 2E that helps remove the complexity of building these flat files.
What are we going to build?
An extract file with a header record, 10 detailed transactional records and a footer which denotes EOF as well as holding a total for basic validation. In the real world this may include hash totals, etc.
An example of what this file data might look like is below.
Back in 2E, first define a file with one Known By (Numeric 9.0), which should suffice for most integrations, and a second field (Has relation) of 'Flatfile Data' - Type TXT and a length of 500 (or whatever length works for your environment). RP4 has better field length limits than RPG.
I've called my file FIXFILP.
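For anyone who finds DDL easier to read than 2E relations, a rough SQL equivalent of the table this modelling produces is sketched below. The column names are made up purely for illustration; in practice 2E generates its own DDS/DDL and field naming.
-- Rough equivalent only: real column names come from the 2E generator.
CREATE TABLE FIXFILP (
  FFSEQ  DECIMAL(9, 0) NOT NULL,   -- the Known By sequence number
  FFDATA CHAR(500)     NOT NULL,   -- the 'Flatfile Data' TXT field
  PRIMARY KEY (FFSEQ)
)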
Now we need to build the data record. Usually you would start concatenating the data values together to meet the design specification of the receiving system and handle those pesky FILLER57 fields that always appear from somewhere.
This involves dozens (if not hundreds) of lines of code to concatenate the data. The resulting action diagram is often difficult to understand and cumbersome to maintain.
What if there was an easier way to build up the data, with the ability to easily reorder the fields and cater for changing field lengths? Well, there is, using a neat unsung feature of 2E arrays.
2E keeps track of the last record to be added, changed or read in an array, a sort of cursor I guess. This is available in action diagram logic hidden in the ELM context. (ELeMent).
The only 'built in' function that can use the ELM context is a *CVTVAR.
First create an array with the data fields you would like to appear in the dataset; this can be header, detail 1, detail 2, footer, etc. It doesn't really matter for a flat file process. To keep it nice and simple I have made up some generic field names with various data types.
I've keyed these arrays based on the Record type of HDR, DTL and FTR. You can do whatever best suits your use case. All the arrays are set with a 'Number of Elements' of 1 record as I don't need to use them in the traditional sense. I just need a pointer to an ELM in memory.
All we then do is call a CRTOBJ over the array to populate the data. Once in the array, we can use the *CVTVAR to populate a flat file field. 2E handles all the different data types and spits out a well-formatted string which you can write to the database/extract file, etc.

But we are not done. I've read other blogs that talk about ELM and they do a pretty good job of explaining the outbound example above. But not many people realise that using the ELM context as input rather than output is the equivalent of deconstructing the data rather than constructing it. So yes, this method can be used to unpack flat files also. :-)
As long as you have created a shell array record in the receiving function, you can use ELM to move the data into the array and then use standard RTVOBJ functionality to retrieve the record in its deconstructed form.

Below is an example of a couple of screens that I rushed together, showing the data string and the subsequent fields.
Thanks for reading.
Lee.
Thursday, September 28, 2023
This doesn't get old.
Lee.
Friday, June 30, 2023
Exploring 2E messages via SQL
I needed to query the model to see if a particular function was called in second level message text. I also wanted to see if it was referenced in the message name or the message text just to make sure I tracked down all usages.
The simple solution was to join the two files from the 2E model and do a little check to see if my string (PGM in this instance) was mentioned.
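The screenshots from that post aren't reproduced here, but the query was essentially the mixed-case version shown in the update further up this page, minus the UPPER() calls; 'PGM' below simply stands in for whatever string you are hunting down.
SELECT a.@@MSG, a.MSG, a.TYPOPT, b.SECLVL
FROM YMSGDTARFP a
LEFT JOIN YMSGSECRFP b ON a.@@MSG = b.@@MSG
WHERE a.MSG LIKE '%PGM%'
   OR a.TYPOPT LIKE '%PGM%'
   OR b.SECLVL LIKE '%PGM%'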
For more on the underlying files in 2E, take a look at these posts from many years ago.
https://leedare-plex2e.blogspot.com/search/label/model%20files
Thanks for reading.
Lee.
p.s. Remember to have your library list pointing to the correct model!
Wednesday, May 24, 2023
Object Usage via SQL
1. The objects could be removed or flagged as obsolete.
Today, however, I had a list of around 50 or so objects, and as we have separate development and production machines (the latter of which I have no access to, due to segregation of duties policies that I support, BTW), I felt it was unfair to ask a colleague to send me 50 screen prints; it would also be prone to user error with so many commands to execute.
I could have documented the steps for the traditional method above, but this isn't really repeatable for my colleagues, nor is it enjoyable. Therefore, I decided to write an SQL (or four, as it turns out) to get me the data and leave us with a template for the future. There is a bonus 5th one for us 2E'ers.
There is a bump in the road though. Isn't there always aye!
Even though IBM have made huge strides with the availability of data via SQL in recent releases, a simple view for OBJECT_STATISTICS does not exist. There is a table function which will get you the data, but it is predicated on obtaining data for an entire library or a subset based on object type.
Here is an example straight from the IBM documentation.
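The screenshot of that example isn't reproduced here, but the basic invocation of the table function looks like this (MYLIB is just an illustrative library name):
-- Return every object in MYLIB along with its usage statistics.
SELECT *
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) AS a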
When I applied this to my test library, I didn't want the timestamp but a date only and I wanted a subset of the fields available.
This is quite a simple modification to the IBM example: we just do a substring to take the first 10 characters and give the field a nicer name. Note also the replacement of the asterisk (*) in the select with the exact fields.
SELECT OBJNAME, DAYS_00001,
       SUBSTR(VARCHAR(LAST_00001), 1, 10) AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
Unfortunately, I hit a hurdle when trying to view the resulting data via YWRKF. The 2E programs raised an error condition and threw a hissy fit as I had some null dates represented as '-' in the returned data and not the ISO format (0001-01-01) it was expecting.
A quick google later and a minor adjustment to the SQL to present null as an empty ISO date, and I was now able to view the data. FYI, Query/400 was fine with the data, so this is an extra step for loyal 2E users/shops like me/us. There are also other date conversion routines readily available; I just chose this method for now.
SELECT OBJNAME, DAYS_00001,
       IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
This is all well and good, but I also wanted to restrict this SQL result to the objects that were of interest. I achieved this by adding a WHERE clause to filter for those of interest.
SELECT OBJNAME, DAYS_00001,
       IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
WHERE a.OBJNAME IN ('MYOBJ01', 'MYOBJ02', 'MYOBJ03', 'MYOBJ04', 'MYOBJ05',
                    'MYOBJ06', 'MYOBJ07', 'MYOBJ08', 'MYOBJ09', 'MYOBJ10')
Whilst this is great, it is only a view on the screen. I didn't want my colleague to have to take screen prints or scrape the screen with tedious cut and paste to build an Excel file for me, so the next step was to wrap the query in a CREATE TABLE so the result could be saved and sent as a file.
CREATE TABLE QTEMP/OBJ_USAGE AS (
  SELECT OBJNAME, DAYS_00001,
         IFNULL(SUBSTR(VARCHAR(LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
  FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
  WHERE a.OBJNAME IN ('MYOBJ01', 'MYOBJ02', 'MYOBJ03', 'MYOBJ04', 'MYOBJ05',
                      'MYOBJ06', 'MYOBJ07', 'MYOBJ08', 'MYOBJ09', 'MYOBJ10')
) WITH DATA
This was great for my purposes: I sent the SQL to a colleague, and they were able to run it and send me a file back. Job done.
Hold Tight!!
This is a 2E blog (mainly) and I have only mentioned YWRKF. 'You can do better than that', I hear you cry.
The above list of objects can easily be expanded or substituted with a list from a file which you could build somehow. What if I linked this to a Model List? That would be cool, right?
Whilst this isn't an option for my site due to having separate machines, it might work for you. If you are like us, you may have to send over some data periodically from production and work on the queries from a different angle. Anyhow, assuming you have the model lists and objects you wish to query on the same machine, you can embed your SQL in RPG, CL or an SQL source member and run it with RUNSQLSTM, etc.
You just need to link the Model list with your library of objects. See below for one method.
We create an alias of the member we wish to use as our list.
CREATE OR REPLACE ALIAS QTEMP/OBJ_USAGE FOR MYMODEL/YMDLLSTRFP(MYLIST)
Execute the query. Optionally as above, you can output to a file if you wish.
SELECT a.OBJNAME, a.DAYS_00001,
       IFNULL(SUBSTR(VARCHAR(a.LAST_00001), 1, 10), '0001-01-01') AS LAST_USED
FROM TABLE (OBJECT_STATISTICS('MYLIB', '*ALL')) a
LEFT JOIN QTEMP/OBJ_USAGE b ON a.OBJNAME = b.IMPNME
WHERE b.OBJTYP = 'FUN'
Lastly, tidy up after ourselves and drop the temporary ALIAS.
DROP ALIAS QTEMP/OBJ_USAGE
As always, I am sure that there are other ways of solving this problem and I would love to hear about them in the comments. This is my current 'new method' and will likely change as I notice more and more flaws or need to expand the scope.
Thanks for reading.
Lee.
Wednesday, April 12, 2023
A few more little SQL's for IBM i
This is a summary of the files available for SQL. We've only just scratched the surface with these. We are only limited by our imagination with what tasks we can automate, improve, etc.
Thanks for reading.
Lee.