Showing posts with label Plex. Show all posts

Sunday, October 30, 2011

A little look at Plex-XML


Many of you may have read about the Plex-XML framework from Allabout GmbH.  It has been mentioned online (the LinkedIn Plex group or PlexWiki, for example).  The guys were at the conference in 2007 and caused quite a stir in the Plex community.  Well, the technology is still going strong.

I decided that I’d take a look.  This is the great thing about having a role focused on R&D and application architecture: you get to play with things.  This is, of course, allowed by my employer www.sasit.co.nz, who specialise in mission-critical systems hosting and development (harmless plug), so take a look at the web page.  The pressure comes when you need to back up your decisions, so interesting times.....

My investigation was a case of downloading the relevant stuff from the guys, as well as Eclipse (IDE), MySQL (database), the Java JRE (runtime), Sencha's ExtJS (JavaScript control library), Tomcat (web server) and some drivers etc.  A full list of requirements, and how to put this all together, is on the Plex-XML website (wiki).

So after some great support from the guys at Allabout I did end up with a fully working tutorial application.  

A screenshot from the tutorial.
But that’s cheating, isn't it?  All the hard work was done for me.  I needed to know how much heavy lifting is required in order to evaluate further.  So I decided to replicate this for one of my own entities, from ‘Plex entity entry’ right through to all the configuration files and framework parameters.

I chose a very basic entity with a numeric key, short description, date, long description (notes) and a status.  And no, I am not going to describe every step here, because:-

1. It’s not within the scope of this blog post.
2. It’s already described in the Plex-XML wiki.

So I followed the instructions once more.  I made quite a few small mistakes with my naming conventions etc., so be careful when doing these steps.  But I did manage to get it all working once I knew what to change and create.

My new entity :-)

What I liked was the out-of-the-box stuff.  I never coded the date picker or the "x of xxx characters remaining" behaviour.  It just happened!!! :-)
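Just to show what you would otherwise be hand-coding for every long-description field (this is purely my own sketch in Java; the class and method names are invented and have nothing to do with how Plex-XML does it internally), the "characters remaining" logic boils down to something like this:

```java
// Illustrative sketch only: the kind of counter logic that
// Plex-XML/ExtJS gives you for free. Names are my own invention.
public class CharCounter {

    // Builds the counter label for a text field with a maximum length.
    public static String remainingLabel(String text, int maxLength) {
        int used = (text == null) ? 0 : text.length();
        int remaining = Math.max(0, maxLength - used);
        return remaining + " of " + maxLength + " characters remaining";
    }

    public static void main(String[] args) {
        System.out.println(remainingLabel("A short note", 255));
    }
}
```

Trivial on its own, but multiply it by every date picker, counter and validator on every panel and you can see why getting it all "out of the box" matters.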

As I have already alluded to, the pain for me was the configuration files etc., so I am chuffed to bits to hear of another company from Germany, Te@mconsult, who have been promoting a utility to help with the Plex-XML framework.  Check out this link.  These two combined make all the difference, and I for one can’t wait until the public beta commences.

So, all in all, this was a great experience.  I strongly recommend that everyone takes a look at the many 3rd-party patterns that are available for CA Plex.  Sure, it’s great to be a developer and cut your own, but some of these frameworks and patterns from the major players have been on the go for a few years now, and that effort-vs-price equation (if applicable) really is a “no-brainer”.

Thanks for reading.
Lee.

Monday, October 17, 2011

Beginners Guide to the CA Plex Model API using C# .NET




Firstly, thank you to Rob Layzell for inspiring this blog.  If it wasn’t for his lab exercises I would have done this Model API exercise, and the subsequent blog, using CA Plex and component import.  Now, whilst it is probably a cool thing to use a code generation tool to interrogate and update the code generation tool's own repository (I am sure there must be a word in the OO world to describe this :-)), in the back of my mind I was also acutely aware that release 7.0 is entering beta testing and the much anticipated C#.NET (WPF) client will soon be upon us.  So there isn’t a better opportunity for me to sharpen the old grey matter with some learning than this.

I chose to continue with the C# coding for the simple reason that it would be worthwhile understanding what .NET developers need to do, as I will ultimately be looking to target .NET for my future applications.  It also gives me an opportunity to learn Visual Studio and some pidgin C#, and to understand the inner workings of this environment.  All worthwhile skills for when the 4GL needs a little investigation, tweaking or debugging :-).

We have had C# for server code generation since release 6.0.  Release 6.1 brought WCF to strengthen this offering.  With 7.0 we will have the client side resolved too with the promise of XAML and WPF.  The web service import is another tool that simplifies the integration and consumption of web services, so, all in all, good times ahead.

So back to Rob's unintended blog influence.  His tutorials for .NET used numerous examples of Visual Studio and C#.NET.  One in particular that took my fancy was Lab 10, which related to consuming the PlexAPI COM component and creating triples in the local model.  The Model API is described in the online help as being useful for pattern designers, allowing them to automate much of the configuration and setup for using their patterns.  I also see it as an opportunity to ensure that another level of automation is included in the tool.

Anyhow, seasoned CA Plex developers know we already have metacode to influence the generated code.  We also have patterns and inheritance to ensure consistent design or as I call it, conformity to the application architecture.  There are still of course plenty of areas with regard to modelling that a developer could easily get wrong when entering the triples.

In my last blog I alluded to a small entity creation tool that I am creating (UPDATE – this has blown into a full-blown tool now).  It is there to ensure that the fields and files that I enter into the application model conform to our standards, i.e. naming, inheritance, default field lengths, narrative being entered, labels etc.  As we are a new CA Plex shop (long-time 2E shop) I am still working on these standards and experimenting, but I will share the code and a sample model on request (note that it currently has a degree of hardcoding, but more than enough detail for people to get acquainted).
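To give a flavour of the sort of checks I mean (the prefix and length rules below are invented purely for illustration — they are not our actual standards, and this is a plain Java sketch, not the Model API itself), a standards validator reduces to something like:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: example modelling standards, not the
// real ones and not part of the CA Plex Model API.
public class EntityStandards {

    // Returns a list of standards violations; empty means the entity passes.
    public static List<String> validate(String prefix, String name, String narrative) {
        List<String> errors = new ArrayList<>();
        if (prefix == null || !prefix.matches("[A-Z]{2,3}")) {
            errors.add("Prefix must be 2-3 uppercase letters");
        }
        if (name == null || name.isEmpty() || name.length() > 25) {
            errors.add("Name must be 1-25 characters");
        }
        if (narrative == null || narrative.trim().isEmpty()) {
            errors.add("Narrative must be entered");
        }
        return errors;
    }

    public static void main(String[] args) {
        System.out.println(validate("ORD", "Order Header", "Stores order header details"));
        System.out.println(validate("order", "", null));
    }
}
```

The point of the Add-In is that checks like these run before any triples are created, so a bad name never makes it into the model in the first place.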

For those that are interested in taking this further and perhaps helping me improve the utility, just drop me your email in the comments section. :-)

For me the Model API is too quickly overlooked, but if you are serious about CA Plex and modelling then these are exactly the type of utilities (Add-Ins) that one should be considering.  They will further improve your team’s productivity, as they are pretty quick and easy to tailor for your environment once you get your head around the API and the underlying model structure.  Just take a look at the StellaTools from George.

I recommend that every CA Plex developer take some time to understand what can be achieved with this feature.  This is quite an in-depth subject, so I feel it will eventually become a four-part series.  So, if you are keen....

Part II – ‘Model API and the model repository’ will cover the underlying architecture of the model.

Part III – ‘Key Model API commands’ will go into detail about some of the core commands that you will need to understand in order to develop against the COM API, and will tie some of the theory covered in Part II together.

Part IV – ‘Some DotNet Tips and Tricks’ will show some programming tips and, I hope, will be enough to inspire a few of you to ‘RTFM’ with regard to the Model API.

All feedback is appreciated.  Until then.


Thanks for reading.
Lee.



Monday, October 3, 2011

RTFM

“RTFM!” (Read the effing manual) - that was the polite version.

That’s what I was told when I first started programming.  Not every time (obviously), but on the odd occasion when the question was one I had asked repeatedly, or when the more senior programmer didn't eat his oats in the morning.

To this day I still hear this from the more seasoned developers around me.  But I must say that this phrase is, I believe, in steep decline.  Could it be that my generation of developers may be the last to utter this immortal programming phrase?  You see (or should I say hear) “RTFM” being replaced more and more by “Just Google it.”

Today I was working with CA Plex 6.1’s Model API.  I was trying to automate a common task that I perform in my model, to ultimately cut down on keystrokes and mistakes and, most importantly, ensure ‘THAT I’ conform to the standards ‘THAT I’ have decided on, and (hopefully) along the way help the other developers in the team. :-)

I was working from some excellent examples from CA regarding the Model API and thought I’d try a small example using C# WinForms and Visual Studio 2008.  All I wanted to do was create a new entity with appropriate naming standards, fields and keys, all with correct inheritance from our patterns etc.  So, breaking the problem domain down into small manageable chunks (isn’t that now known as an agile sprint?), I decided to get the Entity Prefix (I like these), Entity Name and Entity Inheritance created first.  The rest will come as the Add-In matures and I implement all the ideas and wizardry on the roadmap.

As I have said, I am creating this pioneering utility using the Model API (version 3.0) in Plex 6.1 and I will deploy the final program as an Add-In.  I have also used this as an excuse to brush up a little on Visual Studio and C#, as we are going to be heavily reliant on this IDE once Plex 7.0 comes out and we are all doing DotNet stuff galore.

I will blog a little more in the future about how I did this (it is only basic and I am still learning), but I have to say that the Model API 3.0 is quite powerful; just ask George Jeffcock about his Stella Tools.

Here is a sneak peek at the current screen. 



As you can see it is about as simple as one can get but, without much knowledge of C#, I quickly became stuck, stranded, frustrated, challenged and peeeeeeed off.  Anyhow, forget asking Jeeves or posting on some technical forum.  For older scholars, picking up the manual or downloading that e-book PDF you had been considering is also a waste of time and effort.

If you want to code nowadays, it appears you just “Google it”. 

I am not sure whether this is:
1. A 4GL for C# coding
2. A cheat sheet
3. Or simply good common sense to utilise a million experts rather than one book

.......BUT.......

I got the answers I needed pretty quickly and was able to continue with the pilot project in earnest.

I have a great mental roadmap for the utility and only time will tell how it matures.  But for all us elder statesmen of the Plex community, I'll borrow terminology from one of Ramon Chen’s key marketing phrases from the early days of Obsydian.

“Stop coding, start Googling.”

Thanks for reading. 
Lee.

Sunday, December 12, 2010

Retro look back part II

Thanks to John Rhodes for the image of an old Obsydian diskette from the 1.0.3 release.  We are now at 6.1, with WCF, EJB, Java and .NET generation all added since.  It really does show that MDD (Model Driven Development) can stand the test of time....

Personally, I see a marked 'upswell' in the desirability of code generation tools, or at the least in the concept of not writing all the CRUD stuff over and over again.

Time will tell.

Thanks for reading.
Lee.

p.s. Apologies for not keeping the blog up to date.  I have been working full-time on a greenfield Plex project here in NZ.  I will try and summarise my findings in a future post.

Thursday, June 24, 2010

Integration, Separation and CA Plex

Plex has always been a valuable tool for ISVs and systems integrators. With its patterns, multi-platform code generation and dynamic application partitioning, good integration and logical separation can be achieved very quickly out of the box.

Numerous technologies are supported to allow Plex to integrate/interoperate with other applications and platforms. See the main ones below.

  • COM Connectors and COM Import
  • ActiveX
  • EJB Connectors
  • .NET WCF Service generation
  • Handcrafted Source Code Support
  • Websydian TransacXML

Plex has always had the capability to separate the business, database and presentation logic (aka an MVC pattern) should you wish to implement your projects that way.
 
I have become aware of an exciting new development here in New Zealand: a project combining the Plex consultancy skills of ISA Ltd http://www.isa.co.nz/ with an industry-leading business rules engine from Idiom Software http://www.idiomsoftware.com/ to create a funds management application.

In summary, Plex is being used to create the database, the UI and the database Create, Read, Update and Delete (CRUD) logic, while Idiom’s code generator is being used to implement the business rules. The synergy between Plex and Idiom is fantastic. Both support Java and .NET (C#).
 
Idiom’s principal focus is on the business rules that automate decision making. Many of us are aware of the effort required by developers to amend business rules in any application, and we know that Plex and 2E, with their model-based development paradigms, make this very easy. But it is still a developer task that involves changes inside the core application - that is, it is still a techie task that the business perceives to be 'over the wall'.
 
Now the fusion of these two technologies allows a developer to build the core application and the business consultant to build the key business rules.
 
Idiom offers something new: an ability for the business (or at least a business analyst/consultant) to take direct ownership and custody of this decision-making logic. The IT developers lose a whole bunch of complexity and responsibility, which significantly improves their productivity, while the business willingly takes on this load. The bottom line is that the business logic is now being managed as content within the IT-managed core application. IT keeps ultimate control and delegates out selected components for business control. The business delivers fully tested decision models to be called dynamically by the application.
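In code terms, the division of labour might look something like this toy Java sketch (the interface and the fee rule here are entirely made up for illustration; the real ISA/Idiom interfaces and decision models are far richer): the core application owns the data and the CRUD, and treats the decision logic as a black box that it calls dynamically.

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of the CRUD/rules separation described above.
// The interface and rule are invented; real decision models are
// authored by the business and deployed as generated components.
public class FundsApp {

    // The contract the IT-owned core application codes against.
    public interface DecisionService {
        String assessFee(double balance);
    }

    // Stand-in for a business-owned, generated decision model.
    public static class FeeRules implements DecisionService {
        public String assessFee(double balance) {
            return balance >= 100_000 ? "WAIVED" : "STANDARD";
        }
    }

    // The IT side: stores accounts, delegates decision making.
    private final Map<String, Double> accounts = new HashMap<>();
    private final DecisionService rules;

    public FundsApp(DecisionService rules) { this.rules = rules; }

    public void createAccount(String id, double balance) { accounts.put(id, balance); }

    public String feeBandFor(String id) { return rules.assessFee(accounts.get(id)); }

    public static void main(String[] args) {
        FundsApp app = new FundsApp(new FeeRules());
        app.createAccount("A1", 250_000);
        app.createAccount("A2", 5_000);
        System.out.println(app.feeBandFor("A1") + " " + app.feeBandFor("A2"));
    }
}
```

Swap in a regenerated FeeRules and the application code does not change; that is the whole point of the separation.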
 
It is a win-win.

At last, is this finally the fulfilment of the original promise of CASE technology: IT developers AND business users creating the software? I am also aware that the development of the application and the rules are being done in parallel, thus reducing the overall timeline to deliver the project.

However, enough from me. The guys at Idiom have decided to blog about their experiences using Plex as the application builder and Idiom for rules execution. It makes excellent reading and you can track their progress as they move through their project.

Part One

Part Two

Part Three
 
Until next time.
 
Thanks for reading.
Lee.

Thursday, June 17, 2010

Webcast Bonanza

Just a quick one today.

Many of you who use Plex will have heard of the guys at http://www.websydian.com/.

They have products like the Websydian Developer patterns and TransacXML, and in partnership with ADC Austin they pioneered the Websydian WebClient.  They also have other tools for 2E too.

Now they have done it again with an update to the Websydian patterns.

Join Anne-Marie and the team for this much anticipated technology preview/demonstration of the integration of the Websydian patterns and the ExtJS library.  See http://www.extjs.com/, or should I say www.sencha.com/, as they are now known (14th June 2010); see http://www.sencha.com/blog/2010/06/14/ext-js-jqtouch-raphael-sencha/ for more information on that merger.

To join this webcast follow the instructions found at this link.

http://www.websydian.com/wsyweb20/site/websydian

See you all there.

Thanks for reading.
Lee.

Friday, June 4, 2010

Anyone got a ROADMAP I can use?

CA Plex & CA 2E roadmaps published.

Great news!!! The latest roadmaps for CA 2E and CA Plex (dated May 2010) have been published on the CA website. The links are below:-

Note:- You may need to log in to CA Support to access these, and if you haven’t got a support account it is relatively easy to register.



The key features, as I see them, are:-

CA Plex

  • Further research and development for the .NET WPF and XBAP client technologies recently demonstrated at the May Mainframe Madness event and at Ft Lauderdale in late 2009.
  • Unicode support for IBM i DB2 database
  • Improved JavaBeans support for the Java client generator
  • Final version of the Code Library packaging wizard
  • Continued enhancements to the base product and focus on improvements for each of the main generators.
CA 2E

  • Improved Web Services support. I see this as meaning better WSDL naming and result sets.
  • More functions over *Arrays
  • Logical deletion of functions in the model
  • Web Option and base tool improvements.

The main message across both of these roadmaps was ‘Enhancement Requests’ and CA’s commitment to be led by its customers' requirements. We play our part by getting them created in the first instance. So if there is something you want to see in the product, then get creating.

I would have preferred to see firmer commitments and dates, but I guess this is a case of working through the detail and then publishing to the community.  So I keenly await hearing more soon.

Thanks for reading.
Lee.

Tuesday, May 4, 2010

My top five enhancements for CA Plex.

We all know by now that CA Plex 6.5 is in the R&D phase and some technical previews have occurred with more planned. If you are short of knowledge in this area then take a look at Rob’s presentation at the CA Mainframe Madness Week. (www.ca.com/mmm).

I understand that Rob’s presentation has all anyone would need to convince management about deploying applications using .NET technologies and code generated by Plex. I have been snooping around (hassling Bill  :-)  ) and I am told that there are no fewer than four demonstrations for us to enjoy.

If it really floats your boat you can also look at the older presentations from Florida 2009 on the Plex wiki (see links on the side of my blog) which touch on Plex and 2E futures.

I thought I’d put out there what my top five enhancements for the products would be, and I'll tackle Plex first. I will tackle 2E another time.

Disclaimer: These are my opinions and I have been proven wrong on more than one occasion in my life. Just ask my wife (first wife) if you require additional information.

In no particular order.

One: A new UI for the tool. I know that the tool is first class and that it enables hundreds of companies around the world to build first-class enterprise systems, but I feel that a little spruce-up of the IDE's look ‘n’ feel would go a long way to deter the doubters. A ribbon bar here, a glass effect there and a 64x64 icon in that corner (you get my drift) would go a long way. IMHO.

Two: Native XML input/output. Pretty self-explanatory: I want to be able to choose how my service functions are called/exposed and how the data is passed.

Three: More native action diagram commands to replace the technology APIs, i.e. Upper, Trim etc. I would prefer to have these as native action diagram commands rather than calls, and I would like Plex, rather than me, to handle the new code when a new generator is released.  This would also help with my model management, especially when generating for multiple platforms, which is one of the tool's key differentiators.

Four: Visual Studio 2010. I want Plex to be aligned with the most recent release of Microsoft’s leading IDE within 6-12 months. When I say that C++ generation needs the 2005 edition for compilation, I get shakes of the head from the MS 3GL guys at my shop.

Five: Patterns. The ultimate flexibility, but somehow we seem to struggle to exchange these. It would be great if CA led an initiative for pattern exchange and contributed by providing additional business solutions via a website on a regular basis.

Six: ‘Hey. You said five on your list.’ I know I did!!!! However, I do like the idea of enhancing the diagramming and help editors, with perhaps automatic creation of diagrams from the model, easier editing and additional deployment options for generated help.

That’s my list for Plex. What do you want? Perhaps I can compose a blog post from all your replies.

Thanks for reading.
Lee.

Thursday, April 29, 2010

CA show strong commitment to Plex and 2E!

I blogged recently about May Mainframe Madness, and in that post I referred to the realignment occurring within CA.

I am happy to report that CA's plans for 2E and Plex are not affected and that new roadmaps (as I predicted) are imminent and will be published on CA support online in the coming weeks.

Extract from a recent communication of which I was a recipient.

"Following the email regarding the recent realignment at CA, please be advised that, for legal reasons, we are not in a position to provide further detailed information until after the end of May. We reaffirm that CA is committed to continuing to develop and support CA Plex and CA 2E; this realignment had no effect on our plans to continue these products forward.

We are in the process of publishing updated roadmap documents for CA Plex and CA 2E that will be available on CA Support Online in the coming weeks."

The key words above are reaffirm, committed, develop, support and no effect.  Being an online 'Scrabbler', some of these words are worth more than others but, taken all together, they send a strong message.

All the legal stuff will be because there is an alignment process that needs to be adhered to.  Any product news, positive or otherwise, will be linked to the alignment process, which is ongoing.  Once this is resolved I suggest it will be business as usual, or BAU for us Three Letter Acronym freaks.

I'm happy. After all I am pushing for CA Plex to partner our CA 2E development at my employer.

Thanks for reading.
Lee.

Wednesday, April 28, 2010

CA Mainframe Madness

Hi,

Most people are aware that CA have decided to trim their staff and products and I must admit I had my heart in my mouth for a second as the Plex and 2E tools are very dear to my heart.

I am pleased to hear that my fears are unfounded.  Although, like most I am eager to hear from my friends at CA regarding product roadmaps for 2010.

However, we might get a little insight into what is next for these great tools at CA's Mainframe Madness event, which is coming to an internet connection near you soon.

If you take a look at the CA Mainframe Madness website www.ca.com/mainframe/may, you can see that the presentations will be available in a virtual environment for the whole of May.  There certainly look to be a few interesting presentations, and I am so so so so eager to see how Rob Layzell and the team are getting on with the features for Plex 6.5.

There are a total of 23 sessions for us Plex and 2E users (fans), and some are even in Spanish.  You can click here and view them - Mainframe Madness Sessions - scroll down to the Modern Development Tools section and read away.

I am looking forward to seeing these. 
So enjoy and,

 "REMEMBER TO REGISTER". 

Thanks for reading.
Lee.

p.s. There are some PC prerequisites that you will need to adhere to, so I suggest you sort them out ahead of time.

Tuesday, November 10, 2009

Enhancement Requests

Well it is almost that time of year again.  You know.  Christmas!!!!   We all sit back and relax, eat turkey (far too much) and discover that port is okay to drink by the bottle after all.

But putting that aside, it is also nearly the new year, and that will mean the annual CA enhancements voting survey.  Many of you know this annual event: the results are collated and influence the R&D effort.

Certainly a .NET client for Plex came top last year and this is what we saw at the conference albeit in technology preview mode.

Whilst not everything on the list can get done, CA tend to try and get the balance right between the list items that rank highly and the market direction of the product as they see it, which is fair, I guess.

My call to you all today is not that the voting is starting soon and that you should all be reaching for your new 2010 Pirelli calendar; that's for Bill and the team to decide.  What I am saying is that the list is made up of requests that we make to the CA support desk, i.e. no requests equates to a smaller list of items, or just last year's ones (some of which may still be very valid).

My challenge to the community is to think through some of your preferred enhancements and ensure that you register these via the support desk online so that they make the list for you to vote on.  I am also aware that CA is planning to extend its arm further into the user community with a product advisory type approach involving some of the clients.  I am certainly looking forward to contributing to this.

However, in the meantime, start raising those tickets, and remember that when you are voting you are voting for your top ten in the order you want them.  Some people have been confused by this in the past.....

Thanks for reading.
Lee.

Wednesday, February 11, 2009

It's Product Enhancement Time for CA Plex and CA 2E (Synon)

Not many software companies give you the chance to directly influence the strategic direction of a product you use.

The product team at CA are once again asking us to cast our votes for potential enhancements to the CA Plex and CA 2E (Synon) tools.

This recent communication was sent by Bill Hunt to all PLC (Product Line Community) members. If you are not on the list then you are not in the know.

Join.

See details below.

"Hello CA Plex and CA 2E Community,

User feedback, suggestions and ideas are an important element in our development planning efforts. With this in mind, we would like to invite and encourage you to participate in our annual Enhancement Request Priority Voting program. We are launching this program effective now.

As was also the case last year, we ask that you review the attached list of enhancement requests which our team has reviewed and considered worthy of additional research. There is one list for CA Plex, one for CA 2E - choose whichever list(s) are applicable to you.

From these lists, we ask that you submit your “top ten” priority enhancements from this list:
- Voting is done online:
o For CA Plex: https://www.casurveys.com/wsb.dll/156/PLCPLEX-ERJan2009.htm
o For CA 2E: https://www.casurveys.com/wsb.dll/156/PLC2E-ERJan2009.htm
- You must be a registered member of the CA Plex/2E Product Line Community in order to submit a vote. Use your registered e-mail address and password to log into the voting system.
- The online system is the only manner in which votes will be counted.
- CA Employees are not eligible to submit votes.
- Please do not send your votes via e-mail to me, or Daniel Leigh, or any other CA team member – votes submitted this way cannot and will not be counted into the results.
- Each PLC member is allowed one vote each (if you use Plex and 2E you are able to participate in both surveys)
- If you have colleagues using CA Plex or CA 2E, please encourage them to register for the PLC:
o Go to http://causergroups.ca.com
o Click “Join Today”
o Choose “CA Plex/2E Worldwide PLC Global User Community” in the drop down box indicating which user group to join
o Fill in the required contact information
o It only takes a minute and it costs nothing!
- In addition to the actual enhancement request survey, there are some poll questions included online regarding your overall use of the products, your impression of the PLC program and what events you would like to see or would be likely to attend in the future. We ask that you answer these questions as well; this would be helpful to us and very much appreciated.
- The “voting polls” will be opened Monday 9-February and close on Tuesday 31-March. This should give teams ample time to review the lists and make decisions on which items would be more of a priority to help your CA Plex or CA 2E development efforts in the future.

Our participation as a group in this program last year was excellent, and we hope to get at least as much participation as last time. If there are questions please don’t hesitate to contact me, thank you in advance for helping us understand your needs, and for your continued support of CA Plex and CA 2E. "

JUST DO IT!!!!!

Thanks for reading.
Lee.

Tuesday, December 30, 2008

Merry Christmas CA

Well, it is that time of year again, when we all take a well-deserved break (this is subjective, I know), send those work clothes to the dry cleaners for a much-needed overhaul, eat too much, drink even more, and come back to work refreshed in the new year with at least one broken New Year's resolution by the 5th of January.

Christmas is often a busy time of year for many of us, and at those family reunions we sometimes spend the time talking about those who are no longer with us. I presently have some of my relatives visiting, and have also had a long-time family friend popping in to say "Hi", even though we live 12,000 miles apart.

Anyhow, Christmas is a time of giving (well it is in my world) and CA thankfully think no differently.

If you haven't already seen the news feeds then you may not know that our friends at CA have gone GA with CA Plex 6.1. There are quite a few neat features in this release that are worthy of a mention here. Highlights include:-

Model-based SOA support
- WCF Service Generation
More Extensible Development Environment
- Plex API Enhancements and Add-Ins
- Code Library Service Generator Plug-Ins
Windows Vista Support for development
Group Model Update History
Runtime Backwards Compatibility - Easy Upgrade from 6.0
- No re-gen or re-build from r6 to r6.1
Platform Compatibility Updates
- IPv6
- Java SE 6.0, ANT 1.7.0
- SQL Server 2008, Oracle 11g
- Windows Server 2008
- 64-bit Windows

For full details follow some of the following links (Note I am not responsible for external site content):-

http://www.ca.com/us/press/release.aspx?cid=194906

or the product brief

http://www.ca.com/files/ProductBriefs/ca_plex_product_brief.pdf

I am particularly keen on the additions and improvements to the Model API and what this will allow third party vendors to create to support the CA Plex ecosystem.

But then there is more.

CA have also announced that the user voting polls used successfully in 2007 will be re-run in early 2009, to help provide CA product management with user-driven requirements for the enhancement of the toolsets (both 2E and Plex). But remember, you are voting on enhancement requests already made. In order to get your requests on the list you need to contact CA support and raise a ticket.

But then there is more.



2E 8.5 is already in Alpha and goes Beta in the new year. Highlights for this release are:-

ILE Service Program support
- New object type to combine modules into service programs
- Supports CA 2E-generated and external modules
Web Services support
- Deploy ILE Service Programs as web services
- Based on IBM i Integrated Web Services server for ILE
Improved Impact Analysis
- Take account of commented-out code
- Voted No. 1 in 2008 Enhancement Request Survey
Better search/positioning facilities in the model
IPv6 compatibility
CA 2E Web Option Environments
- Separates user data from the product libraries

I believe GA is planned for July 2009.

But then there is more.

There are webcasts to be heard in the new year. There are events to attend. There are conferences to consider. There are powerpoints for CA world to be downloaded.

How do I know all this? Well, I am a member of the PLC (Product Line Community) and as such I get regular email updates from the product management team at CA.

To join this ever growing list, simply click the link below.



Merry Christmas and a Prosperous (Credit Crunch Recovery) New Year.


Thanks for reading.
Lee.

Saturday, May 31, 2008

The Great 3GL v 4GL debate - Part III

This is part III of a trilogy of articles regarding the usage and evolution of software development languages. Part I can be found here and part II here.

All of these technologies have issues to address. Twenty years ago we were all happy with green screens for business applications on centralised platforms; then came client/server with Windows, and the distributed computing model became mainstream. Then along came the Internet and the return to HTML thin clients, and now the evolution once more leans towards rich/smart clients.

The irony for me is that I have witnessed many people move on from the 4GL world of the nineties to emerging 3GL (albeit object-based) technologies, i.e. J2EE (Java) and .NET-compatible languages etc.

With the extra layers of complication (some call it abstraction) added due to business usage of the internet I am seeing more and more tools coming onto the market that claim ‘code generation’ capabilities. You only have to look at the OMG’s ever growing list to see that once again people are looking for the holy grail of application creation as projects overrun and costs escalate.

I do see a trend towards total code generation once more. IBM has launched a 4GL called EGL. It looks quite promising and might be worth a look, but to me it is not yet as mature as others.

The difference between tools like Plex/2e and this new breed of tools is that the ‘so called’ newer tools generally only cater for the singular environment and often really only create the initial code that requires manual intervention and coding in the generated language. In my mind, these tools have yet to evolve as far down the road as Plex/2e.

Plex and 2e both have their unique selling points.

2E is pretty easy to use and probably has a 3-6 month learning curve for a developer to become very proficient; quicker with excellent training and in-house support. Software development room 101, item 3: always spend decent money getting a guru to help you set up your environment and train the developers. Too often mistakes are made in the early stages of application development. This is especially true when using new tools.

Plex will take longer (12 to 18 months) as it supports inheritance, shipped and customer business patterns, meta coding and many more target development platforms. It really is the Daddy of ARAD (Architected Rapid Application Development), hence the learning curve, but the payback after this is judged in weeks, months or even years off a development project's timeline. And with the great pricing of the tool and generators nowadays, it really is an option to help protect you against the constant upskilling costs associated with other technologies.

When you also consider that localisation and application version partitioning are built into the tool, from a single skill set perspective your developers will always remain current. That said, you would always create the optimum patterns and platform-level code if some of your developers have the lower-level skills.

I have been programming computer systems in Plex and 2e for 16 years and these systems have used the best aspects of these tools and have always been database focused applications.

These have been in Finance and Banking, Debt Management, Mortgage Application and Processing, MIS, Project Management, Time Recording and Environment Management. They were deployed on System i (now IBM Power Systems with 'i' as the operating system) using RPG and RPG ILE, Java, or C++ server code, all with either C++ or Java (Swing) clients.

With plans for these tools heading towards .NET C# clients, C# server code already available in 6.0, and the recent announcement of the WebClient partnership between ADC Austin and Websydian, the future looks really bright.

Time will tell what will happen and often these battles are not won or lost by the technologies, often they are decided by the marketing budgets.

However, I know what playground I want to play in. And if you need a guru to help you, you should contact me.

Thanks for reading.
Lee.

Monday, May 12, 2008

The Great 3GL v 4GL debate - Part II

This is part II of a trilogy of articles regarding the usage and evolution of software development languages. Part I can be found here.

So what are the benefits or otherwise of using a 3GL over a 4GL, and vice versa? For me it certainly depends on all the usual factors that drive any technology decision: cost of product, support, flexibility, the human factor, tool lifecycle, vendor direction and target platforms being a few that come to mind instantaneously.

The Pros of a 3GL

Embedded or mission-critical applications like air traffic control systems are generally handcrafted and more suited to a 3GL environment, as are operating systems, 4GL tools themselves (debatable), communications, hardware drivers and generally non-database applications. As the developers have access to all the APIs and are a step closer to the CPU, they generally have wider usage opportunities.

Accessibility to a wider developer pool. Whilst there are probably thousands of developers for your chosen 4GL, possibly even tens of thousands, these tools simply do not have the numbers associated with mainstream development languages and IDEs. There are an estimated 4 to 5 million developers following the evolution of Java, and no doubt Microsoft can boast even more for its most popular products. That said, of course, this also means that it is harder to find a guru within that skills ocean, not to mention filtering out those who have spent 15 minutes in the IDE and now claim some form of exposure on their curriculum vitae.

3GLs are quicker to react to emerging markets and development trends. Generally the suppliers of these 3GL tools are inventing the future. They don't often agree with each other, but they certainly have the advantage over the 4GL creator, who has to wait and see which technology actually matures beyond the marketing hype and into mainstream best practice before committing to provide code generation for that area.

Flexibility. Languages at the 3GL level, depending on the targeted platform, have virtually no restrictions on the type of application that can be written or how it is written. This means that for applications where speed of performance is the critical measure of success, it is most likely that a 4GL will fall short of handwritten, targeted code.

The Pros of a 4GL

Business rules focused development. Once you have learnt the code generator's quirks you are in a situation where you mainly tackle your development from the business domain and allow the code generator to handle the technical implementation. With this comes a significant reduction in the amount of time required to build an application. Many will say that there are standards and frameworks that help with 3GL development. This is actually quite true, but also be aware that the code generator vendor will be skilled in the major best practices and will write more consistent code. Some may argue that the code is not as neat as code written by a good developer, and in that regard I quite agree. I will say that the underlying code will always be written in the same way and style; therefore, after a while all the developers will become conversant in how the code is generated, that is, if they want or need to understand. (See below.)

Complexity avoidance. A 4GL will protect the majority of the developers using the tool from the underlying complexities of the generated language. When you couple this with the ability to influence how the code is generated using patterns, and the ability to take the design model from the 4GL and transform it into other language code, your business logic can truly be ported from platform to platform as trends become reality and your technical needs change.

Impact Analysis. For me this is one of the key features of using a 4GL tool. Generally these tools use a database to store design and program artefacts that are then transformed into language code. Every reference for every field, File/Table, Access Path/Index/View and Function/Object/Program is stored in the repository, and a developer can track each and every item through to where and how it is used. This is a powerful feature that cannot be overlooked versus manual reviewing of language source files.
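The principle can be sketched in a few lines. To be clear, this is not how 2E or Plex actually store their models; it is just a toy cross-reference repository (all field and function names below are hypothetical) showing why recording every field-to-function reference beats manually scanning source files:

```python
# A toy design repository: which functions reference which fields.
# Purely illustrative -- a real 4GL repository is far richer.
from collections import defaultdict

repository = defaultdict(set)  # field name -> functions that use it

def register(function, fields):
    """Record that a function references the given fields."""
    for field in fields:
        repository[field].add(function)

# Hypothetical model entries.
register("EditCustomer", {"CustomerCode", "CustomerName"})
register("PrintInvoice", {"CustomerCode", "InvoiceTotal"})
register("MonthEndReport", {"InvoiceTotal"})

def impact_of(field):
    """Every function affected if this field's domain changes."""
    return sorted(repository[field])

print(impact_of("CustomerCode"))  # ['EditCustomer', 'PrintInvoice']
```

Change the domain of CustomerCode and the repository immediately names every function that needs regenerating; with raw source files you would be searching text and hoping you found every reference.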

Trusting the generator. When I train people to use CA 2E or CA Plex, the defining moment for gauging a developer's progress and understanding is the day that they learn to trust the generator. As with any tool, a badly constructed function in 2E, for example, can create badly generated and non-compilable code. Once the developer realises that it is generally their fault if a generation of code fails, they're ready to move forward. I have seen far too many 3GL programmers migrate to the 4GL paradigm only to get bogged down in the details of the code produced, yet they will trust the compiler without hesitation. The ability to change a shared function or the domain of a field, apply detailed automated impact analysis to identify all affected programs, and then press a button to regenerate and compile every affected program and database file is a very powerful feature.

The Cons of a 3GL

Slower, more expensive development. The very nature and size of modern 3GL languages and their flexibility is also their Achilles heel, as there are so many ways to resolve a programming issue, with literally thousands of opinions and many directions. In a nutshell, for certain types of applications, particularly those that involve extensive usage of a database, the ROI for using a 3GL versus a 4GL is very poor indeed. To balance some of the cost debate, 4GL tools are generally more expensive to purchase, but the most expensive item in any development team is the human, even if the work has been outsourced to an emerging development powerhouse.

You will spend more time debugging the application. A very good ex-colleague of mine once said, "If the art of debugging is the removal of bugs from programs, then programming must be the art of putting them there in the first place." Because we are relying on the developer to code all aspects of the application, issues are likely along the way. It is generally the developer's prerogative to deal with memory leaks and usage in languages like Java or C++, but with a 4GL it would be the code generator's responsibility.

Complexity. Once again, due to the size of the languages and their broad reach, it is unlikely that you will find developers who know all the aspects required to complete an application. Your staffing needs are generally much higher, and the learning curve for the 3GL can be very significant indeed. This means that the developers must understand many technical as well as business problems.

The Cons of a 4GL

Vendor lock-in. Depending on the vendor this can be quite a significant issue. If the vendor is too slow to react to emerging technologies you will find yourself with a heterogeneous development environment, and you will lose many of the advantages referred to above with regard to complexity protection and highly detailed impact analysis. Worse still, your vendor may well decide to stop production of the 4GL or choose other directions as the options for technology deployment balloon. These tools are often criticised as proprietary.

Flexibility. There will be limitations on the scope of applications that can be created by a single 4GL. There are of course others that target different platforms and purposes. Their flexibility is often measured by the lowest common denominator of the targets they have to support and generate code for. For example, a generator that generates code for three different platforms may have to limit what can be done in one language due to limitations in another: if the target languages have differing maximum field lengths, then generic code constructed in the 4GL for platforms x and y can only size fields to the limits of the most restrictive platform z.
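As a hypothetical illustration of that lowest common denominator effect (the platform names and limits below are invented for the example), the usable size of a portable field is simply the minimum of every target's maximum:

```python
# Hypothetical maximum text-field lengths for three generation targets.
max_field_length = {
    "platform_x": 32767,
    "platform_y": 65535,
    "platform_z": 5000,   # the most restrictive target
}

# Code generated for all three platforms can only size a field
# to the smallest limit, regardless of what x and y could handle.
portable_limit = min(max_field_length.values())
print(portable_limit)  # 5000
```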

Source Code. Many 3GL developers will argue that the code is not user friendly, bloated and often too generic in comparison to hand-written code. This can be true of some code generators and is certainly something that needs to be considered when choosing an approach for your development.

All of the above are by no stretch of the imagination a definitive list. Given time, I believe that I could have produced a list of 20+ pros and cons for each approach.

Part III will discuss trends, fads and conclude the 3GL and 4GL debate with my own personal viewpoint.

Thanks for reading.
Lee.

Wednesday, April 30, 2008

The Great 3GL v 4GL debate - Part I

Ever since development languages were invented we have sought ways of making the development of software easier. We have attempted to do this by abstracting the level at which the developer works, creating languages and tools that read more like natural English. However, on the other hand, we have also added extra levels of complexity with changing hardware, communications protocols, multi-tier server deployment, runtimes, middleware, messaging technology and language politics, and I haven't even bothered to discuss the internet.

Regarding language politics, read anywhere on the internet about the great .NET versus J2EE debate, or perhaps commercial languages versus open source, and you will quickly realise that there are still significant inroads to be made by IT vendors around the world. You will see an IT community that is split pretty much down the middle, although if you want my humble opinion as it currently stands, I believe that we will once again see a shift towards packaged and guaranteed software over open source, and that Microsoft will eventually win the development language tools war.

This three-part article aims to discuss the evolution (not revolution) of software development languages, with particular focus on third and fourth generation languages, a debate on the pros and cons of these approaches, and then conclude with a few comments regarding some of the repeating fads as I see them today.

It wasn’t that long ago that the typical software developer would have been aged between 35 and 60, male, probably balding (So that’s me covered), university educated and employed within those same hallowed institutional walls since passing his exams, quite ironically with his non IT related degree. He would have been wearing white coats in the office, have bottle bottomed glasses, a pocket full of pens and answered to the name of geek or dork.

Well this is how Hollywood and the urban stereotype would have it.

A bit harsh if you ask me, but to be fair, they would have been fascinated by punch cards, seen value in paper tape with holes in it and probably missed any fads of the times with regard to musical revolution. There certainly would have been very few ordinary people, and the number of women specialising in this field was countable on one hand.

Now, time has moved on, as has technology, and you can't tell an IT guy apart from your ordinary office worker. It actually amazes me that although we are making the art of software development easier, the extra layers of complexity should in theory have amounted to an increase in the number of geeky-looking guys, so much so that if lined up ten abreast, a communist regime would have been proud to show off its IT military might with these millions marching in city squares across the world. But this hasn't happened; IT in general is now a mainstream activity and the working environments are certainly more aligned to that of a typical office. With this mass adoption of IT skills in the workplace, I also believe that IT guys are now considered a corporate commodity, whereas 15 years ago the pay would have been relatively higher. How times are changing.

So we have worked hard to improve the scope and productivity of the average software developer. We have migrated from the punch card era to keyboards, mice, laser pens and voice recognition input devices. We have languages that have evolved to be more readable and better understood by a human. The days of everyone programming in assembler or other low-level machine/processor code began to change with the introduction of the 3GL languages of the day; COBOL, Fortran, RPG and Basic would be good examples here. I am sure that at the time some people embraced the new paradigm as much as developers have embraced Java or are now embracing Flex/ActionScript, Ruby on Rails or C# as the perfect way forward. There would also have been the doubters, and I guess the split would have been no different to many of the impasses that we see reported online and in periodicals.

Still, software engineering took time.

We are improving and continue to improve 3GL languages to this very day. We now have a whole hard drive full of productivity features embedded within our integrated development environments (IDE). Features like wizards, auto code completion, and syntax auto-correction were non-existent back then, let alone globally accepted standards and minimum requirements.

I would say that any developer working 20 years ago would never have thought that freeware/open source (delete as appropriate) products like OpenOffice or Eclipse would be a reality. They could have conceived of software being given away as a loss leader for professional services, but a massive corporation like IBM giving away a product that it spent, and to this day still spends, millions of dollars on would have been considered insane. But this is the state of play today.

So when many thought that we had gone as far as we could with the evolution of the 3GL language, we once again raised the bar with the next great technology advancement. This time we evolved to 4GL languages, otherwise known as code generators, CASE (Computer Aided Software Engineering) tools or ARAD (Architected Rapid Application Development). This was hailed as the end of the expensive IT developer; the marketing claimed that the typical end user could now get involved in the development of IT systems, returning the ownership and power of your systems to the business and, more importantly, taking it out of the hands of that lowly IT department.

The same IT department that through those times was still considered a cost overhead rather than a business opportunity enabler. Many of you may remember the days when the IT function reported to the financial controller. I believe that most IT people are artists who can't draw; we use the creative parts of our brains to build beautiful code and systems. To think that you'd stifle (some may still continue to do so) this creativity with the frigidity of an accountant mentality still frightens me. Imagine the marketing or sales director reporting to that same accountant? Actually I can, ouch!!!!!!!

With the marketing hype, 3GL project overruns and increasingly tight deliverables, the 4GL era was born, and in my view this has created some of the more interesting debates in IT circles. The simple reason being that for each platform/system available there are numerous languages, either compatible (Java and the JVM) or targeted (compiled), that are considered the language of choice, each with its own hardcore developer following. There will also, more than likely, be a 4GL that targets that platform, and I bet my left one that a maximum of 10% of the users of the platform use a 4GL over the 3GL.

Are these 10% the visionaries?

Well, I guess that depends on the tools of choice, but no one denounces the 10% of personal computer users who use the Apple Mac and all its gizmos.

You also have to consider that many of these 4GL languages evolved during a time of single-platform computing, i.e. there would be a 4GL that would target the complete application development cycle. The tools were capable of constructing everything from the database, screens and reports through to catering for the application's menus. I have had experience developing in both 3GL and 4GL languages and I believe that I am well placed to comment accurately about both approaches. So as IT has evolved, so have many of these 4GL tools.

The question is do you choose a 3GL or a 4GL?

This is still as fiercely debated online and at technology conferences as the merits of client/server technology versus thin client, or Betamax v VHS (lol). With the emergence of more and more technologies and Web 2.0, we are again beginning to witness the thin/rich client gloves come off. Which for me is quite ironic, as the web thin client was the reason for killing off the high deployment cost of client/server systems, which themselves were created to offset performance issues of software systems and distribute the processing load.

That said, cost is now measured in bandwidth and reach rather than hardware and employees required to support the system.

I personally believe that these architecture choices should be down to the type of application you’re creating and its accessibility and user requirements. Also, this is the same thinking behind why you would choose a given development tool and at which level of abstraction you wish to develop the application. Another interesting topic involved with the 3GL v 4GL debate is that many of these tools are capable of producing code for multiple platforms i.e. IBM Power System (RPG), Windows (C of one variant or another) as well as Java which is capable of being deployed on multiple platforms.

Java claims a write once, deploy many times approach. I would say that it should be rephrased as write it once and then tune it for each platform, JVM or application server of your choice. Now, I make no bones that I am an advocate of the 4GL (especially CA Plex or CA 2E) over the 3GL for the applications that I have written over the years. Most 4GLs cater for RDBMS systems and are best suited to these types of environments, i.e. banking systems etc. Other 4GLs and tools exist for writing computer games, and once again these are designed to protect the developer from the underlying complexities of the code. With these engines you do not need to understand the ins and outs of the DirectX or DirectDraw APIs or the language that is generated. But your decision to use one of these tools must be twofold.

1. It must be appropriate for the type of application you are creating.
2. Once you have chosen the 4GL you must stick to it and use it properly.

There are many tools out there that claim they can generate code into multiple languages, and these tools in my opinion are great for ISVs that need an offering across multiple platforms to negate the hard sell of one technology over another. After all, shouldn't your marketing and sales teams be selling the values and merits of your software's functions and feature set rather than justifying your company's technology decisions?

Part II will discuss the many pros and cons of the 3GL and 4GL languages and tools.

Tuesday, March 18, 2008

The new millennium bug?

There are only 17,576 combinations available when allocating a TLA (Three Letter Acronym) for airport codes. Part of the challenge is that the code should also be meaningful and identifiable; for instance, everyone knows that London Heathrow is LHR and that Berlin in Germany is BER.

If you don't believe me take a look at this site http://www.world-airport-codes.com/.

After a while some of the codes appear confusing. Hwanga in Zimbabwe has the seemingly obvious code of WKI. I assume this is pronounced Wiki.

This may be of interest to some of the IT geeks reading this, assuming of course that the introduction of Google's Knol has obliterated, or will obliterate, the Wiki concept. I can never work out why open source stuff like this "Wiki" is so damn difficult to maintain. I guarantee that Google or Microsoft will make this easy for the Joe Bloggs general public to use. I can personally hear the death knell for Wiki already, largely IMHO its own fault for keeping it geeky and for the myriad of different syntax styles available.

Anyhow, back to airports. With over 9,000 airports registered in the database to date and our insatiable appetite to travel around the world, it is likely that more and more airports are going to be built, each requiring yet another unique, meaningful code.

Presently, these codes do not include numeric characters, so basic maths tells me that there are 26x26x26 = 17,576 combinations available. This is stated with the assumption that, unlike car licence plates, we use every letter available in the alphabet.

So what is going to happen come the day when we have used up all these codes? We could begin to use numeric characters; however, the numbers 0, 1, 2, 3, 5 and 7 are unavailable due to their similarities with O, I, Z, M (sideways), S and L. Also, unless we have taken a big step into the future, a code like KN9 really sounds like it should remain in a novel by Arthur C Clarke rather than belong to a domestic airport in deepest Taiwan.
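The arithmetic is easy to check. A quick Python sketch (purely illustrative; the "safe digit" restriction is the assumption made above, not an aviation rule) counts the letter-only codes, the capacity with the six confusable digits excluded, and the four-letter alternative:

```python
# Back-of-the-envelope capacity of airport codes.
# Digit restrictions follow the post's own assumption: 0, 1, 2, 3,
# 5 and 7 are too easily confused with O, I, Z, M, S and L.

LETTERS = 26

# Letter-only codes: 26 choices for each of the 3 positions.
letter_only = LETTERS ** 3
print(letter_only)    # 17576

# Mixed scheme: add the four "safe" digits (4, 6, 8 and 9).
safe_digits = 4
mixed = (LETTERS + safe_digits) ** 3
print(mixed)          # 27000

# Extending the code to four letters dwarfs both options, at the
# cost of updating every system that assumes three characters.
four_letters = LETTERS ** 4
print(four_letters)   # 456976
```

So allowing the safe digits buys roughly 9,400 extra codes, while a fourth character multiplies the space 26-fold.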

That said, there is more than one way to skin this cat.

We could be tempted to extend the size of the code from, say, 3 characters to 4, or perhaps more. However, this would require a huge amount of effort to synchronise all the airline ticketing systems around the world, not to mention:-
  • Online and published guides.
  • Signage (i.e. Welcome to LAX).
  • All those travel agents whom for years had remembered these codes.
  • All those flight anoraks who have travelled to every airport known to humankind.
  • The humble fan website and all those pub quiz questions that have been written and are now negated.
All this hassle because someone decided to save a byte or two when naming the airports in order to save, at the time, valuable disk space. The irony being that this is the same disk space that the likes of Google and Yahoo now give you gigabytes of just for signing up for an online email account.

It doesn't stop there though: what about the issued tickets already in the public domain? The transition period for the changeover would be huge (up to a year). So now we also have to include in the debate all those check-in staff and baggage handlers who would have to remember two codes for every airport.

I would suggest that the majority of those 9,000 airports have been created in the last 50 years. I find it quite daunting that we might experience the aviation equivalent of the millennium bug. This may not be that far off, and once the developing nations reach full steam ahead with their exponential economic growth, you may well find yourself employed in the future to sort out the code written by those legacy developers.

Those same developers who didn't have the foresight to cater for tomorrow’s usage.

When we think about it, this has happened before. It was 20 years or so ago that it was concluded that 640KB of RAM was more than enough for any computing requirements in the home PC.

And those guys from the 70s who designed these airline systems have a lot to answer for. Not only did they earn good money back then with job security (outsourcing wasn't invented, or trendy, then); they now get rewarded for coming back in and fixing up their issues many years later.

So get travelling now. There might be some downtime in this industry, and remember, someone has to pay for all this development. I pray to God (actually I don't, as I am an atheist) that you are using a 4GL like 2E or Plex to maintain this code. If you are using a 3GL you might have quite a lot of impact analysis to perform first.

Remember, you need to be extra cautious with your design and field domain management and regardless of what people tell you they want, look into the future and get it right first time.

Watch this space. You heard it here first.

Thanks for reading.
Lee.

Monday, March 17, 2008

What do you do for a living?

This has to be one of the most common questions asked of anyone in life, apart from "How are you?", "Can I buy you a drink?" or, cringingly, "Do you come here often?". Well, this isn't an article about chat-up lines or dating gotchas. I am long past all of that.

However, many people can simply reply “I am a plumber” or “Nah, I’m a sparky geezer!” (Electrician), or perhaps they might say "I have my own business selling cars" or "I work for a bank doing banking stuff". The point here is that no matter what they do, their audience will immediately be able to understand what they do and if they need their help or services, they can simply ask.

For the average IT geek, this is always a tricky and preferably avoidable question. We tend to shy away from disclosing our job because we are concerned about the impact of this little snippet of knowledge in the heads of a non IT savvy person.

There is a common phrase in IT that goes something like, 'A little bit of knowledge is a dangerous thing'. Actually, I guess this is true, in general. DIY being a good example.

As IT professionals we tend to try and answer this question ambiguously.

Mainly because we think that what we do is very specialist and complicated; we also make allowances for the questioner because we believe that they will switch off. We have a primeval fear that we will not be able to finish communicating the fluffy, pinky, greeny, codey stuff about why we love our job.

On this note, I do appreciate that in all professions there are general conversations and then there are the technical jargon and insider acronym riddled low level conversations.

As IT professionals we have invented more TLA's (Three Letter Acronyms) than any other profession, possibly with the exception of airport abbreviation naming committees.

Anyhow, a typical answer would be “Urrrrm, Computers”.

“Arghh, Right!!!” comes the reply, quickly followed by “Can you take a look at my computer?”.

And this is it, the single biggest fear of an IT professional. Your job might be that of a patterns and framework designer for J2EE or you may be a Mainframe performance specialist, but rest assured the simple mention that you work with “Computers” means that you are now their personal technical support helpdesk, for life........

Now, by contrast, our plumber and electrician are both in the home building or renovation trades, but, you never hear me asking them if they can do some plasterboard stopping, tile my roof or fit double glazing.

I guess that over time the general levels of understanding of the different roles within IT will improve. However, until this day has arrived I have learnt the hard way to always reply in a precise and exact manner.

"I specialise in software application modernisation, building and shaping high productivity development teams to meet the demands of developing enterprise business applications. I also provide bespoke consulting and training services and expertise in utilising multi-platform 4GL code generation tools.”

Now, for all but the most technical people out there, I tend to get that ‘lights out’ glare about halfway through that sentence, but, on the plus side, I also no longer get those requests for on the spot computer repairs.

Thanks for reading.
Lee.