Showing posts with label Support. Show all posts

Saturday, December 1, 2018

What is a reasonable time to resolve an issue?


Today, I experienced a bug whilst building a wide screen definition in 2E.  The build was okay but when I went to use it (in the 2E device design editor), I was getting some low-level errors.

Upon googling the error, I came across this ticket which gave a few workarounds, one of which I implemented.

The given workarounds were:-


  1. Set the screen footer to 25 as per the screen print. 
  2. Set the subfile page size for consuming functions explicitly.  I am assuming they mean…
  3. Override the YSFLEND model value to *PLUS and not *TEXT or override it in the consuming function via F7=Function Options



All of these appear to be perfectly good reasons to postpone fixing this issue within the product, as I am guessing...


  • many use the + for subfile,  
  • many also forget to move the command line down to row 26 anyhow when creating and generating wide screens (they leave it at 23) and then wonder why they have a 3 row gap at the bottom of their screens 😊

Is it okay though, that 15 years after it was raised, it is still an issue...?



Thanks for reading.
Lee.

Tuesday, November 10, 2009

Enhancement Requests

Well it is almost that time of year again.  You know.  Christmas!!!!   We all sit back and relax, eat turkey (far too much) and discover that port is okay to drink by the bottle after all.

But putting that aside, it is also nearly the new year, and that will mean the annual CA enhancements voting survey. Many of you will know this annual event; the results are collated and influence the R&D effort.

Certainly a .NET client for Plex came top last year, and this is what we saw at the conference, albeit in technology preview mode.

Whilst not everything on the list can get done, CA tend to try and get the balance right between the list items that rank highly and the market direction of the product as they see it too. Which is fair, I guess.

My call to you all today is not that the enhancements are starting soon and you should all be reaching for your new 2010 Pirelli calendar. That's for Bill and the team to decide. However, I am saying that the list is made up of requests that we make to the CA support desk, i.e. no requests equates to a smaller list of items, or last year's ones (some of which may still be very valid).

My challenge to the community is to think through some of your preferred enhancements and ensure that you register these via the support desk online so that they make the list for you to vote on.  I am also aware that CA is planning to extend its arm further into the user community with a product advisory type approach involving some of the clients.  I am certainly looking forward to contributing to this.

However, in the meantime, start raising those tickets and remember when you are voting you are voting for your top ten in the order you want them.  Some people have been confused by this in the past..... 

Thanks for reading.
Lee.

Tuesday, June 16, 2009

Calling all French 2e Users

Just a quick blog today.

CA, actually Daniel Leigh in particular, is looking for Beta testers for the new 8.5 version of 2e, this time for the French language version.

So, all my colleagues in 'Les Bleus' country, you can contact Daniel. Just let me know you are interested via the comments feature and I will pass on your details, or you can contact him directly at:-

daniel.leigh at ca dot com.

Good luck. There are certainly plenty of great new features in this release to get your creative juices flowing.

Thanks for reading.
Lee.

Sunday, November 30, 2008

2e and your support network

Updated - 12/06/2009
I recently received a communication via the blog feedback feature. It was from one of our colleagues in France who was a little concerned about the support he receives from CA and also the usage of the tools in general.

There are quite a few interweaving discussions needed to answer his question below.

"In France,we have CA/2E 8.1SP1fr RPG.Here,we have no news from this product,it seems dead! We have to contact US support to obtain some news/upgrades,it's very difficult to have a schedule of upgrades.We use 2E since 95 and nothing has changed in the way of using it. Please reply to discuss. Thanks. "

So I guess I need to answer each of these and share some opinion along the way. After all, isn't this the point of a blog in the first instance?

Is it Dead?

Well the short answer to that is "NO". A quite emphatic one actually. CA have a roadmap for the 2e product with a host of community voted enhancements being included in 8.5 (GA July 2009, I understand). I am sure these were discussed at CA World. There was also a call from Daniel Leigh recently to the community about utilising Web Services directly in 2e as well as catering for the creation of ILE service programs from within the toolset.
I have been part of the Alpha and Beta testing and there is certainly some good potential in these areas.

Keeping updated about the product.

CA is a big company and I would guess that the local offices are often the last to know about product-centric announcements. The guys responsible for the 2E and Plex products are particularly vocal online and via a series of user groups. The US is the biggest installed base for these products, so I guess it is only right that the focus starts stateside. You can sign up to the forums, my blog and other fan sites, as well as the Plex2E PLC (Product Line Community), not to mention the local user groups in some regions (US and UK).

There are also Alpha and Beta testing programmes which I encourage you to consider.

Other useful links are below, which I have also added as dedicated links on the blog home page.

The CA Forums

http://caforums.ca.com/ca/?category.id=caplex

Communities Product Line Community

http://causergroups.ca.com/usergroups/UserGroupHome.aspx?ID=391

Also check out the other links I have on the blog home page. I link to the Plex and 2e Wikis which are resources we can all contribute to as well as specialist sites of key CA product partners.

Once you are a member of the PLC you will receive very regular updates from Bill Hunt and the team as well as invitations to webcasts.

And then you always have the other forums around the platforms the tools generate for. i.e. SystemiNetwork, although I guess this will be changing its name again soon once the Power System branding gains momentum.

Upgrades.

CA have explained in the past on numerous occasions, and the release schedule would confirm this: they issue major upgrades for the products every two years, with service packs in between which add functionality as well as a fix roll-up. See the CA website for more details about each of the product roadmaps along with release and support cycles.

On a side note SP2 has been available for quite a while (November 2007) and as I referred to earlier 8.5 is GA for July 2009.

Features introduced into 2e in recent years.

The following link (on the 2e wiki) highlights the changes that have been added to 2e in recent years. If you are considering or using Plex as well, then you should check out its wiki too. See the links at the top of the blog.

http://wiki.2einfo.net/index.php?title=Versions

How you use the tool, or how you have approached extending it, will dictate which of the features you are using. Often, as the tool has been around a while, we just use it as we always have. Not too different to those of us who use Word as if we were using Word for Office 4.3 instead of 2007.
If you are into componentisation of your 2e systems then you will be utilising many of the other features. However, there is no right or wrong way to use the product, simply what works for you. But do take a look at what is there and see if the new features can be applied to your environment.
I was quite shocked to see on a recent PLC 2E users survey that many shops are still on 7.0 and some on earlier versions.
My strong and hence bolded recommendation here is to get current.

Certainly 15 years ago the focus was that 2e generated every aspect of your system. Nowadays it forms part of a heterogeneous platform of servers and tools.

The base tool has remained largely the same with the focus on value add but 8.5 has quite a few base tool enhancements. I like the filtering now. Very neat. And recent releases have had quite a lot of neat shortcut function keys and subfile options that can increase developer productivity.

Lastly, I guess there is also the advent of the internet. With so much information available, if we are not sourcing our information via the net then we could be missing out on lots of it. And if you looked closely, or joined the PLC, you would know about the 4th Annual conference taking place in Ft Lauderdale, Florida, USA in September 2009.

Thanks for reading.
Lee.

Monday, June 30, 2008

Knowledge capture & use in technical support communities - Part 2

In Part 1 I discussed the problems facing the technical support team with overworked experts and a need to transfer their knowledge as efficiently as possible.

In Part 2 I will discuss how to successfully capture and store this knowledge in an efficient and, above all, useful way. I'll lead off with a brief overlap from last time as a reminder of where we got to.

The 'Virtual Expert'

From what has been discussed so far, it is clear that expert knowledge is required, but that tying up the expert in this process is seen as unproductive in the current climate. We cannot get away from requiring time from the expert, but we can minimise this time and capitalise on it by recording the knowledge in the right way.

The answer lies in recording the expert knowledge (on paper or, more usefully, electronically - see later) in such a way that it is as close as possible to the over-the-shoulder commentary.

There is often still a need to use numbered steps when accomplishing a task. Such steps provide structure and sequence and help with mental tracking when performing the task. There is no reason, however, why each step cannot contain more than simple 'input, output' or 'action, reaction' type information.

For maximum benefit, each step should be written in conversational language and explain what the user is doing, why they are doing it, what the expected outcome should be and at least make reference to any unusual, but known, variations.

Furthermore, before any of the steps, there should be an introductory section which describes why the user would perform the task, what pre-requisites there may be and definitions of terms, systems and the like. After the final step, make mention of any further tasks that may be a logical progression from the task described, but which do not form part of this process.

Don't take anything for granted

Whilst we are talking about capturing expert knowledge, it is important not to lose sight of the basics. Any documentation is devalued if it makes too many assumptions. In creating a documentation repository, an audience level should be decided - such as 'technically competent', or 'beginner' - and all documents should be written for that lowest common denominator. It is easier for a more expert user to skim over known material than it can be for a new person to work out the undocumented basics.

It is important to include examples in the documentation. Where possible, have the example show the most common scenario, as it is most likely that staff new to the task will use the examples. It is also worth giving additional examples if there are significant variations in a step. Providing examples helps the user to get closer to the over-the-shoulder situation.

The ultimate test for the documentation is to give the process to a person who is at this 'lowest level' and have them perform the task. You will be surprised by some of the information you have taken for granted in your early drafts. I know I was.

Structuring the documentation

For ease of maintenance, it is important to only ever store a piece of information in one place. To help achieve this structure, it is useful to allow for two document types in the repository - reference documents and process documents.

Process documents contain steps describing how to perform a task. Reference documents contain (mostly-) static information that supports one or more processes.

It is often necessary to refer to tables of information (such as a list of files, describing their usage) from more than one process document. By separating this type of information into a reference document, it can be referred to by multiple process documents without increasing the maintenance burden through multiple copies. Additionally, when the table requires maintenance, it is easier to locate (residing under its own title) and the maintenance can be performed without danger of corrupting the process documents. When properly structured, maintenance of the reference document can be accomplished without knowledge of the referring process documents.

Whilst reference documents tend to represent pure data, it is still important to keep the conversational language in mind. There may be naming conventions or other conventions which are being followed for the data and it is important to note this in the reference document to complete the picture for the user of the information, and equally importantly for the maintainer.

It is also beneficial to factor out sub-processes into separate process documents and refer to them from the major process documents. This is of value where a sub-process is part of more than one major process.

The major benefit of having information in only one place is realised when errors are amended or updates are applied. These have to be done in only one location and all related processes are automatically catered for, as they simply reference to this single occurrence.
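The single-sourcing idea above can be sketched as a tiny data model. This is purely illustrative; the class and document names (`ReferenceDoc`, `ProcessDoc`, the backup example) are my own invention, not from the article:

```python
# Minimal sketch of a single-sourced documentation repository.
# Process documents hold steps; reference documents hold shared data.
# Each piece of reference data lives in exactly one place, so an
# amendment automatically reaches every process that links to it.

class ReferenceDoc:
    def __init__(self, title, table):
        self.title = title
        self.table = table          # e.g. a list of files and their usage

class ProcessDoc:
    def __init__(self, title, steps, references=()):
        self.title = title
        self.steps = steps          # conversational, numbered steps
        self.references = list(references)  # links to ReferenceDocs, not copies

    def render_references(self):
        # Resolve the links at read time - no duplicated copies to maintain.
        return {ref.title: ref.table for ref in self.references}

# One reference document shared by two process documents.
file_list = ReferenceDoc("Nightly batch files",
                         {"GLTRANS": "General ledger postings"})
backup = ProcessDoc("Run nightly backup",
                    ["Quiesce the system", "Save libraries"], [file_list])
restore = ProcessDoc("Restore from backup",
                     ["Locate save media", "Restore libraries"], [file_list])

# Amend the reference once; both processes see the updated data.
file_list.table["GLTRANS"] = "General ledger postings (archived weekly)"
assert backup.render_references() == restore.render_references()
```

The design choice mirrors the article's point: the processes store a link rather than a copy, so a correction is made in one location and every referring process is automatically catered for.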

Capturing is understanding

The process of capturing information is time consuming and is best not left to the individual experts. Remember that they don't have that much time. Also, too many authors can devalue the repository by differing styles and levels of language.

The best solution to this is to have a single person (or possibly two or three) to build the documentation repository. This co-ordinator is then responsible for collation, setting style and keeping the language consistent. This person should not be expected to author all of the documents, but must be able to understand at least the broad concepts involved in order to ensure that appropriate structure is followed.

Each expert should be expected to provide a draft of the process or reference data, in a form approaching the final requirement. In some cases, where the co-ordinator's knowledge is good enough, they may author the document, but it should always be checked by the relevant expert.

Electronic storage for fast access

Following the structuring process above introduces one significant disadvantage in a paper-based documentation repository. Frequent referencing to other documents causes the reader to flip pages or have multiple documents arranged on the desk in order to complete a single process.

In Part 3, I will discuss some real world solutions to electronically storing, maintaining and delivering the captured knowledge.

Thanks for reading.
Allister.

Tuesday, June 10, 2008

Knowledge capture & use in technical support communities - Part 1

This three-part article is adapted from one I wrote almost 5 years ago when much of what you will read about was fresh in my mind. This adaptation addresses only the passage of time and some points of style and meaning for a wide audience.

Whilst software development is the subject of this blog, let us not forget those who (typically in large organisations) support the developers and others.

The nature of technical support communities.

Technical communities come in many forms, be they design teams, development teams or support teams.

Whilst design and development teams are largely about the creation process, they still have many day-to-day activities which are defined and repeatable. Support teams, although fulfilling an entirely different role, often have to create on a very short-term basis. So it can be seen that the different types of teams have similar requirements.

However, the support team seems, most often, to be the one to get out of control. The difference is that the support team is always working on a short time frame. In addition, support teams often become involved in project work and this adds to the complexity of the day-to-day activities, as the time frames are shortened still more.

Most often, you will find that staff in a support team are very good at what they do - they have to be to survive. Unfortunately, the higher the skill of the staff, the more reliant you are on those staff to keep the systems running. It is a difficult and time-consuming option to bring 'green' members into the team.

How many support managers have not recognised that documentation is a key part to the support process? I would wager very few. Fewer still, I propose, have succeeded in completing the documentation requirements within their team and reaped the kinds of benefits they were expecting.

Documentation, to the 'tech', is a four-letter word. I, myself, recall asking the question "Do you want me to document it, or do it?" Simple economies prevent the techs from having enough time to complete the documentation task and many welcome this excuse not to do it.

Another trait of support teams is the experts. In virtually any support team, there will be experts in various disciplines. Most often, however, these experts are relied upon to provide most of the resource in fixing problems in their area of expertise when they should, in fact, be called upon to share their knowledge.

Shared knowledge is a powerful tool. Experts will always be needed when particularly difficult or unusual situations occur, but the team as a whole should be able to leverage the experience to improve task turnaround times through a more even spread of the load.

Knowledge transfer

It has been documented in studies that the best way to learn something is to have an expert stand over your shoulder while you go 'hands on'. The reality of the situation in front of the learner, coupled with specific and pertinent comments or instructions from the expert gives the learner an experience often indistinguishable from the real thing. The learner also has the opportunity to ask direct questions in the context of what they are doing. Book learning, on the other hand, can only go so far with static examples and predetermined situations.

Perhaps the most important aspect of 'over-the-shoulder' learning, however, is that the expert is unlikely to simply recite steps by rote. There will be an accompanying commentary and usually a significant amount of reasoning on why things are done that way. This is very important in equipping the learner for when things do not go to plan.

Learning the steps of a process by heart is well and good when the process works. Most often, however, processes do not cover all possibilities and the rote-learner of the steps is going to come unstuck when an unforeseen, or simply undocumented situation arises. Unless the learner understands why they are taking the steps and what they should be achieving, they are almost as much 'in the dark' as prior to learning the steps.

Having knowledge about the nature of the process and the goings on under the covers helps get through many small deviations from the norm and also helps in issue resolution, as the learner is able to return to the expert with an hypothesis, or at least having done some basic checks suggested by the nature of the operation.

The key issue with this type of knowledge transfer is that, in the majority of cases, the expert is already overworked and has no time to spend standing over shoulders.

A secondary issue is that the expert may have to impart their knowledge, over time, to a number of different people, and this is inefficient.

The 'Virtual Expert'

From what has been discussed so far, it is clear that expert knowledge is required, but that tying up the expert in this process is seen as unproductive in most situations. We cannot get away from requiring time from the expert, but we can minimise this time and capitalise on it by recording the knowledge in the right way.

In part 2 of this article I will go into methods for capturing this knowledge in the most effective way.

Thanks for reading.
Allister.