Friday, December 16, 2011

The Sorcerer's Apprentice.

Hi,

I have been aware for quite a while of the 'Varianted Source Code' models that Roger Griffith has made available for the community, and today sees the release of version 1.1, which has C# client code support all ready for the GA release of CA Plex 7.0.  So we now have some additional sources :-).  FYI - I don't know when 7.0 is coming out, but it can't be far away now :-).

See the link below.


There are always pros and cons to any approach, but this solution seems pretty fair to me. Obviously, if you are a typical C++ and RPG IV client/server customer and not changing in the near future, then this post will be of little relevance (or is it - read on).  I can think of only one other small issue: if you prefer to use triples for setting the language at a function level rather than variants, then this might need some refactoring.

If, however, you are planning on migrating your client (or server, for that matter) to C# .NET/Java etc., you may as well start reducing the impact of the big move now.  By replacing calls to your own source code with the shipped code supported by the varianted models earlier in the process, you can only win.  Remember, these are available NOW and can be used with Plex 6.1.

This only leaves your own 'personal' or 'application specific' source code.  A little like User Source in 2E, some of these are just fit for one purpose and not migratable.  Others may simply need tweaking.  Not a bad time to do some impact analysis and determine whether you need to brush up on some C#.

Therefore, I have a small challenge for people.  If you have some useful generic source code in any language (Java, C++, RPG IV or C#), let's share it and see if others can convert it across all the platforms.  Then Roger and his team could keep the shipped models up to date and everyone's a winner.  For every routine that is sent to me I will PERSONALLY ensure that it is converted to the relevant C# code.  Any volunteers from the community to pick up C++, Java and RPG IV?  I can get these done, but not as quickly, I am afraid :-(.  This is courtesy of my bosses at www.sasit.co.nz.

I'm thinking of advanced string routines like those below (a rough C# starter for a few of them follows the list):


  • Count xString occurs y times in zString
  • Replace Double Spaces with single space
  • IsNumeric?
  • IsAlphaNumeric?
  • PasswordStrength
  • IsCharactersOnly?
  • CalculateStringLength
  • ReverseString
  • SentenceStyleCase
  • IsEmailAddress?
  • Replace xString with yString in zString
  • GenerateRandomNumber (Start, Max, ZeroFill?)
  • GenerateRandomString xLength
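
To get the ball rolling, here is a rough C# first cut of a few of the routines above.  Treat it as a sketch to be critiqued rather than an agreed 'shipped' version - the method names and exact behaviour are my own assumptions, and the other language conversions are where you come in.

```csharp
using System;
using System.Text.RegularExpressions;

public static class StringRoutines
{
    // IsNumeric? - true if the whole string parses as a number.
    public static bool IsNumeric(string value)
    {
        decimal result;
        return decimal.TryParse(value, out result);
    }

    // Replace double (or more) spaces with a single space.
    public static string ReplaceDoubleSpaces(string value)
    {
        return Regex.Replace(value, " {2,}", " ");
    }

    // Count how many times xString occurs in zString.
    public static int CountOccurrences(string zString, string xString)
    {
        if (string.IsNullOrEmpty(zString) || string.IsNullOrEmpty(xString)) return 0;
        int count = 0, index = 0;
        while ((index = zString.IndexOf(xString, index)) != -1)
        {
            count++;
            index += xString.Length;
        }
        return count;
    }

    // ReverseString - reverse the characters in the string.
    public static string ReverseString(string value)
    {
        char[] chars = value.ToCharArray();
        Array.Reverse(chars);
        return new string(chars);
    }
}
```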


I'm sure that this list could get quite extensive (suggestions welcome), but I am happy to co-ordinate this effort if people just add some stuff they want to share in the comments section.  So cut and paste some code and a description, or just make a request.

Thanks for reading. 
Lee.

P.S. Disclaimer - if code is copyrighted, please include and acknowledge the source etc.

Friday, December 9, 2011

Community Links

Added a few links: Lucio's site and Desynit's Feed Aggregator for Plex and 2E.

That's them.  The community ones.....

Tidied a few broken links whilst I was there.  If you want yours added, drop me a comment in the comments section below.

Thanks for reading.
Lee.

Monday, December 5, 2011

Translator for CA Plex project!!!

Translator required!   Übersetzer erforderlich!   Traductor necesario!
Traducteur obligatoires!   Vertaler Verplicht!   Traduttore obbligatori!
翻译!   अनुवादक आवश्यक!   Þýðandi Nauðsynleg!   Translator Kinakailangan!!
翻訳者必要!

Some of you know that I have been working on a small project with .NET C# and the CA Plex Model API.  The project was for an entity creator, and I decided that I wanted to learn and understand a little more about localisation with this technology too.

Here is the link - Entity Creator Blog post.

Well, the good news is that I worked it out...... The bad news is that I struggle a little with the actual translation.

I struggle in two areas:

  1. Context - Every time I type a small sentence or single word I get a few choices (Google Translate), which doesn't help me one little bit.
  2. Plex-centric words - I know that many of us use the Plex terms from an English perspective, i.e. I assume we all know a parent-child relationship as "Known By".

My call to the community is that I would like to internationali(s/z)e the entity creator before I aim to get it open sourced for the community to extend (if you want to), or to dissect if that is also of interest.
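
For anyone curious about the mechanics, the approach I worked out is the standard .NET one: a neutral resource file plus per-culture satellite resources, read through a ResourceManager.  The sketch below is purely illustrative - the base name 'EntityCreator.Strings' and the key 'KnownByLabel' are made-up examples, not the Entity Creator's actual resources.

```csharp
using System;
using System.Globalization;
using System.Reflection;
using System.Resources;
using System.Threading;

class LocalisationSketch
{
    static void Main()
    {
        // Assumes Strings.resx plus culture-specific files such as Strings.de.resx,
        // Strings.es.resx etc., compiled into satellite assemblies.
        ResourceManager resources =
            new ResourceManager("EntityCreator.Strings", Assembly.GetExecutingAssembly());

        // Normally the UI culture comes from the user's Windows settings;
        // it is forced here just to show the effect.
        Thread.CurrentThread.CurrentUICulture = new CultureInfo("de-DE");

        // "KnownByLabel" stands in for one of the 40-50 literals that need translating.
        string label = resources.GetString("KnownByLabel");
        Console.WriteLine(label ?? "<missing translation>");
    }
}
```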

So I am calling all German, Spanish, Danish, Indian, Tivolian or other language converters in the community to assist with the translation of around 40 or 50 literals.  If you are interested, please email me via LinkedIn (see link on blog) or leave a comment with your email address (I won't publish it) and I will contact you.

Thanks for reading.
Lee.

Friday, November 25, 2011

The great CA Plex 'Entity Creator' - Update


What started out as a wee (little) test application for me to learn and understand a bit about DotNet C# and the CA Plex ModelAPI has blown out (my obsession) into a full-featured Entity Creation Utility.   I am also fully aware of the differences in terminology and approach between us Plexers and the DotNetters, which has been really useful for my company, as we have a whole host of different development skills and disciplines and I feel like I can cross-communicate :-)

The cool thing is that my employer http://www.sasit.co.nz/ (shameless plug I know) and our clients (current and future) will benefit hugely from this development when they see just how easy it is to enter all the relevant details into a CA Plex application model and generate code for all our target databases.

If I am to be honest, there is still quite a lot to finish off, like:

  • Entering Non-Key fields (next project)
  • Dealing with Foreign Key Relationships/Optionality, SYS and Virtual Fields
  • Advanced validations
  • Error Rollback
  • Schema Import

Not to mention a rethink on the GUI and the application C# architecture now that I have learnt a lot more about this environment.

  • Code layout and structure
  • Object design
  • GUI, i.e. a TreeView control or something similar

However, for a work in progress (WIP), another one of those TLAs, I think that it is worth blogging about and giving some screenshots.  All feedback is useful and appreciated, and I believe I have finally fixed the comments section too. :-)

This all started with an innocent post called RTFM, and I will also follow up with some additional posts around the way I went about it and the features of the ModelAPI, along with a few of the traps that I fell into along the way.

The premise of the original exercise was to be able to quickly create a standard CA Plex entity with our default inheritance, Surrogate Key field and some internal naming conventions honoured....  I am now at the stage where the utility handles Surrogate (the DotNetters call this an auto-increment key) and/or Natural Keys, file-to-file key relations, override attributes, labels, narratives and low-level field and file/view implementation naming.  A far cry from the original brief, and I still have a roadmap as long as my arm.

So take a look below at a few screen prints with some basic commentary on what the utility currently achieves for me; future posts will show you the finished utility (hopefully).

Overview

A simple model with three entities

Fig 1. CA Plex Object Browser

These represent a standard Grandparent, Parent, Child hierarchy.  These could have been modelled differently, without 'Owned By' etc., but that modelling approach is a debate for another day.  For now, the triples that describe this 3-tier relationship have been entered as below.

Fig 2. The selected entity's definition triples.

Taking a closer look at entity 'LV3 level 3', you will see its entity attributes described as follows.

Fig 3. The entity attributes showing no overrides.

i.e. very simple - no attribute overrides and no renaming of fields etc.

We implemented a policy at our shop of using TLAs (three-letter acronyms) to describe our business entities, and we like to replace the inherited field names whether they are resolved via an 'Owned By', a 'Refers To' or even as virtual fields.  This is just an in-house preference and (we feel) it aids the myriad of people who need to write queries, data warehouse extractions and other extracts like data downloads and Excel spreadsheets.

We also tend to follow through with the TLA naming convention for our tables (physical files) and views etc., rather than defaulting to the generated names.

Fig 4. The File and Implementation names for the table and default views.

And I haven't even covered labels for the fields, or the entity and field narratives.  As you are all aware, this can lead to quite a bit of entry on our part for even a simple entity.

HENCE MY LITTLE UTILITY!!!!!.

I introduce the aptly named 'Entity Creator'.  I wanted to say that with a boxing-ring-influenced Master of Ceremonies voice, but realised that wouldn't translate on a textual blog.

Entity Creator, as described a little above, is a wizard that guides the developer through the perilous task of setting up an entity and its fields.  (It is currently tailored to our environment, but adaptable and extendable enough for others to take a look.)

So let’s for demonstrable purposes add an extra level to our setup called ‘LVL Level 4’.

Fig 5. The first of the Wizard screens.

By not selecting a surrogate-based key we are therefore selecting natural keys, and we get an opportunity to enter the 'Owned By' and 'Known By' relationships.

Fig 6. Natural keys showing ‘Owned By’ entities for selection.

This shows a selection of 'Owned By' and a list of the business entities in the model.  The application stores a list of the business entities, so some setup is required for old models, but that would be straightforward to complete, or I might even just import them sometime in the future.  Roadmap item?

You will see that the keys of the chosen entity are displayed for instant developer feedback.

Fig 7. Natural keys showing ‘Known By’ field and the options for key selection.

Now let’s enter the ‘Known By’ for this entity.  You will see that a different field set up is presented to the user and you simply enter the field name (The prefix is automatically applied so no need to repeat this).  And the select from a choice of fields (Configuration roadmap item already noted J).

If applicable, the character length and decimal places overrides are available.  If these are left blank, the details as depicted on the screen apply through normal inheritance.

Note: As I was new to C# and .NET (DotNet), I decided to hard-code this to 5 fields.  I don't regret doing this from a learning perspective, but I will refactor the application in the future to perhaps display these in a different way.

Clicking next will take us to the Field Labels and Narrative screen which can be bypassed (if you wish).

Fig 8. Editing of the labels for a field.

You will see I have indicated the values for the labels and the field narrative for the 'Known By'.  The 'Owned By' labels and narratives are already present due to the inheritance (assuming you entered them).  In the future this screen will also handle the non-key attributes, 'Refers To' and virtual fields too.  I've just got to get a bit better at DotNet first.

The final screen is the confirmation screen.

You get to choose which triples/objects are or are not created for your entity, and you can also see the statistics, i.e. what was created.  'Show Summary only' gives the number of objects, triples and narratives and the timings; 'Summary and Detail' also lists each item that was created.

Once you have created the entity you should see something like this.

Fig 9. Options for what we want created and the user feedback.

This is 17 objects, 33 triples and 2 narratives created and entered in 377ms.  Not bad aye!!!

Fig 10. Some of the new objects, triples and narratives shown in the model.

So I hope you like what you have seen, and I thank my company http://www.sasit.co.nz/ for allowing me the time to learn more about these technologies. (Visit our website for ideas on what we can do for your IT business.)   I hope that this blog inspires you to look at this technology yourself, and I'd be delighted to assist with your projects.

Until next time.


Thanks for reading. 
Lee.

Thursday, November 10, 2011

Engineering Flowchart

Rory Hewitt, one of the developers of the 2E product that we all love to use, recently posted an image which made perfect sense for solving common engineering problems.


I couldn't help but consider this problem with my 2E glasses on. :-)


Thanks for reading.
Lee.

Sunday, October 30, 2011

A little look at Plex-XML


Many of you may have read about the Plex-XML framework from Allabout GmbH.  There have been mentions online (the LinkedIn Plex group or the PlexWiki, for example).  The guys were at the conference in 2007 and caused quite a stir in the Plex community.  Well, the technology is still going strong.

I decided that I'd take a look.  This is the great thing about having a role focused on R&D and application architecture: you get to play with things.  This is, of course, allowed by my employer www.sasit.co.nz, who specialise in mission-critical systems hosting and development (harmless plug), so take a look at the web page.  The pressure comes when you need to back up your decisions, so interesting times.....

My investigation was a case of downloading the relevant stuff from the guys, as well as Eclipse (IDE), MySQL (database), the Java JRE (runtime), Sencha's ExtJS (JavaScript control library), Tomcat (web server) and some drivers etc.  A full list of requirements and how to put this all together is on the Plex-XML website (wiki).

So after some great support from the guys at Allabout I did end up with a fully working tutorial application.  

A screenshot from the tutorial.
But that's cheating though, isn't it?  All the hard work was done for me.  I needed to know what all the heavy lifting requirements are in order to evaluate further.  So I then decided to replicate this for one of my own entities, from 'Plex entity entry' right through to all the configuration files and framework parameters.

I chose a very basic entity with a numeric key, short description, date, long description (notes) and a status, and "NO!!!!", I am not going to describe every step here as:

  1. It's not within the scope of this blog post.
  2. It's already described in the Plex-XML wiki.

So I followed the instructions once more, and I made quite a few small mistakes with my naming conventions etc., so be careful when doing these.  But I did manage to get it all working once I knew what to change and create.

My new entity :-)

What I liked was the out-of-the-box stuff.  I never coded the date picker or the 'x of xxx characters remaining' stuff.  It just happened!!! :-)

As I have already alluded to, the pain for me was the configuration files etc., so I am chuffed to bits to hear of another company from Germany, Te@mconsult, who have been promoting their utility to help with the Plex-XML framework.  Check out this link.  The two combined make all the difference, and I for one can't wait until the public beta commences.

So ‘all in all’ this was a great experience.  I strongly recommend that everyone takes a look at many of the 3rd party patterns that are available for CA Plex.  Sure it’s great to be a developer and cut your own, but some of these frameworks and patterns from some of the major players have been on the go for a few years now and that effort vs price (if applicable) really is a “no brainer”.

Thanks for reading.
Lee.

Monday, October 17, 2011

Beginners Guide to the CA Plex Model API using C# .NET




Firstly, thank you to Rob Layzell for inspiring this blog.   If it wasn't for his lab exercises, I would have done this Model API exercise and subsequent blog using CA Plex and component import.  Now, whilst using a code generation tool to interrogate and update the code generation tool's own repository is probably a cool thing (I am sure there must be a word in the OO world to describe this :-)), in the back of my mind I was also acutely aware that release 7.0 is entering beta testing and the much anticipated C# .NET (WPF) client will soon be upon us.  So there isn't a better opportunity for me to sharpen the old grey matter with some learning than this.

I chose to continue with the C# coding for the simple reason that it would be worthwhile understanding what .NET developers need to do, as I will ultimately be looking to target .NET for my future applications.  And, of course, it gives me an opportunity to learn Visual Studio, pick up some pidgin C# and understand the inner workings of this environment.  All worthwhile skills for when the 4GL needs a little investigation, tweaking or debugging :-).

We have had C# for server code generation since release 6.0.  Release 6.1 brought WCF to strengthen this offering.  With 7.0 we will have the client side resolved too with the promise of XAML and WPF.  The web service import is another tool that simplifies the integration and consumption of web services, so, all in all, good times ahead.

So back to Rob's unintended blog influence.  His tutorials for .NET used numerous examples of Visual Studio and C# .NET.  One in particular that took my fancy was Lab 10, which related to consuming the PlexAPI COM component and creating triples in the local model.  The online help describes the ModelAPI as being useful for pattern designers, in order for them to automate much of the configuration and setup for using their patterns.  I also see it as an opportunity to ensure that another level of automation is included in the tool.
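
For those who have never touched COM from C#, the general shape of the code is worth seeing, so here is a minimal late-binding sketch.  Be warned that the ProgID and the commented-out member names are placeholders of my own invention - the real type and method names come from the ModelAPI documentation and Rob's labs - so treat this as the pattern only, not the actual API.

```csharp
using System;

class ModelApiShape
{
    static void Main()
    {
        // "CAPlex.ModelAPI" is a placeholder ProgID, NOT the real one -
        // look the actual value up in the CA Plex online help.
        Type apiType = Type.GetTypeFromProgID("CAPlex.ModelAPI");
        if (apiType == null)
        {
            Console.WriteLine("COM server not registered on this machine.");
            return;
        }

        // Late-bound creation of the COM automation object.
        object api = Activator.CreateInstance(apiType);
        Console.WriteLine("Created: " + api.GetType().FullName);

        // From here you would call the documented ModelAPI members, e.g. open the
        // local model and add triples, exactly as you would otherwise type them
        // into the model editor.  The calls below are illustrative comments only:
        //
        //   model = api.OpenLocalModel(...);
        //   model.AddTriple("MyEntity", "is a", "ENT Entity");
    }
}
```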

Anyhow, seasoned CA Plex developers know we already have metacode to influence the generated code.  We also have patterns and inheritance to ensure consistent design, or as I call it, conformity to the application architecture.  There are still, of course, plenty of areas with regard to modelling that a developer could easily get wrong when entering triples.

In my last blog I alluded to a small entity creation tool that I am creating (UPDATE - this has blown into a full-blown tool now).  This is to ensure that the fields and files that I enter into the application model conform to our standards, i.e. naming, inheritance, default field lengths, narrative being entered and labels etc.  As we are a new CA Plex shop (a long-time 2E shop) I am still working on these standards and experimenting, but I will share the code and a sample model on request (note that this does currently have a degree of hardcoding, but more than enough detail for people to get acquainted).

For those that are interested in taking this further and perhaps helping me improve the utility, just drop me your email in the comments section. :-)

For me, the ModelAPI is too quickly overlooked, but if you are serious about CA Plex and modelling then these are exactly the type of utilities (Add-Ins) that one should be considering.  They will further improve your team's productivity, as they are pretty quick and easy to tailor for your environment once you get your head around the API and the underlying model structure.  Just take a look at the StellaTools from George.

I recommend that any CA Plex developer take some time to understand what you can achieve with this feature.  This is quite an in-depth subject, so I feel that this will eventually become a four-part series, so, if you are keen....

Part II – ‘Model API and the model repository’ will cover the underlying architecture of the model.

Part III – ‘Key Model API Commands’ will go into detail about some of the core commands that you will need to understand in order to develop against the COM API, and will sew some of the theory covered in Part II together.

Part IV – ‘Some DotNet Tips and Tricks’ will show some programming tips and, I hope, should be enough to inspire a few of you to ‘RTFM’ with regard to the Model API.

All feedback is appreciated.  Until then.


Thanks for reading.
Lee.



Monday, October 3, 2011

RTFM

“RTFM!” (Read the effing manual) - that was the polite version.

That’s what I was told when I first started programming.  Not every time (obviously), but on that odd occasion where the question was a repeatedly asked one (by me) or that time the more senior programmer didn't eat his oats in the morning.

To this day I still hear this from the more seasoned developers around me.  But I must say that this phrase (I believe) is in steep decline.  Could it be said that my generation of developers may be the last to utter this immortal programming phrase?  More and more, you see (or should I say hear) “RTFM” being replaced by “Just Google it.”

Today I was working with CA Plex 6.1’s Model API.  I was trying to automate a common task that I perform in my model and ultimately cut down on some keystrokes and mistakes and, most importantly, ensure ‘THAT I’ conform to the standards ‘THAT I’ have decided on, and (hopefully) along the way help the other developers in the team :-).

I was working from some excellent examples from CA regarding the Model API and thought I’d try a small example using C# WinForms and Visual Studio 2008.  All I wanted to do was create a new entity with appropriate naming standards, fields and keys, all with correct inheritance from our patterns etc.  So, breaking the problem domain down into small manageable chunks (isn’t that now known as an agile sprint?), I decided to get the Entity Prefix (I like these), Entity Name and Entity Inheritance easily created first - a rough sketch of the kind of input checking I have in mind follows below.  The rest will come as the AddIn matures and I implement all the ideas and wizardry in the roadmap.
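
To make that concrete, below is the sort of small input structure and standards check I mean - a hypothetical sketch only (the TLA rule and the property names are my own illustration), not the AddIn's actual code.

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical wizard input - captures the three values mentioned above
// and enforces a simple naming standard before anything touches the model.
public class EntitySpec
{
    public string Prefix { get; private set; }        // e.g. "CUS"
    public string Name { get; private set; }          // e.g. "Customer"
    public string InheritsFrom { get; private set; }  // e.g. the pattern entity to inherit from

    public EntitySpec(string prefix, string name, string inheritsFrom)
    {
        if (!Regex.IsMatch(prefix ?? string.Empty, "^[A-Z]{3}$"))
            throw new ArgumentException("Prefix must be a three-letter acronym (TLA).");
        if (string.IsNullOrEmpty(name))
            throw new ArgumentException("An entity name is required.");

        Prefix = prefix;
        Name = name;
        InheritsFrom = inheritsFrom;
    }

    // The name as it would appear in the model, e.g. "CUS Customer".
    public string ModelName
    {
        get { return Prefix + " " + Name; }
    }
}
```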

As I have said, I am creating this pioneering utility using the Model API (version 3.0) in Plex 6.1, and I will deploy the final program as an AddIn.  I have also used this as an excuse to brush up a little on Visual Studio and C#, as we are going to be heavily reliant on this IDE once Plex 7.0 comes out and we are all doing DotNet stuff galore.

I will blog a little more in the future about how I did this (it is only basic and I am still learning), but I have to say that the Model API 3.0 is quite powerful - just ask George Jeffcock about his Stella Tools.

Here is a sneak peek at the current screen. 

As you can see, it is about as simple as one can get, but without much knowledge of C# I quickly became stuck, stranded, frustrated, challenged and peeeeeeed off.    Anyhow, forget asking Jeeves or posting on some technical forum.  For older scholars, picking up the manual or downloading that e-book PDF you had been considering is also a waste of time and effort.

If you want to code nowadays, it appears you just “Google it”. 

I am not sure whether this is like:
  1. A 4GL of C# coding
  2. A cheat sheet
  3. Or simply just good common sense to utilise a million experts rather than one book

.......BUT.......

I got the answers I needed pretty quickly and was able to continue with the pilot project in earnest.

I have a great mental roadmap for the utility and only time will tell how it matures.  But for all us elder statesmen of the Plex community, I'll borrow terminology from one of Ramon Chen’s key marketing phrases from the early days of Obsydian.

“Stop coding, start Googling.”

Thanks for reading. 
Lee.

Monday, July 11, 2011

Data Generation

We’ve had the ‘Ice Age’, ‘Bronze Age’, ‘Iron Age’ and now we are firmly entrenched in the technology era otherwise known as the ’Information Age’. We’ve had generation X and generation Y and even the ‘Generation Game’ with a plethora of celebrity hosts.

I guess I could affectionately refer to ‘now’ as the ‘Data Generation’.

This is a subject I have spent quite a bit of time on recently. Everyone knows how hard it is to create datasets for testing our application(s). It’s quite okay if we have a mature application or scrambled ‘production system’ data we can draw upon.

But!

What about those new systems and new modules? Really, do you want to manually input 200 locations for that new website you have created? Do you want to try and come up with 50 unique customer names?

I bet $100 that everyone has test data like ‘Mr Smith’ and ‘Tom and Jerry’, who live in ‘My Street’ or ‘The White House’. Sure, some of us may have written a program or two to create datasets like ‘Location 001’ to ‘Location 999’. These approaches are all well and good, but what about when you want to demonstrate your new functionality to the CEO, or perhaps to a prospective client if you are a software house?

Sylvester Stallone, 15 Kick Arse Street, Disneyland, BH 90210

is not going to cut the mustard, and nor is

Mr Me, 21a My Street, My Town, My City, ZP12345.

“Just use the live data”, I hear you say.

Well, this would work, as the data would survive most (if not all) of your validations, look good and be 100% realistic. But it doesn’t help with your marketing screen prints or online and in-house demonstrations, not to mention if someone presses the ‘send email’ button and Mr Smith from Browns Bay, Auckland, New Zealand gets a letter about his account overdraft or a product that hasn’t been ordered.

So there are just as many pitfalls with real-world datasets too, and I’ve barely touched on privacy laws and data etiquette.

Then there is application stress testing and performance!

Do you really want to enter 10m records into a database table? I needed to achieve this the other week to test out some SQL. Are you really going to write programs to populate these datasets and have expensive programming resources create your data for you? Well I hope not. Are you going to have multiple people performing this role in your organisation?

• Imagine having a centralised tool that could help you roll out data to multiple databases via direct connections or ODBC.
• Imagine creating realistic datasets from pre-defined lists or ‘Customer lists’
• Imagine being able to put 10m records into a database table in under 5 minutes.

Well this is not Hollywood fantasy or a pipe dream from a developer living on the edge of reason.

It’s reality.

After quite a few years away from day-to-day programming, I recently had a need to enter test data into a simple table. I wanted to enter 70 records into a table which had a numeric surrogate key, description, date and amount. This was to test some basic functionality I had written. The 70-record limit was to ensure I had gone beyond the default of 64 records for CA Plex BlockFetch functions.

Using SQL Server Management Studio and wanting to key in sequential, meaningful data, it took me the best part of 15 to 20 minutes to enter records like:

1, Grid 1 - Description 01, 2001/01/01, 1.11
2, Grid 1 - Description 02, 2001/01/02, 2.22

Etc

I then needed to do this in Grid 2 as I was testing some dual grid code I had written.

1, Grid 2 - Description 01, 2001/01/01, 1.11
2, Grid 2 - Description 02, 2001/01/02, 2.22

So, I copied the data to the second table and manually changed ‘Grid 1’ to ‘Grid 2’. Perhaps if I had better MS SQL knowledge (certainly I intend to improve it) then I might have been able to do this with a relatively simple UPDATE statement. In fact, the answer is yes, I now can (see the sketch below). However, the point is that I was dealing with a database that was new to me. CA Plex allows us to target many platforms, so I could have come unstuck in MS SQL Server as easily as on MySQL via ODBC or any other database, including IBM i DB2.
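
For the record, the statement I had in mind turns out to be a one-liner with T-SQL’s REPLACE function. Here is a hedged sketch of running it from C# - the connection string, table and column names are made up for the example.

```csharp
using System;
using System.Data.SqlClient;

class GridDataFixup
{
    static void Main()
    {
        // Illustrative connection string and object names only.
        string connectionString = @"Server=.\SQLEXPRESS;Database=TestData;Integrated Security=true;";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "UPDATE Grid2Data SET Description = REPLACE(Description, 'Grid 1', 'Grid 2')",
            connection))
        {
            connection.Open();
            int rowsUpdated = command.ExecuteNonQuery();
            Console.WriteLine(rowsUpdated + " rows updated.");
        }
    }
}
```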

Do I want to become an expert on all these DBMSs? The simple answer is yes and no.

• Yes. I want to become more familiar with the DBMSs, how they perform, how to tune application code and how to support the day-to-day administration and running of the systems.
• No. I don’t want to manually create data, use SQL scripts and imports etc. and know the syntax for many DBMS types just for entering some ‘poxy’ test data.

So, I want to channel my learning to what is important to me.

So, how did I solve my issue?

Well, I asked a dear friend of mine who I chat to most days: Mr Google. I entered ‘Data Generators’ and downloaded 3 or 4 of the tools that appeared on the first page. Nobody goes to page 2 anymore, do they?

I looked at three products. (Note: these are the home pages, as I know that over time companies sometimes rebrand products or the links change and the blog post goes out of date.) Suffice to say they all do data generation, among other things.


I quickly discarded Datanamic’s tool. Compared to the others it just didn’t stack up functionality-wise. I’d go as far as saying it wasn’t worth the effort of getting my corporate firewall opened for it.

http://www.redgate.com/ and http://www.sqledit.com/ were different stories. Both tools are excellent and easy to use. I would say that Red Gate’s tool looks more polished, and its flow and interface were easier to understand. SQLEdit’s tool catered for a larger number of databases natively, and many more via ODBC. Red Gate’s is for MS SQL Server. If targeting just MS SQL Server I’d go for that, as they will be able to integrate tightly with SQL Server versions, and as they have a whole host of products they will be kept honest by Microsoft’s changes.

But!!!!! I use CA Plex and I needed to target other DBMSs as well (two currently): MS SQL Server and, of course, IBM i (DB2/400). I am at a 2E shop (originally) and therefore need to reach this platform also. I have also recently worked on MySQL with Plex via ODBC, so the need to hit these DBMSs was real and present. Therefore, I purchased the SQLEdit tool.

With both tools I quickly spotted a bug or two, or had questions for the support teams about functionality and use. Considering I was on the trial version (not that this should matter), the support and service I received from both companies was absolutely first class. I feel like I can add Paul and Lynda of Red Gate and Igor of SQLEdit to my Christmas card list (the personal one).

Fixes were created and sent to me within days (by both companies), or, in the case of CYYMMDD data generation support for my traditional 2E DTE formats, in about 5 hours from Igor and his team. I was simply blown away by the agility of both companies.

The tools are priced within 100 USD of each other for the top-end features, and the comparable versions at 249 USD each make these tools a steal, as does the exchange rate here in NZ.

I will never ever ever ever ever ever ever (just getting my point across) manually create datasets again.

For me these tools have the following benefits, and I haven’t even fully explored some of the other functionality within them, e.g. the generation of referentially sound datasets.

• Quick, large datasets
• Pattern, import and rules-based generation
• Cheaper than doing it manually
• Can leverage existing data (Scramble – SQLEdit)
• Ability to create a centralised data pool for multiple developers, systems etc.

Remember that realistic datasets also help us to identify system limits (overflowing arrays etc.) and performance bottlenecks during the development cycle and not post-implementation, where the cost is significantly higher in terms of ‘budget to fix’ and ‘reputation’.

These tools (or something similar) should be on every developer’s tool belt, and if you haven’t got one, or are hiring someone who hasn’t - think again. If you are having software developed for you and your provider doesn’t have these tools in their development methodology, get a new supplier!

Perhaps you should contact me!

Thanks for reading.
Lee.

Post Edit:  I have also just discovered another mainstream generator called Data Maker from http://www.grid-tools.com/.  I have requested pricing, which to me means that it'll be outside the scope of the tools mentioned in this blog and many of our budgets.  If I can get an evaluation and form a meaningful opinion, I will post it in the future.

Sunday, December 19, 2010

7.0 is waiting in the wings.

Wow.

What a few months we have had in the Plex and 2E world since I last wrote a proper update.

The major piece of news from my perspective is the publication of the candidate features for the 7.0 release of Plex.  Yes, you heard me right!!  After talking up 6.5 in previous posts of mine and at the 2009 conference in Florida, CA have decided to skip to 7.0. 

Usually, an x.0 release number (increment) signifies a major release with some significant features.  Well, it looks like 7.0 isn't going to disappoint us.  I personally can't wait to get involved and do some beta testing on the .NET client (WPF and XBAP) and the WCF proxy for cloud deployment.

For the full recorded presentation, visit this link.

Interesting times ahead.

Thanks for reading.
Lee.