Power to Build


PowerBuilder: Datawindow – what tier?


In continuation of my recent post, PowerBuilder – what tier?, I would like to explore the technical side a little more. In any book or article on the power of PowerBuilder, you will always encounter the Datawindow. The Datawindow is the heart of GUI development in PowerBuilder.

By design, a Datawindow is truly 2-tier. You design a Datawindow by adding a SQL statement – that comes from the database tier, the 2nd tier. Then you design the GUI in which the data will be displayed on the client – tier 1. That is a Datawindow. If you follow the MVC pattern, the Model is there, the View is there, and the Controller is sort of built into the GUI itself (validation etc.). It was a great innovation for its time.


Fig 1. How a (2-tier) Datawindow works – courtesy: Hyderabad College

To use a datawindow in such an environment, all you need to do is the following:

dw_employee.DataObject = "d_employee"
dw_employee.SetTransObject(SQLCA)
dw_employee.Retrieve()

You are done! (I left out the details of the Datawindow object, connection parameters and retrieval arguments for simplicity.) In this case, you are directly connecting to the database from the client and the data flows directly into the Datawindow control. You can also share the data in one Datawindow with another (using the ShareData function). This allows you to retrieve once and show the same data in different formats.
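
For instance, here is a minimal sketch of retrieving once and sharing the result set with a second Datawindow control; the control and Datawindow object names (dw_emp_grid, dw_emp_graph, d_employee_grid) are hypothetical:

// Retrieve once into the primary Datawindow control
dw_emp_grid.DataObject = "d_employee_grid"
dw_emp_grid.SetTransObject(SQLCA)
dw_emp_grid.Retrieve()

// Share the same result set with a second control that presents it
// differently (both DataObjects must describe the same result set)
dw_emp_grid.ShareData(dw_emp_graph)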

Distributed Datawindows/Datastores

Then came the n-tier, or distributed, architecture revolution. With the Datawindow being at the heart of PowerBuilder design, how would you bring the middle tier in? Particularly when the middle tier is the one that's supposed to connect to the database.

Before going full on to a middle tier in the form of an Application Server, PowerBuilder floated the idea of Distributed PB as early as ver 5.0. They introduced the concept of HTML datawindows. They also introduced the idea of a Datastore, a non-GUI datawindow object. (Before the Datastore, we used to create a hidden datawindow, which still painted the datawindow behind the scenes and thus still carried the overhead.) These were steps in the right direction to distribute an application. (I have a gripe, though, that the Datawindow (control) and the Datastore don't share the same object hierarchy!! So you constantly have to test for the type of the object. How Sybase missed such a basic thing beats me!)
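
For reference, a minimal sketch of using a Datastore in place of a hidden datawindow (d_employee as in the earlier example):

// A Datastore is a non-visual Datawindow: same data buffers, no painting
datastore lds_employee
lds_employee = CREATE datastore
lds_employee.DataObject = "d_employee"
lds_employee.SetTransObject(SQLCA)
lds_employee.Retrieve()
// ... work with the rows without ever painting a control ...
DESTROY lds_employee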

Then came the Application Server called EAServer. To be able to retain the Datawindow and still follow an MVC pattern, Sybase introduced the distributed Datawindow. Well, it wasn't exactly an attribute of the data model, but a mechanism in the datawindow object to synchronize a Datawindow (or Datastore) object's structure and data across the tiers. To do this, they serialize/deserialize a datawindow using a couple of new methods – GetFullState and SetFullState. See here for more detail.

Fig 2. 3-tier architecture – courtesy: SimCrest

With the distributed Datawindow, you create a Datastore in the Application Server (EAServer) and synchronize it with a client-side Datawindow (control) of the same structure.

With the Datawindow control on the client, no longer connecting to the database directly, the above example requires some code adjustments:
Client

dw_employee.DataObject = "d_employee"
// Get the blob from the server
Connection.CreateInstance(iuo_employee)
lblb_dwblob = iuo_employee.RetrieveData()
dw_employee.SetFullState(lblb_dwblob)
...

Where iuo_employee is a non-visual object created on the server.

Server

Function blob iuo_employee.RetrieveData()

...

ds_employee.DataObject = "d_employee"
ds_employee.SetTransObject(SQLCA)
ds_employee.Retrieve()
lblb_dsblob = ds_employee.GetFullState()

// Send the blob back to the client
Return lblb_dsblob

 

Fig 3. Simple implementation of a distributed Datawindow

There you have it. PowerBuilder somehow did manage to distribute its GUI object called the datawindow. You essentially create a non-visual Datastore that has the same attributes as the Datawindow control, retrieve into it, and send the result as a blob to the client to be shared and shown in the Datawindow control.

The only problem is that you are sharing both the data and the structure of the Datawindow object between client and server. This is slightly in contrast with how other platforms would implement this. In Java, for example, your JTable stays on the client, while the TableModel could be made to use a middle-tier object to fetch the data. The middle-tier object could then connect to the database to get the actual data.

I would have preferred it if they had made the Datastore the data/model portion of the Datawindow, instead of making it a new object that is a Datawindow minus GUI. That way, we could have reused the ShareData function in a distributed capacity in some way and implemented the blob handling behind the scenes. But that's crying over spilled milk. EAServer and the associated architecture are going away, and thus the idea of the distributed datawindow may disappear from PowerBuilder altogether. (However, Appeon may carry the concept forward. See here.)

PowerBuilder and Source Code Control (SCC)


I posted about Source Code Control in general earlier. Various source code control tools exist, known as Version Control Systems (VCS), Source Control Systems (SCS) or even Source Code Control Systems (SCCS)*. If you noticed, these tools work at the file level – the smallest unit that can be handled in any operating system. Typically, you open your version control system, check out the necessary files, work with them and check them back in.

*PB refers to the SCCS simply as a Source Control System (SCS). In this post, we will refer to it as SCS or as the SCC tool, since it has to implement the Microsoft SCC API.

Source Control in an IDE

If you are working in an IDE environment, this poses a problem. You edit, compile and test from within the IDE; why would you want to go out to the OS just to work with files from the version control system, then come back into the IDE to complete the development cycle, and go back out to the OS and the SCC tool again to check the work back in? (Trust me, I've done this. Typically, you may do this with Java or C/C++ files, depending on the IDE/editor you use.) To alleviate this, the IDE/editor may provide toolbar buttons to execute OS commands, so you can run them from within the IDE/editor. But it's not tightly coupled.

Seriously, wouldn't it be nice to have the SCC integrated into the IDE? But then again, considering how many different SCC tools exist, an IDE builder can't add support for every possible tool. This would be easy if there were only one type of SCC interface. Or if all the tools followed a certain standard!

Actually, several such standards exist; on Windows it's pretty much based around Microsoft's Source Code Control (SCC) API**. By implementing this API as the interface to their own SCS (VCS), vendors can "standardize" their product. Of course, internally they could do totally different things, but as long as they provide the standard set of interface functions, they will all look and work the same to the systems using them. If IDE tools build their SCC operations around this API, they will be able to "talk" to the SCS (or VCS) from various vendors easily. Both Visual Basic and PowerBuilder offer such connectivity.

** This is an older Microsoft Standard. See here for more details about newer MS Standards. Since classic PB doesn’t support the newer standards out of the box, we will have to stick with legacy SCC API.

PowerBuilder and Source Control

But PB poses a further challenge. Whereas Visual Basic saves its objects and programs as files at the operating system level, PB saves its objects and their source code in a library file called a PBL ("pibble"). The smallest unit the OS sees in a PB application is the PBL file, not individual object files. So we cannot really track the changes in a PB application correctly if we do check-in/check-out at the PBL file level. What we need is a way to track at the individual object level, the smallest source code entity in PB.

PB offers a plugin approach for 3rd-party SCC tools to work with it. It also provides a wrapper around the SCC API mentioned above, so the source control systems themselves don't need to know about PB objects. PB exports and imports objects to files before passing control to the SCC plugin software.

In this post, we will see how PowerBuilder can be set up to work with different SCCS tools. In particular, we will explore PB Native and PVCS; later, we will explore how PB can be made to work with SVN. Only an SCS (or VCS) that implements the Microsoft SCC API will work with PB.

Setting up Workspace to use SCC provider (SCS or VCS)

First of all, to enable Source Code Control in a PB application, you need to pick your SCC provider. You do this by right-clicking on the project workspace and choosing the Source Control System (SCS) tab. Here you enter the user credentials for connecting to the SCC provider. Once you are connected, the SCC status of every object shows up as a small icon before the object name in the System Explorer. (A + sign marks a new object not yet in SCC, a small circle marks those that exist in the repository, and a check mark marks an object checked out – either by you or by others.)


Fig 1. Connecting to a SCC Provider at the Workspace level

As shown, there is a drop-down list to choose the SCC provider to use for the current PowerBuilder project. The SCC wrapper inside PB takes care of translating to and from the SCC tool.

Where is this information stored?

I really wanted to find out where all this information is stored. The workspace file (.PBW) would be a logical place, but it is not stored in this file; the .pbw file seems to be just a list of the targets in the project workspace. Then I looked around and found that it is stored in the Windows registry. My friendly fetcher, Registry Scanner, came to the rescue: I searched for "Merant Version Manager" (another incarnation of good old PVCS) and found it – stored neatly in the registry.


Fig 2. List of SCC Providers available to us inside PowerBuilder is stored in Registry


Fig 3. A specific SCC Provider chosen at the Workspace level is stored in Registry

When you choose a source code control provider in the PB workspace properties, PB actually stores the details in a registry key named SourceControl. If this key is missing, it means you didn't set up SCC in the workspace properties. So, if you have a workspace that has problems connecting to SCC on open, you may never be able to open the workspace. In such a case, if you rename or delete the SourceControl key, PB will no longer try to connect the workspace to the SCCS.

Notice that PB has replaced all the "\" characters with "$". PB has quirky ways with escape characters; the tilde (~), for example, is used as an escape character a lot in PB.
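
For instance, in PowerScript string literals the tilde escapes quotes and control characters:

string ls_msg
ls_msg = "She said ~"hello~""          // ~" embeds a double quote
ls_msg = ls_msg + "~nline 2~ttabbed"   // ~n is a newline, ~t a tab
ls_msg = ls_msg + " ~~ literal tilde"  // ~~ yields a literal ~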

Incidentally, the above image shows other keys like Connection, MRUList, Painters etc. These hold the "Recent Objects", "Recent Connections" and "Recent Workspaces" listed under the File menu.

Object level Tracking in SCCS

Remember, the PB code is now tracked at the individual object level, and any SCCS only knows how to store files. So, how does PVCS know about objects in PB? It doesn't!! This is where PB export files come in handy. Every type of object can be exported to a text file with a specific extension. (See here for a nice post on exporting PB objects as files. If that link ever breaks, see here for my take on it.)

When you check out (or Get) a copy of an object to your working directory, PB (the SCC wrapper) exports it as a file to the file system. The file is made writable if it's being checked out; in that case, the object is also locked in the PB IDE. Anything that can be exported as a text file will be stored in the SCC repository. So a PB target (which is a file with a .pbt extension) will have SCC status. All PB objects such as windows, applications, Datawindows etc. will have SCC status. Right-clicking on such objects in the System Explorer shows additional, SCC-related options in the popup menu. A PBL itself cannot be checked into the repository, at least not from within PB, so right-clicking on a PBL will not include SCC options in the menu. In general, the SCC options show up only if the workspace is set up to use Source Control and the object is "SCC"able.


Fig 4. Objects that can be stored in SCC Repository

What is a PBG file and why do we need it?

To keep track of the SCC status of the objects in a library, PowerBuilder uses a .PBG file*. Imagine: once all the objects are exported and checked into PVCS, how will you know which object goes into which PBL, in case you ever have to recreate the PBL file from sources (bootstrapping, for example)? This is where the PBG file comes into the picture:

Any components you then add to the libraries will be added to the PBG. A PBG file is created for each PBL and is a laundry list of the objects in that library for PVCS to use. Anything you add/remove/change will be reflected in the PBG file, and this is what is used to sync up with PVCS.

It is essentially a mapping of each object to the library it's supposed to be in. The file contains a list of all the objects in a PowerBuilder library (PBL) file or, conversely, it tells us which PBLs these objects (really, export files) need to be loaded into the moment they come out of the SCCS.
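
From memory, the PBG file itself is a small text file that looks roughly like this (the paths and object names below are made up):

Save Format v3.0(19990112)
@begin Libraries
 "c:\myapp\its_clientmain.pbl" ""
@end;
@begin Objects
 "w_login.srw" "c:\myapp\its_clientmain.pbl"
 "d_employee.srd" "c:\myapp\its_clientmain.pbl"
@end;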

When objects are added or deleted in the library and checked into version control, the PBG file is automatically checked out, updated and checked back in, by PowerBuilder.

Not having the PBG file in sync with the PBL may cause unexpected results. For example, if you added a new object to a PBL (say, while NOT connected to the SCCS) and the PBG file wasn't updated with it, that object will be missing if the PBL is ever bootstrapped from the existing object files in the SCCS, which could result in compile or runtime errors.


 Fig 5. PBG files

* See here for a nice description why PB needs a PBG file.

Troubleshooting

Removing SCC from Workspace having issues

Sometimes the connection between the PB IDE and the SCC system doesn't work. If you have set it up to always connect to the SCC system, PB won't come up at all. In such a case, edit the registry and "turn off" the SCC option. After this, PB will open fine again. If you need to connect to SCC, you will have to reconnect to it, as shown in Fig 1 above.

Keeping PBG file in sync manually
If you are done checking in new objects for a library/target, you can open the PBG file (in the same folder as the PBL) in any text editor to check if it's up-to-date.
If for some reason your PBG is out of sync with the one in SCC, you can lock it in SCC and check it in directly from the source directory.
Update, 05/30/2017
Noticed that Agent SVN added a link to this post in their description of setting up their SVN client for PowerBuilder. See their page for more details.

Some Useful Links

http://pbdj.sys-con.com/node/42500

PBDJ article that explains it all – PB with SCC.

http://codicesoftware.blogspot.com/2009/05/plastic-scm-powerbuilder-installation.html

The author describes using PowerBuilder with Plastic SCM as SCC tool.

http://anvil-of-time.com/wordpress/powerbuilder/powerbuilder-simple-source-control/

Agent SVN:

http://www.zeusedit.com/agent/ide/pb/classic.html

 

PBGrep – A simple Search Tool for PowerBuilder Code – in Perl


I wrote this script a few years ago. It comes in handy if you have a lot of PowerBuilder export files and would like to search for specific patterns, sort of like grep does. To use it, all your PowerBuilder code must be exported to files (.sr? files). (We always keep the source files.)

The simplest usage is to provide a pattern with the -p option and a directory to search in with the -d option.

For example, to look for all the PB source files that have "dropdownlistbox" in them, I use:

pbgrep.pl -p "dropdownlistbox" -d z:

This will produce the listing below. Each entry shows the source file name, the line # in the file, the event/function name with the line # within that function/event, and finally the line containing the pattern we are looking for.

Z:/Core/CLT/ITS_ClientAdHocQry/w_adhoc_results.srw(1020):
selectionchanged;(2):
type ddlb_printlayout from dropdownlistbox within w_adhoc_results

Z:/Core/CLT/ITS_ClientCase/w_cse_caseflow_rule_defn.srw(1205):
selectionchanged;(54):
type ddlb_resulttype from dropdownlistbox within tabpage_resultdetails

Z:/Core/CLT/ITS_ClientMain/w_login.srw(560):
clicked;closewithreturn(w_login,(2):
type ddlb_serverselection from dropdownlistbox within w_login

Z:/Core/CLT/ITS_ClientReporting/w_search_correspondence.srw(1282):
clicked;n_cst_correspondence(2):
type ddlb_printqueue from dropdownlistbox within w_search_correspondenc

Z:/Core/DES/ExpertSystem/Client/ITS_ClientEpsDesigner/w_login.srw(473):
clicked;closewithreturn(w_login,(2):
type ddlb_serverselection from dropdownlistbox within w_login

I could use the -F CSV option to generate a file that opens in Excel. All the entries above for each file will be collapsed into a single row.
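
For example, reusing the pattern above (per the usage below, -oa derives the output file name from the pattern):

pbgrep.pl -p "dropdownlistbox" -d z: -F CSV -oa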

Here is the complete usage from the script:

usage: $0 [-hv] [-d dir] [-p pattern] [-F TAB|CSV|REPORT|LIST*] [-o<filename>|-oa]

-h : this (help) message
-d dir : Directory to search in
-p <pattern>: Pattern to search

-F <format>: Output can be TAB or CSV delimited or LIST or Report (default)
LIST outputs object name and function name where pattern was found.

-o <filename>: Optional; if missing, the output is sent to the screen (STDOUT)
-oa : Optional, when present, pattern is used for name of the file (with special chars removed).

example: $0 -v -d Z:\LA -p <pattern>
If <pattern> has spaces or is a regular expression, enclose it in double quotes

Script: http://pastebin.com/kszxR1ar

Here is the full Perl script:

# pbgrep.pl - Grep-like utility for a PowerBuilder Application
# Usage: pbgrep.pl ....

# Author: Sam Varadarajan
# Date: 12/01/2009
# Purpose: Script to grep PB export files to find patterns; Each find will show the function name and line #s.

# Modification History:
#
# Copyright, Sam Varadarajan, 2009 - 2014

#
# Globals
#
use vars qw/ %opt /;

init();
my $ptrn = $opt{'p'};
my $dir = '';
my $verbos = 0;
$verbos = 1 if $opt{'v'};
my $file_name = '';

if ($opt{'o'})
{
	$file_name = $opt{'o'};

	#oa for automatic file name
	if (!$file_name || $file_name eq "a")
	{
		$file_name = "search_".$ptrn.".txt" ;
	}
	$file_name =~ s/ /_/g;
	$file_name =~ s/[^a-zA-Z0-9_.\-]*//g; #derive the name from the pattern
}
else {
	$file_name = '-'; #stdout
}

open FILE, ">$file_name"; 

#my $bare = 0;
# see below $bare = 1 if $opt{'b'};

my $format = "REPORT"; # default Report
$format = uc($opt{'F'}) if $opt{'F'};

my $delim = "\t";
if ($format eq "CSV")
{
	$delim = ",";
}
elsif ($format eq "TAB")
{
	$delim = "\t" ;
}
elsif ($format eq "LIST")
{
	$delim = "\t" ;
}

usage() if (!$ptrn);

print $ptrn."\n";

$dir = $opt{'d'} if $opt{'d'};
if (!$dir)
{
	$dir = 'Z:\\';
}

if ($ptrn)
{
		process_dir($dir, \&process_file, $ptrn);
}

close(FILE);
#end of main

sub process_dir {
    my ($path, $code_ref, $ptrn) = @_;
    my ($DH, $file);
	#debug print $path;
    if (-d $path) {
        unless (opendir $DH, $path) {
            warn "Couldn't open $path.\n$!\nSkipping!";
            return;
        }
        while ($file = readdir $DH) {
				#DEBUG print "processing $file\n";
            next if $file eq '.' || $file eq '..';
			#recursion.
            process_dir("$path/$file", $code_ref, $ptrn);
        }
    }
	elsif (-f $path && (grep /\.sr[uwf]/, $path))   # PB export files (.sru, .srw, .srf)
	{
#DEBUG		print $path." ".$ptrn."\n";
		$code_ref->($path, $ptrn);
	}
}

sub process_file()
{
	my ($path, $ptrn) = @_;
	my $proto_found = 0;   # was $proto_done; the loop below sets and tests $proto_found
	my $curr_func = "";
	my @words = ();
	my $end_proto = 0;
	my $func_line_nbr = 0;
	my $file_line_nbr = 0;
	my @lines = ();
	my $in_proto = 0;
	my $cnt = 0;

	open (SRFILE, $path)
    or die "$0: Cannot open $path: $!\n";
	#DEBUG 	print "debug*** ".$path." ".$ptrn."\n";

	@lines = ();

	# print "processing $path\n";

	# this assumes every file will have forward prototypes
	while (<SRFILE>) {
		$file_line_nbr++;
		next unless ($_ =~ /forward prototypes/);
		$proto_found = 1;
		last;
	}
	if ($proto_found)
	{
		while (<SRFILE>) {
			$file_line_nbr++;
			next unless ($_ =~ /end prototypes/);
			last;
		}
	}

	LOOP1: while (<SRFILE>)
	{
		chomp;
		s/\t+/   /g;
		s/^[ ]*//g;
		$file_line_nbr++;

		@words = split(/\s+/);

#		if (((/public/i || /protected/i || /private/i) &&  /function/i || /subroutine/i) || (/event/i))
# SV, 3/17, a line with 	inv_log.Event ue_closelogfile() caused it to think it was inside an event. Added ^ to event check.
# REDTAG, may be we will have to treat them separately and also, what if there was a function statement or event that
# was commented out?

		if (((@words[0] =~ /public/i || @words[0] =~ /protected/i || @words[0] =~ /private/i) &&
				(@words[1] =~ /function/i || @words[1] =~ /subroutine/i)) ||
			(@words[0] =~ /^event/i   && (@words[1] !~ /Post/i && @words[1] !~ /Trigger/i)))
		{
			if (@words[0] =~ /^event/i)
			{
				$curr_func = @words[1];
			}
			else {
				$curr_func = @words[3];
			}
#			print "found func header@words[3]\n";
			$func_line_nbr = 0;
			next LOOP1;
		}
		if ($_ =~ /^[ |\t]*end function/i || $_ =~ /^[ |\t]*end event/i)
		{
			if ($cnt && ($format eq "LIST"))
			{
					$path =~ s/sr.[ ]*$//g;
					print FILE $path.$curr_func."\n";
			} else {
				if (@lines)
				{
					my $count = scalar(@lines);
					foreach (my $i=0; $i < $count; $i++)
					{
						my $li = $lines[$i];
						#DEBUG print "*****".{$lines[i]}."\n";
						if ($format eq "REPORT")
						{
							if ($ptrn eq $curr_func)
							{
							print FILE "<<Function definition>>\n";
							}
							print FILE $path."(".$li->{'FILE_LINE'}."):\n   ".$curr_func."(".$li->{'FUNC_LINE'}."):\n      ".$li->{'LINE'}."\n\n";
						}
						else {
							print FILE $path.$delim.$curr_func.$delim.$li->{'FILE_LINE'}.$delim.$li->{'FUNC_LINE'}.$delim.$li->{'LINE'}."\n";
						}
					}
					#VERBOSE print "****# of items found = ".scalar(@lines)."\n\n";
				}
				else {
					# VERBOSE print $path."\t".$curr_func." No match found*******\n\n";
				}
			}
			$curr_func = "";
			$end_proto = 0;
			$func_line_nbr = 0;
			@lines = ();
			$cnt = 0;
			next LOOP1;
		}
		# all other lines
		$func_line_nbr++;
		if ($_ =~ /$ptrn/ig) {
			$cnt++;
			my $line_info = {};
			#if (!$bare)
			if ($format ne "LIST")
			{
				# bare format - list of functions (obj.func)
				$line_info->{'FILE_LINE'} = $file_line_nbr;
				$line_info->{'FUNC_LINE'} = $func_line_nbr;
				$line_info->{'LINE'} = $_;
			}
			push(@lines, $line_info);
		}
		next LOOP1;
	}
	close(SRFILE);
}

#
# Command line options processing
#
sub init()
{
	use Getopt::Std;
	my $opt_string = 'hvbd:p:F:o:';
	getopts( "$opt_string", \%opt ) or usage();
#	use Getopt::Long;
#	GetOptions("h"=>\$opt{'h'},
#		"v"=>\$opt{'v'},
#		"b"=>\$opt{'b'},
#		"optional:o"=>\$opt{'o'},
#		"d"=> \$opt{'d'},
#		"p"=> \$opt{'p'},
#		"F"=> \$opt{'F'}
#	); 	

	usage() if $opt{'h'};
}

#
# Message about this program and how to use it
#
sub usage()
{
  print STDERR << "EOF";

	This program greps PB export files for a pattern; each match reports the function/event name and line numbers.

	usage: $0 [-hv] [-d dir] [-p pattern] [-F TAB|CSV|REPORT|LIST*] [-o<filename>|-oa]

	-h        	: 	this (help) message
	-d dir    	: 	Directory to search in
	-p <pattern>: 	Pattern to search

	-F <format>: 	Output can be TAB or CSV delimited or LIST or Report (default)
						LIST outputs object name and function name where pattern was found.

	-o <filename>: Optional; if missing, the output is sent to the screen (STDOUT)
	-oa 			:	Optional, when present, pattern is used for name of the file (with special chars removed).

	example: $0 -v -d Z:\LA -p <pattern>
	If <pattern> has spaces or is a regular expression, enclose it in double quotes
EOF
	exit;
}

Gotcha: PB Deployment – SYSTEM var not supported error


When you work on code deployed to n-tier, you have to be a little more careful while defining the interfaces of your objects. An n-tier application typically runs in a heterogeneous environment, meaning objects in different languages/platforms may talk to each other through a common interface. If you are not careful, your object may end up being passed to a tier that doesn't know how to interpret it. You have to make sure the data types in the interface functions exist on all sides. Otherwise, you will get unexpected errors at compile or run time.

This is true for PB as well, if you are deploying your code to EA Server. Mixing system objects and types into an object's public interface will cause errors while deploying to EA Server, as shown below.

1. CORBA IDL Error on PB System objects

While deploying this project (p_d_adm_roles_lh) to EA Server, PB throws the following error message:


 ---------- Deploy: Deploy of p_d_its_adm_roles_lh
Doing Incremental Rebuild...
Generating IDL for Selected Components...
Generation Messages:
    Deployment Warning: SYSTEM Variables Not Supported.   The following bypassed for component 'im_adm_roles_impl':    n_txn
    Deployment Warning: SYSTEM Variables Not Supported.   The following bypassed for component 'im_adm_roles_impl':    n_txn
 ---------- Finished Deploy of p_d_its_adm_roles_lh

Here is what the Sybase documentation says:

Deployment error or warning (from PowerBuilder): SYSTEM variables not supported

Public instance variables and arguments to public functions can be any of: standard datatypes, structures, custom class user objects that have been deployed as EAServer components, or ResultSets.

If you are using system datatypes (transaction, data store, and so on) as instance variables, declare them as protected or private. If you are using system datatypes as function arguments, declare the function as protected or private.
The real problem was in the CORBA interface. I had declared a public instance variable called n_txn of type Transaction, which is known to PB but not to CORBA IDL (Interface Definition Language). So, this cannot be in the public interface of the object. CORBA IDL only allows types that can be mapped to its own. User-defined objects are allowed, as long as they themselves are composed of simple types that IDL allows. More on these in a separate post.
While deploying a component to EA Server, PB generates IDL and the related proxies and stubs/skeletons for the CORBA interface. It checks to make sure the generated IDL complies with the rules of IDL definitions. In the above example, since PB's Transaction object cannot be mapped to any IDL type, PB prevents deployment and throws the error, thus avoiding surprises at runtime.
The ideal solution is to avoid non-standard types in the instance/shared variables of objects to be deployed to n-tier. But if you really have to define instance/shared variables of types that are not translatable to IDL, make them private or protected. The IDL interface is generated only for the public attributes and methods.
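
A minimal sketch of the fix, in the component's instance variable declarations (n_txn as in the warning above):

// Instance variables of the component to be deployed
// Before - public by default; IDL generation bypasses the component:
// transaction n_txn

// After - protected, so it is left out of the generated public IDL:
protected:
    transaction n_txn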

References:
http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.help.eas_5.2.eastg/html/eastg/CDDDGIFJ.htm

http://www.omg.org/gettingstarted/omg_idl.htm#TypeSystem
http://www.ibm.com/developerworks/webservices/library/co-corbajct3.html
http://documentation.progress.com/output/Iona/orbix/gen3/33/html/orbix33java_pguide/IDL.html
http://www-cdfonline.fnal.gov/daq/CORBAXXX/tutorial.html
http://docs.oracle.com/cd/E13203_01/tuxedo/tux80/cref/member.htm#1120423
http://www-itec.uni-klu.ac.at/~harald/corba/idldatattypes.html
http://www.slideshare.net/arnoldindia/corba-idl

PB Gotcha – Embedded (inline) SQL – SELECT..INTO


Just a quick gotcha I stumbled on in PB code, concerning SELECT..INTO. As you know, SELECT..INTO expects one row, and it works well as long as there is only one row. While researching another issue, I saw the SQL below in one of the objects in our application.

of_get_cutoffdate

...
//
Date ld_CutOff
...
SELECT TO_DATE ( PARM_VALUE , 'DD-MON' )
INTO :ld_Cutoff
FROM SYSPARM
WHERE TRIM ( PARM_NAME ) = 'CUTOFF_DATE';
...

There are 4 rows for this parameter in the SYSPARM table. I originally expected ld_Cutoff to be NULL (or 01/01/1900 – the default for date), as I thought this SQL would error out because of too many rows. To my surprise, the variable had the value from the first row.

The gotcha here is that PB does not throw an exception when a DB error occurs. Even though the above SQL resulted in an error, PB silently fetches the first row's value into the variable, sets SQLCODE to -1 and puts the error message in SQLErrText. (See attachment.)

Even though the host variable got the value from the 1st row, this is not guaranteed and is not a documented feature in PB or Oracle. If we anticipate more than one row for the SELECT, we should add conditions (ROWNUM, cursors) to filter it down to one row.

(Oracle SQL/PL/SQL behavior: in Oracle, the same SQL would generate a TOO_MANY_ROWS exception. There, too, the SQL seems to set the host variable with the value from the 1st row, but Oracle does not recommend relying on this. See the usage notes in reference 1 below.)

Irrespective of the # of rows fetched, we always need to add proper error checks around the SQL. In PB, check SQLCA.SQLCode and print messages using SQLCA.SQLErrText. (In Oracle, handle the TOO_MANY_ROWS exception.)
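
A minimal sketch of the corrected embedded SQL, assuming Oracle and the same SYSPARM table as above:

Date ld_CutOff

SELECT TO_DATE ( PARM_VALUE , 'DD-MON' )
  INTO :ld_CutOff
  FROM SYSPARM
 WHERE TRIM ( PARM_NAME ) = 'CUTOFF_DATE'
   AND ROWNUM = 1;   // filter to a single row (Oracle-specific)

// Always check the result of embedded SQL
IF SQLCA.SQLCode < 0 THEN
	MessageBox("Database Error", SQLCA.SQLErrText)
END IF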

Note: A lot of our SQLs getting SYSPARM are done this way. PR4297 was created to add corrections to these.

Reference:
1. Oracle PL/SQL User Guide

Sybase EAServer: Taking Control – Tomcat + EA Server


Today another coworker of mine was trying to get his XAMPP setup to work with our EA Server. (We have a PB application and Java objects deployed to this EA Server.) He has a complete setup of XAMPP with Apache and Tomcat, and a JSP page that uses the PB objects. The page got to a point and crashed with a "class not found" error, clearly pointing to a classpath issue. But which one?

He mentioned that he had to start the EA Server first, before he started XAMPP. Clearly there are some conflicts in the classes shared between the two. EA Server is normally started using a batch file that reads the environment classpath (Windows settings in Control Panel) and adds its own classpaths locally (SETLOCAL inside the batch file). So whatever EA Server sets locally is not visible to XAMPP, but not the other way around. That's why he had to start EA Server first.

We then looked for the specific class that was "not found". This happened to be a stub for the PB objects. In Jaguar Manager (aka Sybase Central, the EA Server admin tool) there is an option to generate stubs (and skeletons if needed). This typically points to the SYBASE\EAServer\html\classes directory. We found out that he also had similar classes in the SYBASE\EAServer\java\classes directory. We knew these were not used, as they were not in the classpath. Then we searched for classes or jar files with similar names. We stumbled on a jar file inside XAMPP that contained part of these stubs. (He uses OpenBD for ColdFusion applications, which is why he has XAMPP in the first place.) This jar was old; it had part of the stub classes but did not have the one we were having trouble with.

Apparently, Tomcat was automatically loading this jar (anything inside the lib directory gets loaded automatically). We found that out when we tried to rename the file.

Now that we had found the classes in trouble, we thought we could just add the needed classpaths to the environment classpath and hope Tomcat would pick them up. That did not happen. On the web, I saw a note about how Tomcat handles classpaths differently than regular Java. There was a suggestion to add the classpath to the batch file that starts Tomcat. We haven't tried that yet.

For now, we decided to copy the classes to the library folder, so Tomcat can load them automatically. So we shut down XAMPP, regenerated the jar file (in EA Server stub generation, there is an option to generate a jar instead of class files), copied it to the XAMPP lib folder, and everything worked OK.

Script: PB backup – Using PB ORCA to export PBLs


The basic operating-system-level unit in PowerBuilder is a library file (.PBL). Each PBL file is a self-contained directory structure in itself, with the objects stored inside in a proprietary binary format. You need PowerBuilder to view and work with this directory structure (System Tree, Library painter etc.).

Sometimes it would be nice to be able to work with individual objects as text files at the OS level. This enables us to use any text utilities to work with the objects as files (like you would a Java or C++ file). For example, I have a Perl script to grep through the exported text files. With text files, you can store these sources away in any version control system (PVCS, SVN etc.) easily. And you can easily diff 2 versions of an object using any text diff tool. (Of course, there are tools available to do all this from inside PB, but it's just a lot easier to work with text files.) PB allows the source code to be exported to text files with extensions .sr? (.srw for windows, .srd for datawindows etc.).
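
For reference, an export file is plain text that starts with a $PBExportHeader$ line; a window export (.srw), for example, looks roughly like this (the object name and contents below are made up):

$PBExportHeader$w_login.srw
$PBExportComments$Login window
forward
global type w_login from window
end type
end forward

global type w_login from window
integer width = 2048
integer height = 1024
end type
...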

PB ORCA is an API that Powersoft developed early on for 3rd-party CASE tool vendors to work with PB objects. With PB ORCA, you can write programs to work with PB libraries and objects from the OS command line. Several object-level operations available in the PB IDE are also available in this API. Incidentally, there is a command-line utility with the same name (pborca.exe) available on Google Code. The tool has several commands that correspond to the API functions. You can create an ORCA script file (with a .orc extension) and run it in batch mode using the tool. See here for a complete list of commands supported by the tool.

The script below uses pborca.exe in a DOS batch file to export all the objects in all the PBLs in a project directory to a corresponding directory structure. The batch file first loops over the list of .pbl files in a directory (passed in), and creates an ORCA script (latax.orc) with export commands. Then it runs the generated ORCA script using pborca.exe to export the object files (.sr?). When the script finishes, you will have a directory with several sub-directories, one for each PBL file exported.
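
To make the flow concrete, the generated latax.orc ends up looking something like this (the paths are made up; the export lines come from the FOR loop in the batch file below):

session begin pborc100.dll
export z:\core\clt\its_clientmain.pbl,,,c:\backup\core\clt\its_clientmain\
export z:\core\clt\its_clientcase.pbl,,,c:\backup\core\clt\its_clientcase\
session end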

This script uses several advanced features (%~ modifiers, ENABLEDELAYEDEXPANSION) of DOS batch files. DOS scripts can do several things a Unix script does, but it almost feels like an afterthought! More on these later.

I use this script for a daily backup of my development environment. Often, only a handful of files change on the developer's PC. Once I export the PBLs to a directory, I can diff 2 different versions using a diff tool (I use WinMerge). The files can optionally be zipped up as well.

pborca.exe can also handle the reverse (known as bootstrapping). Exported PB object files are only useful for performing file-level operations. To be useful in a PB application, they have to be imported back into a PBL file, which can then be opened inside PowerBuilder. As of version 9, PB ORCA can be used to bootstrap as well. See here for the idea.

REM 12,2009 Sam V
REM script to export PBLS using pborca.exe (ORCA interface to PB)
REM this copies all the folders from source (typically Z:) to target.
REM then it creates an ORCA script to export each pbl to the subdirectories.
REM Finally it runs the generated orca script in pborca
REM @echo off

echo %0 %1, %2
@REM copy all the directories from the powerbuilder root to the export directory
@REM if a x.PBL is already in x subdirectory, you may not need MKD1 below.

@echo creating the target directory; just copy the entire path with empty sub-directories
xcopy /t /e /I %1 %2

@REM Here we start generating the orca file to be used with PBORCA (http://pborca.googlecode.com/svn/trunk/install/deploy/pborca.htm).
@echo session begin pborc100.dll > latax.orc

@echo off
@SETLOCAL ENABLEDELAYEDEXPANSION
@FOR /f "usebackq delims=" %%a IN (`dir /s /b %1*.pbl`) DO (
@set a1=%%~dpa
@REM echo a1=%a1%
@call set "a1=!a1:%~1=%~2!"
@REM MKD1
@REM This creates a sub-directory with the same name as PBL file
mkdir "!a1!"
@echo export %%~fa,,,!a1! >> latax.orc)

@echo session end >> latax.orc

@ENDLOCAL

@echo on
@echo run the script in pborca
pborca latax.orc

REM end of script