Friday, May 31, 2013

How to install DAR into Documentum repository

Documentum artifacts (jobs, methods, modules, lifecycles, custom types, etc.) are created in a Composer project and built into a DAR file, which subsequently must be installed into the repository using DARDeployer (called DARInstaller in previous versions) or Composer Headless (using scripts).
Note: the Composer project artifacts can also be installed into the repository directly from Composer, by choosing the Install Documentum Project... item in the context menu (right-click the project): specify the repository name, user name, password and domain (optional), press Login, then Install.

To install a DAR file, open DARDeployer (called DARInstaller in older versions) and fill in the fields:
 - DAR: the DAR file to install (required)
 - Input File: a file with custom installation parameters (optional)
 - Locale Folder: the folder with localization files (optional)
 - Log File: the log file location (in case you don't want the default location)

Once these fields are filled in, select the docbroker host, enter the port number (default: 1489), and click Connect to retrieve the list of repositories. Then select a repository from the list, enter the user name (a superuser), password and domain (optional). When ready, click the Install button. The installation process starts and, once it completes, you will see a message telling you the DAR was installed successfully. In case of errors you will be notified with an appropriate message (no changes take place, as the DAR installation is done in a transaction). In this case check the logs and try to solve the issue (sometimes it is just a lost connection to the Content Server), then reinstall the DAR file. Sometimes, before reinstalling the DAR, a Data Dictionary re-publish may be necessary (using the dm_DataDictionaryPublisher job or simply the API command publish_dd,c).
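
If you need to trigger the Data Dictionary re-publish programmatically, here is a minimal DFC sketch (the class name, repository name and credentials are placeholders; it simply issues the same publish_dd API command through an open session):

import com.documentum.com.DfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.common.IDfLoginInfo;

public class RepublishDataDictionary {
  public static void main(String[] args) throws Exception {
    DfClientX clientx = new DfClientX();
    IDfClient client = clientx.getLocalClient();
    IDfSessionManager sessionManager = client.newSessionManager();
    IDfLoginInfo loginInfo = clientx.getLoginInfo();
    loginInfo.setUser("dmadmin");      // placeholder superuser
    loginInfo.setPassword("dmadmin");  // placeholder password
    sessionManager.setIdentity("project_dev", loginInfo);  // placeholder repository name
    IDfSession session = sessionManager.getSession("project_dev");
    try {
      // same effect as the API command publish_dd,c (no type name: republish all types)
      session.apiExec("publish_dd", "");
    } finally {
      sessionManager.release(session);
    }
  }
}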

If you can't use DARDeployer (for example on a non-Windows OS: Linux, AIX, etc.), you can install the DAR using Composer Headless. For this, you must create three files:

1. darinstall (script file)
Here's a template of the script file (Linux):
#!/bin/sh

export JAVA_HOME=/opt/documentum/shared/java/1.6.0_17
export ECLIPSE=/opt/documentum/product/6.7/install/composer/ComposerHeadless
export WORKSPACE=/opt/project/darinstall-workspace
$JAVA_HOME/bin/java -Xms512m -Xmx1024m -cp $ECLIPSE/startup.jar org.eclipse.core.launcher.Main -clean -data $WORKSPACE -application org.eclipse.ant.core.antRunner -buildfile darinstall.xml


2. darinstall.xml (Ant buildfile)
<?xml version="1.0"?>
<project name="darinstall" default="install">
  <description>ant script to install a DAR file</description>
    <property file="darinstall.properties"/>
    <property name="dar.folder" value="." />
    <target name="install">
      <antcall target="_install.dar">
        <param name="dar.file" value="${dar.folder}/CustomTypes.dar" />
      </antcall>
      <!-- other dars, in order of dependency -->
    </target>
    <target name="_install.dar">
      <echo message="============================"/>
      <echo message="Installing DAR ${dar.file}"/>
      <echo message="============================"/>
      <emc.install dar="${dar.file}" docbase="${dctm.docbase}" username="${dctm.username}" password="${dctm.password}"/>
    </target>
</project>


3. darinstall.properties (configuration, credentials)
dctm.docbase=project_dev
dctm.username=dmadmin
dctm.password=dmadmin
dar.folder=/opt/project/darinstall


Now just launch the first script (darinstall), wait until it completes, then check the installation logs. If you find no errors or warnings, it's done!

Friday, May 10, 2013

Group membership updates reflected with a delay

In a Documentum environment with multiple Content Servers you can encounter the following issue: after you change a group's membership (add/delete users and groups) through one CS, the changes are not reflected for a period of time (up to several hours) when you connect to the repository through the other Content Servers. The Content Server that performs the membership change reflects it immediately, while the others lag behind, usually considerably.
Why is this happening?
That's because of the caching mechanism the Content Servers use. Group memberships are cached from the DB into Content Server memory, so the CS does not query the DB every time it needs this information. When you are connected to a CS and a group is updated, that CS updates its cache accordingly for that group. But what if you have several Content Servers for the same repository? They will not update their caches, because they don't know a change occurred. That's why the group change is not reflected on the other Content Servers until they refresh their caches.
How to solve that?
There is a setting in server.ini to which the CS Installation Guide pays little attention: upd_last_chg_time_from_db. The CS Administration Guide provides the following description for this key:
Optional key. Specifies that all Content Servers in a clustered environment have timely access to all changes in group membership. Valid values are:
 - TRUE: Should only be used for environments with multiple Content Servers. The value should be set to TRUE for all running Content Servers.
 - FALSE: The default value.
So the solution is to modify the server.ini files on all Content Servers, setting this key to TRUE (each Content Server must be restarted for the change to take effect):
# A boolean flag specifying whether user entries and groups list are synchronized from db.
upd_last_chg_time_from_db=T

Wednesday, May 8, 2013

How to test BOF (TBO/SBO) code changes without re-deployment

Documentum Business Object Framework (BOF) version 2.0 requires the implementation of all modules to be deployed into the repository as jar artifacts. These jars are downloaded and cached on demand, when the corresponding module is called.
Having to re-deploy BOF modules (TBOs or SBOs) each time the code changes can cause significant delays in the development process: recompiling the Composer project, installing it and restarting the JMS each time consumes a lot of time.
Is there a way to test the BOF code fast, without redeploying it? Yes, there is!
You can cheat the DFC client and replace the cached jars with newer versions. Here's how you can do it.
After having called the BOF module (to ensure it's cached), go to the BOF cache folder: its location on the client machine can be specified in the dfc.properties file, via the dfc.cache.dir property (the default is the cache subdirectory of dfc.data.dir). Inside you'll find a folder structure like this: [DFC_VERSION]/bof/[DOCBASE_NAME]. In the docbase folder you'll find several jars named [DMC_JAR_OBJECT_ID].jar. You have to locate the one containing your BOF code. To do so, you can run the DQL query:
select r_object_id from dmc_jar where object_name='[JAR NAME]'
Alternatively, you can set the dfc.bof.cache.append_name property so that the cached jar names will also contain their object names (on the next caching).
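
For example, the relevant entries in the client's dfc.properties might look like this (the paths are illustrative, and true is assumed to be the value that enables appending the object names):

dfc.data.dir=C:/Documentum/data
dfc.cache.dir=C:/Documentum/data/cache
dfc.bof.cache.append_name=true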

Once you locate your jar, you have two options:
a) replace the cached jar with the new one built in your IDE, keeping the original name
b) open the jar (like a common archive) and replace only the modified class(es)
Remember to stop your DFC client first to unlock the cached jars; otherwise you can't modify them.
Now start your DFC client and test your changes.

Code fast, test faster!

Tuesday, May 7, 2013

Custom job method DFC code sample

Implementing a custom Documentum job method requires writing some non-trivial DFC code. Below you'll find a sample using an abstract class which can be extended and re-used for any implementation of a job method.
Connection and helper methods are encapsulated in the AbstractJobMethod abstract class, so you can write custom classes that extend it. The only method to be implemented is execute(), where you have an active session (handled by the abstract class) and can perform the required actions against the repository.
The abstract class uses CustomJobArguments, an extension of the standard DfStandardJobArguments class that provides additional methods to access the passed job arguments.
Check How to create a custom job in Documentum for details on creating the custom job and method.

public class SomeCustomMethod extends AbstractJobMethod {
  public int execute() throws Exception {
    DfLogger.debug(this, "Started custom method implementation with user: " + session.getLoginUserName(), null, null);
    // custom method logic goes here
    return 0;
  }
}
////
public abstract class AbstractJobMethod implements IDfMethod, IDfModule {
// for BOF 1.0 the class must implement only the IDfMethod interface, and the jar is deployed on the JMS
// for BOF 2.0 it also implements IDfModule, because it will be deployed into the repository and will run as a BOF module

  protected CustomJobArguments jobArguments;
  protected IDfTime startDate;
  protected IReportWriter reportWriter;
  protected IDfSession session;
  protected IDfSessionManager sessionManager;
  protected IDfClientX clientx;
  protected IDfClient client;

  public int execute(Map args, PrintWriter writer) throws Exception {
    setup(args, writer);
    try {
      int retCode = execute();
      printJobStatusReport(retCode);
      return retCode;
    } catch (Exception e) {
      DfLogger.error(this, "", null, e);
      reportWriter.emitLine("Error encountered: " + e.getMessage());
      reportWriter.closeOut(false);
      throw e;
    } finally {
      if (reportWriter != null)
        reportWriter.close();
      if (session != null)
        sessionManager.release(session);
    }
  }

  // the only method to be implemented in concrete subclasses
  public abstract int execute() throws Exception;

  public void setup(Map args, PrintWriter writer) throws DfMethodArgumentException, DfException, Exception {
    startDate = new DfTime();
    jobArguments = new CustomJobArguments(new DfMethodArgumentManager(args));
    setupMethodFactory(writer);
    String username = jobArguments.getUserName();
    String password = jobArguments.getString("password");
    String domain = jobArguments.getString("domain");
    String docbase = jobArguments.getDocbaseName();
    setupSessionManager(username, password, domain, docbase);
  }

  private void setupSessionManager(String username, String password, String domain, String docbase) throws DfServiceException, DfException {
    DfLogger.debug(this, String.format("setupSessionManager-> username[%s] password[%s] domain[%s] docbase[%s]", username, password, domain, docbase), null, null);
    clientx = new DfClientX();
    client = clientx.getLocalClient();
    sessionManager = client.newSessionManager();
    IDfLoginInfo loginInfoObj = clientx.getLoginInfo();
    loginInfoObj.setUser(username);
    if (password != null && !password.equals(""))
      loginInfoObj.setPassword(password);
    loginInfoObj.setDomain(domain);
    sessionManager.setIdentity(docbase, loginInfoObj);
    session = sessionManager.getSession(docbase);
  }

  private void setupMethodFactory(PrintWriter writer) throws Exception {
    try {
      IDfId jobId = jobArguments.getJobId();
      ReportFactory reportfactory = new ReportFactory();
      if (writer != null) {
        DfLogger.debug(this, "uso reportFactory", null, null);
        reportWriter = reportfactory.getReport(jobArguments.getDocbaseName(), jobArguments.getUserName(), "", jobArguments.getMethodTraceLevel(), jobId, writer);
      } else {
        DfLogger.warn(this, "writer == null, using SimpleReportWriter", null, null);
        reportWriter = new SimpleReportWriter();
      }
    } catch (Exception e) {
      DfLogger.error(this, "Failed to create report writer. Error: " + e.getMessage(), null, e);
      throw e;
    }
  }

  private void printJobStatusReport(int retCode) throws Exception {
    reportWriter.emitLineToReport("Return Code-> " + retCode);
    String jobStatus = null;
    IDfTime end_date = new DfTime();
    long min_duration = Utility.timeDiff(startDate, end_date) / 60L;
    if (retCode == 0)
      jobStatus = "Custom Job completed at " + end_date.asString("yyyy/mm/dd hh:mi:ss") + ". Total duration: " + min_duration + " minutes.";
    else if (retCode > 0)
      jobStatus = "Custom job completed with Warnings at " + end_date.asString("yyyy/mm/dd hh:mi:ss") + ". Total duration: " + min_duration + " minutes.";
    else
      jobStatus = "Custom job completed with Errors at " + end_date.asString("yyyy/mm/dd hh:mi:ss") + ". Total duration: " + min_duration + " minutes. Check job report for details.";
    updateJobStatus(jobStatus, jobArguments.getJobId());
    reportWriter.closeOut(retCode >= 0);
  }

  public void updateJobStatus(String sJobStatus, IDfId jobId) throws Exception {
    if (session == null) {
      DfLogger.error(this, "setJobStatus: (session==null)", null, null);
      throw new NullPointerException("setJobStatus: (session==null)");
    }
    try {
      IDfPersistentObject job = session.getObject(jobId);
      if (job == null)
        throw new DfException("Failed to retrieve dm_job object from id '" + jobId.getId() + "'.");
      job.setString("a_current_status", sJobStatus);
      job.save();
    } catch (Exception e) {
      throw e;
    }
  }
}
////
public class CustomJobArguments extends DfStandardJobArguments {

  protected IDfMethodArgumentManager methodArgumentManager;
 
  public CustomJobArguments(IDfMethodArgumentManager manager) throws DfMethodArgumentException {
    super(manager);
    methodArgumentManager=manager;
  }

  public String getString(String paramName) throws DfMethodArgumentException {
    return methodArgumentManager.getString(paramName);
  }

  public int getInt(String paramName) throws DfMethodArgumentException {
    return methodArgumentManager.getInt(paramName).intValue();
  }
}

How to create a custom job in Documentum

Documentum jobs allow objects to be processed automatically, on a schedule, according to the business logic. They are quite similar to scheduled tasks in operating systems. When started, a job calls a Documentum method which executes the desired logic. The methods are executed on the Java Method Server (JMS), which is part of the Content Server installation.
A job consists of several items:
 1) Job Method implementation code
 1.1) for BOF 2.0: corresponding implementation jar(s) and module
 2) dm_method object - which holds information about the method, implementation class, etc.
 3) dm_job object - which holds information about job, its schedule, last and next invocation, current status, the method it runs, etc.

Here's how to create all of the required items for the job.

1. Job method implementation code
First of all you have to write the code that implements the logic the job method will perform. There are two approaches to deploying this code: BOF 1.0 or BOF 2.0.
With BOF 1.0 the method classes are deployed manually on the JMS (path: [JBOSS]\server\DctmServer_MethodServer\deploy\ServerApps.ear\DmMethods.war\WEB-INF\lib), while BOF 2.0 requires you to deploy the implementation jars into the repository and associate them with a BOF module.
For both approaches you can find sample job method implementation classes here: Custom job method DFC code sample
When you're done with coding, compile the implementation classes and build the jar.

1.1 Implementation jar(s) and module (for BOF 2.0)
Open Composer and create the following artifacts:
a) Jar: the artifact name should be equal to the jar name (to avoid confusion), and the jar type is Implementation.
b) Module: the module name must be equal to the fully qualified class name (with packages), module type: Standard Module; choose the implementation jar created in the previous step (a) and select the class name (equal to the module name). In addition, specify all the required modules and Java libraries used by your implementation class(es).

2. dm_method object
a) Using Composer
In Composer create a Method artifact, enter the method name, select type: java, Command: module name (equal to the implementation's fully qualified class name), Run Controls: Run Synchronously, and check the options: Run as the server, Trace launch, Use method server, Launch directly. If your code takes considerable time to run, increase the timeout values.

b) Using Documentum Administrator
Open DA and go to the Job Management->Methods node. From the menu bar select File->New->Method, enter the required values and save the method.

c) Using DQL:
create dm_method object
set object_name = 'MyCustomMethod',
set launch_async = false, set launch_direct = true, set method_type = 'java',
set method_verb = 'com.documentum.project.method.SomeCustomMethod',
set run_as_server = true, set timeout_default = 600,
set timeout_min = 300, set timeout_max = 3000, set trace_launch = true,
set use_method_content = false, set use_method_server = true;

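If you prefer to create the dm_method programmatically, the same object can also be created with a few lines of DFC. A minimal sketch, assuming an already open superuser IDfSession named session (the values mirror the DQL above):

// IDfPersistentObject is in com.documentum.fc.client
IDfPersistentObject method = session.newObject("dm_method");
method.setString("object_name", "MyCustomMethod");
method.setBoolean("launch_async", false);
method.setBoolean("launch_direct", true);
method.setString("method_type", "java");
method.setString("method_verb", "com.documentum.project.method.SomeCustomMethod");
method.setBoolean("run_as_server", true);
method.setInt("timeout_default", 600);
method.setInt("timeout_min", 300);
method.setInt("timeout_max", 3000);
method.setBoolean("trace_launch", true);
method.setBoolean("use_method_content", false);
method.setBoolean("use_method_server", true);
method.save();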

3. dm_job object
a) Using Composer
In Composer create a Job artifact, enter the job name and Subject, select the Method (created at step 2) and set the schedule options. In the Arguments section you can add custom arguments passed to your method implementation class. Keep in mind that even if you use custom arguments, you should also pass the standard arguments (docbase name, installation owner, job id, etc.), which are used to manage the job. Thus, in order to add custom arguments select Custom Arguments, add the desired arguments with their values, then switch back to Standard Arguments. Save the job artifact. (A short sketch of reading a custom argument in the method code is shown after the DQL example below.)

b) Using Documentum Administrator
Open DA and go to the Job Management->Jobs node. From the menu bar select File->New->Job, enter the required values and save the job.

c) Using DQL:
create dm_job object
set object_name = 'MyCustomJob',
set title = 'Title', set subject = 'Job description',
set pass_standard_arguments = true, set is_inactive = false,
set start_date = DATE('01/01/2013'),
set expiration_date = DATE('01/01/2020'),
set a_next_invocation = DATE('02/01/2013'), set max_iterations = 0,
set method_name = 'MyCustomMethod', set run_mode = 3,
set run_interval = 1, set inactivate_after_failure = true;

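For illustration, here is a minimal sketch of how such a custom argument could be read inside the method code, reusing the AbstractJobMethod and CustomJobArguments classes from the Custom job method DFC code sample post; the argument name report_folder is purely hypothetical:

public class MyCustomJobMethod extends AbstractJobMethod {
  public int execute() throws Exception {
    // "report_folder" is a hypothetical custom argument added in the job's Arguments section
    String reportFolder = jobArguments.getString("report_folder");
    reportWriter.emitLine("Custom argument report_folder = " + reportFolder);
    // custom job logic using the argument goes here
    return 0;
  }
}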

Now you have all the required items for your job. Build the Composer project and install it into the repository (from Composer, right-click the project and choose Install Documentum Project..., or take the built DAR from the project's bin-dar folder and install it using DARDeployer/DARInstaller). If you're using BOF 1.0, deploy the implementation jar manually on the JMS (usually in [JBOSS]\server\DctmServer_MethodServer\deploy\ServerApps.ear\DmMethods.war\WEB-INF\lib).
If this is not the first deployment of the DAR, it's better to restart the JMS and clear the BOF cache (mandatory in the case of BOF 1.0).

It's done! Go on with testing: run the new job and check the job report and JMS logs.