The concise reference to the
Ublu
Midrange and Mainframe
Life Cycle Extension Language

An extensible object-disoriented interpretive language for midrange and mainframe remote system programming
Copyright © 2019, 2022
Jack J. Woehr
jwoehr@softwoehr.com
All Rights Reserved

SoftWoehr LLC
http://www.softwoehr.com
PO Box 82
Beulah, CO 81023-0082
USA

Author: Jack J. Woehr
Original date: 2013-07-11
Last edit: 2022-02-22

 Copyright © 2015, Absolute Performance, Inc.
  http://www.absolute-performance.com
 Copyright © 2016, 2022, Jack J. Woehr
  jwoehr@softwoehr.com http://www.softwoehr.com
  All rights reserved.

 Redistribution and use in source and binary forms, with or without
 modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 

Table of Contents

Overview

Ublu is a tool for ad-hoc process automation primarily aimed at IBM i ®.

The Java command line java -jar ublu.jar [args ...] invokes an extensible object-disoriented interpretive language named Ublu intended for interaction between any Java platform (Java 9 and above) on the one hand, and IBM i ® (7.1 and above) and/or IBM z/VM ® SMAPI operations programming on the other. The language is multithreaded and offers a client-server mode. It also possesses a built-in single-step debugger.

"Any Java platform (Java 9 and above)" includes running natively on IBM i. When executing Ublu interpretively on IBM i, a character set including the @ sign must be set, e.g., CCSID 37.

Other open source software is used to provide additional facilities; see the usage command and the license files in the root directory of the distribution.

"Object-disoriented" means that Ublu is a procedural language with function definitions which is implemented on top of object-oriented Java libraries. Ublu manipulates objects constructed in the underlying environment without forcing the user to know very much about the object architecture. By way of analogy, consider how resources in a hierarchical network environment are flattened out for ease of use by LDAP. In a similar fashion, Ublu conceals many details while still providing complete access should a program require it, e.g., via the calljava command.

Ublu operates in four (4) modes:

  1. as a command-line utility
  2. as an interpreter
  3. as a text file program compiler/executor via the include command
  4. as a TCP/IP port listening server

In any mode, Ublu reads a line and then consumes one or more commands and their arguments and attempts to execute them. In certain cases, text parsing crosses line boundaries, e.g., when a quoted string or execution block extends for multiple lines.

Ublu provides the command FUNC to define functions with argument lists which can later be invoked with appropriate arguments.

Commands and command features are steadily being added to the repertoire of Ublu. Suggestions for added commands or extensions to extant commands are welcome and should be made via whatever ticketing system is present on the site where Ublu is distributed.

Installation

Each Ublu release is distributed as an archive ublu-dist.zip containing ublu.jar and its supporting directories (examples, extensions, share, bin, and so on).

We recommend the following layout:

/opt/ublu/ublu.jar
/opt/ublu/examples
/opt/ublu/extensions
... etc.
Then define a function for your login shell that invokes java -jar /opt/ublu/ublu.jar, or use the script bin/ublu.

For more on installation, see the Ublu Guide: A Typical Installation

Android Installation

See instructions in the Ublu Guide.

Building Ublu from source

It can be a good idea to download the source archive for a release or to check out the current development version from https://github.com/jwoehr/ublu.git

When you have the source tree, change directory to the top level and choose a Maven target (typically mvn package), or execute make clean dist; a Makefile in the base directory of the source invokes Maven for you. The output will be in the target directory.

Invocation

Invocation directly from Java

You can invoke Ublu using the java command on your system.

Ublu invocation: java [ java options .. ] -jar ublu.jar [ ublu options .. ] [ ublu commands .. ]
Ublu options:
-i filename [-i filename ..] include all indicated source files
-s if including, include silently, otherwise startup interpreter silently
-t [filename, --] open history file filename or default if --
-h display help and then exit
-v display version info and then exit
-w [properties_file_path] [ ublu commands .. ] starts Ublu in a GUI window which reads from the properties file path provided (processing no other options), interprets any commands provided, and then waits for more input or menu selections.
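
For example (the source file name setup.ublu is purely illustrative):

java -jar /opt/ublu/ublu.jar -v                # display version info and exit
java -jar /opt/ublu/ublu.jar -s -i setup.ublu  # silently include the indicated source file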

Note: SSL support for connections to the server requires a slightly different invocation and is discussed here.

Ublu Invocation Script (bash/ksh)

bin/ublu is a bash/ksh invocation script. ublu --help will tell you what you need to know about invoking Ublu in this manner.

./ublu --help
Ublu is free open source software with NO WARRANTY and NO GUARANTEE, including as regards fitness for any application.
See the file LICENSE you should have received with the Ublu distribution.
This bash/ksh shell script ./ublu starts Ublu if Ublu is installed in a standard fashion in /opt/ublu/
If Ublu is installed elsewhere, use the -u ubluclasspath switch to point this script to the Ublu components.
Usage: [CLASSPATH=whatever:...] ./ublu [-Xopt ...] [-Dprop=val ...] [-u ubluclasspath] [-w [propertiesfile]] [--] [arg arg ..]
where:
        -h | --help     display this help message and exit 0
        -X xOpt pass a -X option to the JVM (can be used multiple times)
        -D some.property="some value"   pass a property to the JVM (can be used multiple times)
        -u ubluclasspath change Ublu's own classpath (default /opt/ublu/ublu.jar)
        -w propertiesfile       launch Ublu in windowing mode with properties from propertiesfile
                ... If -w is present, it must be last option.
                ... If present, must be followed by a properties file path or no more arguments accepted.
                ... A default share/ubluwin.properties is included with the Ublu distribution.
        --      ends option processing
                ... Must be used if next following Ublu argument starts with dash (-) 
        [arg arg ...]   commands to Ublu
If there is an extant CLASSPATH, the classpath for Ublu is that path postpended to Ublu's classpath.
Exit code is the result of execution, or 0 for -h | --help, or 2 if there is an error in processing options.
On options error, this usage message is issued to stderr instead of to stdout.
Copyright (C) 2018 Jack J. Woehr https://github.com/jwoehr/ublu jwoehr@softwoehr.com
  

Note that these options are slightly different from the options used when invoking directly from Java.

Ublu Startup

Ublu 1.2.2 build of 2020-03-28 08:17:37
Author: Jack J. Woehr.
Copyright 2015, Absolute Performance, Inc., http://www.absolute-performance.com
Copyright 2017, Jack J. Woehr, http://www.softwoehr.com
All Rights Reserved
Ublu is Open Source Software under the BSD 2-clause license.
THERE IS NO WARRANTY and NO GUARANTEE OF CORRECTNESS NOR APPLICABILITY.
***
Running under Java 1.8.0_152
Ublu utilizes the following open source projects:
IBM Toolbox for Java:
Open Source Software, JTOpen 9.6, codebase 5770-SS1 V7R3M0.00 built=20181007 @X2
Supports JDBC version 4.0
Toolbox driver version 11.6
---
Postgresql JDBC Driver - JDBC 4.1 42.1.4.jre7 (42.2)
Copyright (c) 1997, PostgreSQL Global Development Group
All rights reserved http://www.postgresql.org
---
tn5250j http://tn5250j.sourceforge.net/
NO WARRANTY (GPL) see the file tn5250_LICENSE
---
SBLIM CIM Client for Java HEAD 2017-01-28 18:34:31
http://sblim.cvs.sourceforge.net/viewvc/sblim/jsr48-client/
Copyright (C) IBM Corp. 2005, 2014
Eclipse Public License https://opensource.org/licenses/eclipse-1.0.php
---
PigIron 0.9.7+ http://pigiron.sourceforge.net
Copyright (c) 2008-2016 Jack J. Woehr, PO Box 51, Golden CO 80402 USA
All Rights Reserved
---
org.json
Copyright (c) 2002 JSON.org
***
Type help for help. Type license for license. Type bye to exit.
>
To exit the system, type bye . Some minimal cleanup will be performed.

[Ctrl-D] is effectively the same as bye .

If you must force exit, type exit . No cleanup beyond that provided by Java itself will be performed.

[Ctrl-C] is effectively the same as exit .

Memory requirements

Ublu ordinarily runs well with default Java memory values. However, in performing database operations on large databases one may be forced to boost the heap allocation considerably on invocation, e.g.
java -Xms4g -Xmx4g -jar /opt/ublu/ublu.jar ...
which sets both the initial (-Xms) and maximum (-Xmx) heap size to 4 gigabytes.

Interpreter

In interpreter mode, one or more commands with their arguments can be issued on a line. The commands will be executed in sequence. If any command fails, generally the interpreter will abandon the rest of the commands on the line and return to the prompt. The interpreter mode prompt is a right-arrow ( > ) but will have a number to the left of it (e.g., 1>) if in a nested interpreter sublevel.

Nested interpreter sublevel

The interpret command can be used to enter a nested interpreter sublevel, an interpreter nested within the previous level of interpreter.

State is inherited by a nested interpreter sublevel from the previous level of interpreter.
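
A minimal interactive sketch of this inheritance (the tuple name @greeting is illustrative):

> put -to @greeting ${ hello }$
> interpret
1> put -from @greeting
hello
1> bye
>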

Some state persists in the previous level of interpreter when the nested interpreter sublevel ends and some is lost.

When the sublevel ends (via bye or [Ctrl-D]), constants defined and functions added in the sublevel are among the state that is lost; see the discussions of constants and the function dictionary below.

Main interpreter

The instance of the interpreter presented to the user upon program invocation is called the main interpreter to distinguish it from nested interpreter sublevels and from (possibly multiple) interpreter instances created in server mode.

Goublu

The Ublu interpreter by default depends upon the Java console. Java console support is minimal on most platforms. A better front end console for Ublu called Goublu has been coded in the Go language and can be downloaded and built if you have the Go language installed for your platform.

Server mode

Ublu can launch multiple TCP/IP port servers that accept connections and bind them to individual interpreter sessions. This allows remote applications such as web applications to execute Ublu commands and receive their output. The default listener port is 43860. See the server command.

The server command can also be used to interpret a single execution block for each connection and then disconnect at the end of interpreting the block. This allows the user to connect to a "canned program" instead of gaining access to the full interpreter.

Commandline

Invoked as a command line application with one or more commands following the invocation on the shell command line, Ublu processes the commands and their arguments as in the interpreter, then the application exits. If one of the commands is interpret, the application continues to run until that interpreter encounters the bye or exit command. exit always exits, but bye only unnests one interpreter level; if that interpreter is the main interpreter, Ublu exits.
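
For example, the following invocation (which needs no shell quoting, since it contains no quoted strings) prints the two plainwords on separate lines to standard out and then exits; any interpreter informational messages go to standard error as described under Output below:

java -jar /opt/ublu/ublu.jar put Hello put world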

Input

Input for the main interpreter can come from the console or from standard input when Ublu is invoked using some form of shell redirection, e.g., shell pipes ( | ) or "here documents" ( << ).

Output

In interpreter and commandline mode, command output of Ublu is written to standard out, with the exception of error messages and miscellaneous interpreter informational output, which are written to standard error.

When automating system tasks, it may be helpful to redirect stderr from the shell command line via 2>/dev/null to discard miscellaneous interpreter informational output. This, however, discards error messages as well.

In server mode, Ublu standard out is connected to the network socket. However, Ublu standard error output remains attached to the invoking interpreter's standard error. This makes the main interpreter a monitor of errors occurring in server threads.

Executing commands from a file

Interpretation of a text file of commands and functions is performed via the command include.
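
For example, assuming a source file myprogram.ublu in the current directory (the file name is illustrative; compare the include of an extension file in the cim example later in this reference):

include myprogram.ublu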

Launch your application from a shell script autogenerated via gensh

Ublu offers a command gensh which autogenerates a shell script to process arguments to your custom function via command line switches and invoke your custom function with those arguments. This is the Ublu model for runtime program delivery.

Debugger

The dbug command provides interactive single-step debugging of an execution block.

Parsing and syntax

Parsing and syntax are simplistic.

Input is parsed left-to-right, with no lookback. Each sequence of non-whitespace characters separated from other non-whitespace by at least one whitespace character is parsed as an element. No extra whitespace is preserved in parsing, not even within quoted strings (with the important exception that a quoted string is always returned with a blank space as the last character).

Note that the simplistic parser imposes one particularly arbitrary limitation in that within an execution block neither the block opener $[ nor the block closer ]$ are allowed to appear inside a quoted string. If you need to have those symbols in a quoted string, the limitation is easy to get around, as follows:

put -to @foo ${ $[ }$
put -to @bar ${ $] }$
FOR @i in @foo $[ put -from @i ]$
$[
FOR @i in @bar $[ put -from @i ]$
$]

Quoted strings

Quoting of strings is achieved by placing the string between two elements ${ and }$ . The open and close elements of a quoted string must each be separated by at least one space from the contents of the string and from any leading or following commands or arguments.

The string command offers many manipulations of strings to get what you want.

Whitespace between non-whitespace elements of the string is compacted into single spaces, e.g., ${ This     is       a quoted     string. }$ represents This is a quoted string. A quoted string is always returned by the parser with a blank space at the end of it.

A quoted string can span multiple lines. When end-of-line is reached in the interpreter after the open-quote glyph ( ${ ) without finding the close-quote glyph ( }$ ), then string parsing continues and the prompt changes for any lines following until the close-quote glyph is encountered. The string parsing continuation prompt is the open-quote glyph surrounded in parentheses (${) .

Note that the simplistic parser imposes one particularly arbitrary limitation in that within an execution block neither the block opener $[ nor the block closer ]$ are allowed to appear inside a quoted string. If you need to have those symbols in a quoted string, the limitation is easy to get around, as follows:

put -to @foo ${ $[ }$
put -to @bar ${ $] }$
FOR @i in @foo $[ put -from @i ]$
$[
FOR @i in @bar $[ put -from @i ]$
$]

Tuple variables

A tuple variable is an autoinstancing name-value pair that is either globally accessible within the given interpreter session or made local to a function definition via the LOCAL command. Tuples are always referenced with a "@" as the first character of their name, e.g., @session.

You can direct any kind of object to a tuple with the -to @tuplename dash-command (adjunct to commands which support that) and retrieve the value with -from @tuplename (adjunct to commands which support that).

You can assign an object from one tuple to another, or between the tuple stack and a tuple, with the tuple -assign feature.

Tuples can be directly manipulated via the put and tuple commands.
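
A minimal interactive sketch (the tuple name is illustrative):

> put -to @greeting ${ Hello from Ublu }$
> put -from @greeting
Hello from Ublu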

Tuple naming

Tuple variables must be named with one (1) @ character followed by letters, numbers, and underscores in any combination, length, and order. You might see in the debugger a tuple name of the form @///19 which denotes a temporary variable used in function argument binding. The interpreter does not currently prevent the creation of illegal tuple names; unpredictable results can occur if you use them.

Default Tuples

The Ublu interpreter starts with certain tuples already defined, as shown in the chart below:

name     value   notes
@true    true    can be used for conditionals, etc.
@false   false   can be used for conditionals, etc.

These defaults are not constant and can be overwritten at runtime via normal tuple manipulation.
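
For example, a short sketch using the default @true with IF (IF, THEN, ELSE and execution blocks are described later in this reference):

> IF @true THEN $[ put ${ yes }$ ]$ ELSE $[ put ${ no }$ ]$
yes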

Tuple autoinstancing

A tuple springs into being in the global tuple map when a previously unused tuple name is used as a destination datasink or as a parameter (input or output) to a function. When autoinstanced in this fashion, the new tuple's value is null. Alternatively, you can create a tuple variable using the tuple command, e.g. tuple -true @foo will set @foo to true if it exists already, or will create @foo and set it to true if it does not yet exist.

Tuple stack

The system maintains a Last In, First Out (LIFO) stack of tuple variables for programming convenience manipulated via the lifo command.

A command and its dash-commands that expect tuple arguments can also take the argument ~ ("tilde") which signifies "pop the tuple from the tuple stack". An error will result if the stack is empty.

When a command references a source or destination datasink via the -to or -from dash-commands, that datasink may also be ~ ("tilde"), meaning the source or destination is the tuple stack. Non-tuples, e.g., strings put to a destination datasink via -to ~ in this fashion, are automatically wrapped in an anonymous tuple. Hence the following session:
> put -to ~ ${ but not a clever test }$
> put -to ~ ${ this is a test }$
> string -to ~ -cat ~ ~
> put ~
this is a test but not a clever test

Autonomic Tuple Variables

If the interpreter encounters a tuple variable or the tuple stack pop symbol ~ (tilde) when it is expecting a command, it checks the value of that variable. If the value is of a class in the list of autonomes, that is, classes whose instances are generally passed as the argument to the eponymous dash-command of a specific Ublu command, then that command is invoked with the tuple as the eponymous argument, along with any dash-commands and/or arguments which follow. If the variable is not autonomic, the interpreter reports an error.

Thus, the third, fourth and sixth commands of the following example using ifs to get the size of a file in the Integrated File System are equivalent:

> as400 -to @mysys MYSYS.com myid mYpAssWoRd
> ifs -to @f -as400 @mysys -file /home/myid/.profile
> ifs -- @f -size
56
> @f -size
56
> lifo -push @f
> ~ -size
56

Autonomic tuple variables, along with tuple stack wizardry, offer a useful object-disoriented shorthand; in larger programs, however, consider avoiding both in the interest of clarity.

Numbers

Numbers are signed integers and generally can be input as decimal, hex (0x00), or octal (000). See also the num command.

Plainwords

Any whitespace-delimited sequence of non-whitespace characters provided as an argument to a command, dash-command, or function is a plainword. A plainword can be used in most cases to represent a number or a single whitespace-delimited textual item where a tuple variable or quoted string would have to be used to contain longer whitespace-including strings of text.

Constants

Constants are created via the const command. Constants have a string value. The name of a constant has the form *somename and can be used as the argument to a command or dash-command where the syntax notation represents the argument as ~@{something}, and only in such position. Constants are not expanded within quoted strings. Constants cannot be used as the argument to a -from or -to dash-command. Plainwords resembling constants, i.e., starting with an asterisk *, are not mistaken for constants if they have not been defined as such.

Constants defined in an interpreter level appear in its nested interpreter sublevels. However, constants defined in nested interpreter sublevels do not persist into the previous interpreter level.
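
A small interactive sketch (the constant name and value are illustrative; compare the consttest.ublu example later in this reference):

> const *greeting ${ Hello from Ublu }$
> put *greeting
Hello from Ublu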

Execution blocks

An execution block is a body of commands enclosed between the block opener $[ and the block closer ]$ . Execution blocks are used in functor declarations (callable routines) via the FUN command and with conditional control flow commands such as FOR and IF - THEN - ELSE to delimit the code phrase governed by the condition.

IF @varname THEN $[ command command .. ]$ ELSE $[ command command ]$

is a generalized example of block usage.

An execution block can contain local variable declarations and their use. A local variable declaration hides identically named variables from the global context and from any enclosing block. Inner blocks to the declaring block have access to the locals in enclosing blocks, unless, of course, an identically named variable has been declared local to the enclosed block.

Note that the "comment-to-end-of-line" command # should not ever be used in an execution block! An execution block is treated as one command line, so the comment command will devour the rest of the block.

Note that the simplistic parser imposes one particularly arbitrary limitation in that within an execution block neither the block opener $[ nor the block closer ]$ are allowed to appear inside a quoted string. If you need to have those symbols in a quoted string, the limitation is easy to get around, as follows:

put -to @foo ${ $[ }$
put -to @bar ${ $] }$
FOR @i in @foo $[ put -from @i ]$
$[
FOR @i in @bar $[ put -from @i ]$
$]

An execution block may span several lines; however, the opening bracket ( $[ ) of the block must appear on the same line with and directly after the conditional control flow command operating upon it.

Execution blocks may be nested.

In the Tips and Tricks section of this document is an example which will get a list of active interactive jobs and search that list for specific jobs.

Local variables

An execution block can have local tuple variables declared via the LOCAL command whose names hide variables of the same name which may exist outside the execution block. Locals disappear at the end of the block in which they are declared.

Local variables can be used safely even when a global tuple variable coincidentally of the same name is passed in as a function argument; no collision results, and both the local and the function argument can be referenced.

Example

FUNC foo ( a ) $[
    LOCAL @a
    put -to @a ${ inner @a }$
    put -n -s ${ outer a: }$ put -from @@a
    put -n -s ${ local a: }$ put -from @a
]$
put -to @a ${ outer @a }$
foo ( @a )
outer a:  outer @a
local a:  inner @a

Functors

A functor is an anonymous execution block created via FUN which can then be stored in a tuple variable and invoked via CALL and/or associated with a name entry in the function dictionary via defun. Arguments can be passed to the functor. Arguments are call-by-reference; the resolution of these arguments is discussed under Function Parameter Binding.

Functions

A function is a functor associated with a name entry in the function dictionary, usually via FUNC but also via the combination of FUN and defun.

The function dictionary is searched after the list of built-in commands. Dictionaries can be listed, saved, restored and merged via the dict command. Arguments can be passed to the block. All arguments are passed by reference, i.e., passing a tuple variable to a function argument list passes the tuple itself, not the tuple's value, and any alteration of the argument alters the tuple referred to in the argument list. The resolution of these arguments is discussed under Function Parameter Binding.

> FUNC yadda ( a ) $[ FOR @word in @@a $[ put -n -s -from @word put ${ yadda-yadda ... }$ ]$ ]$
> put -to @words ${ this that t'other }$
> yadda ( @words )
this yadda-yadda ...
that yadda-yadda ...
t'other yadda-yadda ...
> dict -list
# yadda ublu.util.Functor@1d4b0e9
FUNC yadda ( a ) $[ FOR @word in @@a $[ put -n -s -from @word put ${ yadda-yadda ... }$ ]$  ]$

> dict -save -to mydict
> FUNC -delete yadda
> dict -list

> dict -restore -from mydict
> dict -list
# yadda ublu.util.Functor@14fd510
FUNC yadda ( a ) $[ FOR @word in @@a $[ put -n -s -from @word put ${ yadda-yadda ... }$ ]$  ]$

See also the FUN and defun commands.

Function Parameter Binding

Because Ublu's interpreter is purely a text interpreter, performing (almost) no tokenization during interpretation and compilation, function parameter binding is effected by runtime rewriting ("token pasting") of argument references (e.g., @@some_arg) in the execution block into the actual positional parameter value provided at the time the function is called.

Arguments passed to a function or functor can be tuple variables, plainwords, quoted strings, or execution blocks.

Arguments other than blocks or quoted strings are handled as follows:

Thus, in the case of a function

foo ( a b c ) $[ put @@a put @@b put @@c ]$
called with arguments
foo ( @bar woof @zotz )
is effectively seen at invocation by the interpreter as
$[ put @bar put woof put @zotz ]$

The temporary alias actually pasted for tuple variable arguments to functions can be seen in the debugger.

It is perfectly acceptable to name function parameters with the same names as commands or functions. But this practice can detract from the readability of the code, especially if using the syntax coloring edit modes provided with Ublu.

Function Dictionary

Function definitions are stored in the function dictionary.

Interpreter instances launched by the interpret command inherit the current function dictionary. Any additions within an interpreter instance are lost when the instance exits back to its parent interpreter instance.

You can view the current function dictionary or save it to a file or tuple variable and later restore it or merge it with the current dictionary. See the dict and savesys commands.

User ID and password

You supply a user ID and a password in the argument list for any Ublu command which accesses an AS/400 (iSeries, System/i) host, or for creating an instance of the host using the as400 command so that you can employ the -as400 dash-command with subsequent commands in lieu of constant repetition of the system name, userid and password. On system operations, if the user ID or password is incorrect, you will be prompted to enter the correct user ID and/or password, upon completion of which the command will proceed.
> joblist testsys frrd oopswrong
Please enter a valid userid for testsys: fred
Please enter a valid password for testsys (will not echo):
000000/QSYS/SCPF
000736/QSYS/QSYSARB
000737/QSYS/QSYSARB2
... etc.
The behavior of Ublu when a signon attempt fails can be modified. See the as400 and props commands for details.

DB400 database operations behave differently with regard to an incorrect userid or password. Unlike the JTOpen systems operation code, the JTOpen JDBC driver does not provide a programmable exit for application code to handle an incorrect userid or password, and instead handles the exception itself by attempting to launch a 1990's-style Java AWT window prompting for userid and password. If your environment supports a GUI, all is well: you can supply the correct userid and password. On the other hand, if your environment does not support a GUI, then the operation fails and a confusing exception is thrown complaining about the absence of a windowing system. You can avoid this windowing behavior and just allow the operation to fail on incorrect userid/password with an understandable exception by adding the following connection property dash-command to the string of dash-commands for the db command:
-property prompt false
which adds ;prompt=false to the URL for the JDBC connection and disables the windowing password prompt.
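
For instance, a hedged sketch of connecting with the windowing prompt disabled (the system, library, and credentials are placeholders):

db -to @db -dbtype as400 -property prompt false -connect MYSYS MYLIB MYUID ********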

Commands

Commands are the verbs of Ublu. Some have only language meaning, but the most important commands operate directly upon a host system affecting its data and operation. Be sure you understand what you are doing when you use an Ublu command!

Access to IBM i hosts is provided through IBM's open source JTOpen library.

Access to z/VM SMAPI is provided through the author's open source PigIron library.

Command Structure

Commands are conceptually structured in three parts

  1. command
  2. dash-commands
  3. arguments

Not all commands have dash-commands. Not all commands take arguments. Usually they take one or the other. The general order of the three parts is as follows:

command [dash-command dash-command-argument [dash-command-argument dash-command-argument ...] ] command-argument [command-argument ...]

Where a dash-command is given in square brackets, e.g., [-foo ~@{bazz}], the dash-command is optional.

In this documentation, square brackets and ellipses are used to describe the command structure. Those square brackets and ellipses are not part of the syntax of Ublu, merely documentation notation. See the examples given in the documentation and in the examples directory in the distribution.

In this documentation, where multiple dash-commands are enclosed collectively in an outer pair of square brackets, individually enclosed in square brackets, and separated by the .OR. bar ( | ), the dash-commands are a set of mutually exclusive optional dash-commands, e.g., [[-foo ~@{bazz}] | [-arf ~@{woof}]]

Where square brackets are missing from a dash-command description, the dash-command, or one of the alternatives in a series of dash-commands separated in the description by the .OR. bar ( | ), is required.

Some dash-commands are required in some contexts, and not in others. Such cases are explained in the explanatory text for the command.

Command

A command is a one-word command name. It is the first element of any Ublu command invocation.

Dash Command

A dash-command is a modifier to the command, itself often possessing an argument or string of arguments.

All dash-commands with their arguments must appear on the same line with the command, except that a quoted string or block argument to a command or dash-command, once started, may span line breaks, thus extending a command over two or more lines.

If dash-commands specify conflicting operations, the last dash-command encountered in command processing is the operative choice.

Often one dash-command is actually the default operation for the command, so that if no dash-command is provided, this default provides the operation of the command anyway. These defaults are noted in the command descriptions.

Eponymous dash-command ( -- )

Many commands are used to create objects of various kinds and store them in tuple variables. Later, these same commands operate on these same objects which they themselves have created. This sort of command references the tuple containing such an object via an eponymous dash-command, e.g., job -job @some_job etc. This eponymous dash-command can generally be replaced by -- instead, so the example just given could equally be written job -- @some_job etc.

Order of Dash Commands

Often the order in which dash-commands appear on a line does not matter, but sometimes it does. To be safe, dash commands should generally follow the command in this order:

  1. the eponymous dash-command or the one representing the object being operated upon
  2. the two data sink dash-commands, -from and -to , if used
  3. any other dash-commands
There are exceptions to this ordering, e.g., the eponymous dash-command must come last for the dbug command.

Argument

An argument is the object or, for multiple arguments, list of objects necessary for command execution.

Commands may have arguments, and their dash-commands may also have their own arguments.

All arguments to a command or dash-command must appear on the same line as the command or dash-command and cannot span a line-break, except that a quoted string once started may span line breaks, thus extending a command over two or more lines.

In command descriptions:

Command Example

An example of a command with dash-commands and arguments is the following:

job -job @j -to @subsys -get subsystem
Note that the above example could equally have been written:
job -- @j -to @subsys -get subsystem
using the eponymous dash-command instead of -job.

Datasinks

Command descriptions reference datasinks. A datasink is a data source or a data destination.

Many commands offer the dash-command -to which directs the output of the command (often an object) to the specified datasink. Some commands offer the -from dash-command which assigns a source datasink for input during the command, e.g., include which reads and interprets source code can have its input from a file or variable.

A datasink is currently one of these types:
  1. Standard input and output
  2. Error output
  3. File
  4. Tuple variable
  5. Tuple stack (pushing and popping named or anonymous tuples)
  6. Null output (discard all data directed to this datasink).

A datasink's type is recognizable from its name.

In the absence of the -to dash-command, the default destination datasink of a command is STD: (standard out).

When a command results in an object other than a string and the command's destination datasink is File or Standard or Error output, Ublu intelligently renders the object as a string.

If the object is of a class which Ublu does not recognize, the object's toString() method is called to provide the data.
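
For example, the same put can be directed to different datasinks simply by naming them (the file name is illustrative):

put -to @mytuple ${ some text }$ # tuple variable datasink
put -to ~ ${ some text }$ # tuple stack datasink
put -to mylog.txt ${ some text }$ # file datasink
put ${ some text }$ # default STD: datasink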

System, Userid, Password and -as400

In order to access the iSeries (AS400) server, many commands routinely require in their argument string the following three items:

  1. system (name or IP address)
  2. userid
  3. password
All such commands allow these three arguments to be omitted if instead the -as400 dash-command is used to supply an extant server instance to the command. See the as400 command to learn how to create a server instance to be used and re-used.

Of course, commands that require extended ownership, access control, or privilege level on the target system can only be executed via an account with such privileges.

Deprecation of providing system/userid/password as arguments to most commands

Note: Many of the oldest Ublu commands allow system/userid/password to be supplied as main command arguments as well as allowing the user to provide an as400 object via the -as400 dash-command. The older style is deprecated: all code should provide an object created by the as400 command via the -as400 dash-command rather than passing credentials as arguments.

Commands by Category

Ublu Language
Server and Host Interaction
Utilities
Java / JTOpen
ask
BREAK
bye
CALL
collection
commandcall
const
dbug
defun
dict
DO
ELSE
eval
exit
FOR
FUN
FUNC
help or usage
histlog
history or h
IF
include
interpret
interpreter
LOCAL
lifo
list
map
props
put
savesys
server
sleep
string
SWITCH
system
TASK
test
thread
THEN
THROW
TRY
tuple
WHILE
!
# or #!
as400
db
cs
cim

cimi
dq
dta
file
ftp
host
ifs
job
joblist
joblog
jrnl
monitor
msg
msgq
objdesc
objlist
outq
ppl
printer
programcall
record
rs
savf
savef

session
smapi
sock

splfol
spoolf
spoolflist
streamf
subsys
sysval
tn5250
user
userlist
watson
desktop
dpoint
gensh
json
license
calljava
jmx
jvm
num
trace

List of commands

Here is the list of commands Ublu understands and their descriptions. Some descriptions below start with a leading slash (/) and a number, indicating the number of arguments expected (excluding dash-commands). The slash and number are not part of the command syntax and only serve as documentation: do not enter them yourself. A plus-sign (+) after the number indicates the number represents a minimum number of arguments rather than an absolute number. Optional dash-commands are indicated in [square brackets]. Again, the brackets are documentation and not part of the actual syntax; just enter the dash-command and its argument(s), if any.

Command summary

Most commands provide a brief summary via help -cmd commandname. That summary is repeated in the command descriptions below. The (neither rigorous nor always entirely accurate) schematic meaning of the summary is as follows:

commandname /numargs[?] [[-dash-command [arg arg ...]] [-dash-command [arg arg ...]] ...] [[-mutually-exclusive-dash-command [arg arg ...]] | [mutually-exclusive-dash-command [arg arg ...]] ...] argument argument [optional-argument] ... : description of command's action

If the first dash-command documented for the command is the -as400 dash-command, then by use of the as400 command to store a system instance in a tuple and the subsequent use of the -as400 dash-command, such a command can omit the three (3) arguments system userid password leaving the rest of the formal argument list the same as it was.

as400

/3? [-to @var] [--,-as400,-from ~@var] [-usessl] [-ssl ~@tf] [-nodefault] [-new,-instance | -alive | -alivesvc ~@{[CENTRAL|COMMAND|DATABASE|DATAQUEUE|FILE|PRINT|RECORDACCESS|SIGNON]} | -connectsvc ~@{[CENTRAL|COMMAND|DATABASE|DATAQUEUE|FILE|PRINT|RECORDACCESS|SIGNON]} | -connectedsvc ~@{[CENTRAL|COMMAND|DATABASE|DATAQUEUE|FILE|PRINT|RECORDACCESS|SIGNON]} | -connected | -disconnect | -disconnectsvc ~@{[CENTRAL|COMMAND|DATABASE|DATAQUEUE|FILE|PRINT|RECORDACCESS|SIGNON]} | -ping sysname ~@{[ALL|CENTRAL|COMMAND|DATABASE|DATAQUEUE|FILE|PRINT|RECORDACCESS|SIGNON]} | -local | -validate | -qsvcport ~@{[CENTRAL|COMMAND|DATABASE|DATAQUEUE|FILE|PRINT|RECORDACCESS|SIGNON]} | -svcport ~@{[CENTRAL|COMMAND|DATABASE|DATAQUEUE|FILE|PRINT|RECORDACCESS|SIGNON]} ~@portnum | -setaspgrp -@{aspgrp} ~@{curlib} ~@{liblist} | -svcportdefault | -proxy ~@{server[:portnum]} | -sockets ~@tf | -netsockets ~@tf | -vrm ] ~@{system} ~@{user} ~@{password} : instance, connect to, query connection, or disconnect from an as400 system

Creates an object instance representing an AS400 system and manipulates that object. The instance is intended to be stored in a tuple variable for later use with the -as400 dash-command to many commands so that the system, userid and password need not be repeated with various commands.

Note on SSL: Connection to the system can be encrypted with SSL via the -usessl or -ssl dash-commands documented below.

When an as400 instance is no longer in use but your program continues, it is often best to -disconnect or -disconnectsvc.

An as400 instance stored in a tuple variable @myas400 is passed to the as400 command using any one of the three dash-command forms --, -as400, or -from.

This is a bit redundant but the three forms persist for historical reasons.

The operations of the dash-commands are as shown:

In the absence of dash-commands the default operation is -instance.

When a signon fails for bad userid or incorrect password, a signon handler is invoked. The JTOpen class library which provides connection to the host has a built-in handler which opens a GUI window to prompt the user if it can, and fails with a mysterious error message if GUI capabilities are not present, e.g., in an ssh session which does not pass X Windows through. There are also two custom handlers in Ublu: a custom handler which prompts the user textually, and a null handler which simply fails the login operation. This behavior can be controlled on a per-interpreter basis via the props command:

The default is CUSTOM, i.e., users will be prompted in text mode in case of a signon failure.

In the absence of the explicit switches -usessl and -ssl, the as400 command will create SSL (Secure Socket Layer) secure instances if the correct property is set via the props command prior to instancing.

The default is NONE.

Example

as400 -to @myas400 mysystem myuserid mypasswd # an instance of the system is now stored in the tuple variable @myas400
joblist -as400 @myas400 # a joblist is now fetched from mysystem on behalf of myuserid.
joblist -as400 mysystem myuserid mypasswd # fetches the joblist in the same fashion as the previous command
as400 -- @myas400 -setaspgrp *CURUSR *SYSVAL *CURUSR # sets to defaults, could be new values, liblist could be long string
as400 -- @myas400 -disconnect

ask

/0 [-to datasink] [-from datasink] [-nocons] [-say ~@{prompt string}] : get input from user

ask prompts the user (if a prompt string is provided) and puts the response from the user, who must press [enter] after entering response text.

If Ublu is windowing, ask puts up a requester dialog instead of prompting in the text area.

If the user simply presses [enter] or [OK] without inputing any text, a zero-length string is put. In windowing mode, if the user presses [Cancel], null is put.

If the -from dash-command is set, the prompt string is read from the indicated source datasink.

If -nocons is set, Ublu will not attempt to read the console but instead read the standard input. This has no effect in windowing mode.

Example

> ask -say ${ What is funny? }$
What is funny? : elephants and giraffes and zoos
elephants and giraffes and zoos
> ask -to @answer -say ${ Do you have anything to say? }$
Do you have anything to say? :
> string -len @answer
0

BREAK

/0 : exit from innermost enclosing DO|FOR|WHILE block

Example
The following will fetch a joblist of active interactive jobs and look for MARSHA as a user in the job list, BREAKing when found.

as400 -to @as400 mysystem myuid ********
put ${ Looking for MARSHA in a list of all active interactive jobs }$
joblist -to @joblist -jobtype INTERACTIVE -active -as400 @as400
FOR @j in @joblist $[
    job -job @j -get user -to @user
    put -to @marsha ${ MARSHA }$
    test -to @match -eq @user @marsha
    IF @match THEN $[
        put -n ${ We found Marsha! }$
        job -job @j -info
        BREAK

    ]$ ELSE $[
    put -n ${ nope }$
    ]$
]$

bye

terminates the current interpreter level immediately, ending processing and discarding any following commands at that level. It takes no arguments. At the top level, bye exits Ublu. More cleanup is performed by leaving Ublu via bye than with exit.

Example

> interpret
1> interpret
2> bye foo bar woof
1> bye
> bye
Goodbye!
$

CALL

/? ~@tuple ( [@parm] .. ) : Call a functor
The functor to be called was created by FUN and is stored in ~@tuple. When the functor was defined, a list of parameter names was provided. Following the functor comes the substitution list: the tuple names to substitute for the parameter names when they are encountered in the functor's execution block, surrounded in parentheses, separated by at least one space, and all on the same line.

Example

FUN -to @fun ( a b c ) $[ put -from @@a put -from @@b put -from @@c ]$
put -from @fun
ublu.util.Functor@5f40727a ( a b c ) $[ put -from @@a put -from @@b put -from @@c  ]$
put -to @aleph ${ this is a }$

put -to @beth ${ and here is b }$
put -to @cinzano ${ la dee dad }$
CALL @fun ( @aleph @beth @cinzano )
this is a
and here is b
la dee dad
FUN -to @fun ( ) $[ put ${ zero param functor }$ ]$
CALL @fun ( )
zero param functor
put -from @fun
ublu.util.Functor@67ba0609 ( ) $[ put ${ zero param functor }$ ]$

See also defun FUN FUNC

calljava

/0 [-to @datasink] -forname ~@{classname} | -class ~@{classname} [-field ~@{fieldName} | -method ~@{methodname} [-arg ~@argobj [-arg ..]] [-primarg ~@argobj [-primarg ..]] [-castarg ~@argobj ~@{classname} [-castarg ..]] | -new ~@{classname} [-arg ~@argobj [-arg ..]] [-primarg ~@argobj [-primarg ..]] [-castarg ~@argobj ~@{classname} [-castarg ..]] | --,-obj ~@object [field ~@{fieldName} | -method ~@{methodname} [-arg ~@argobj [-arg ..]] [-primarg ~@argobj [-primarg ..]] : call Java methods and fields

The calljava command invokes a method or constructor in the underlying Java virtual machine or accesses a class or object field. The object result is put to the assigned datasink. If the method has a void return type, nothing is put.

The dash-commands describe the desired Java call or field. If -new appears, the call is to a constructor.

Note: The Ublu-coded extensions to Ublu in the extensions subdirectory provide many examples of the use of calljava.

See also: num

Examples

> put -to @obj ${ this is a test }$
> calljava -to @result -obj @obj -method length
> put -from @result
15
> num -to @num -int 12
> calljava -to @result -obj @obj -method substring -primarg @num
> put -from @result
st

The following example (examples/clHelp.ublu) stacks parameters and builds a String array of args in order to invoke the main(args []) method of a utility class present in JTOpen.

# clHelp.ublu
# Example from Ublu Midrange and Mainframe Life Cycle Extension language
# https://github.com/jwoehr/ublu
# Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
# See the Ublu license (BSD-2 open source)

# clHelp ( lib cmd s u p ) 
# Generate HTML help for a CL command
# args are library commandname system user password
# example: clHelp ( QSYS WRKUSRPRF MY400 MYID ******** )
# a file is generated in the current directory QSYS_WRKUSRPRF.html
FUNC clHelp ( lib cmd s u p ) $[
    LOCAL @start LOCAL @end
    num -to ~ -int 10
    calljava -to ~ -forname java.lang.String
    calljava -to @L[String -class java.lang.reflect.Array -method newInstance -arg ~ -primarg ~ 
    string -to ~ -trim @@lib
    string -to ~ -trim ${ -l }$
    string -to ~ -trim @@cmd
    string -to ~ -trim ${ -c }$
    string -to ~ -trim @@s
    string -to ~ -trim ${ -s }$
    string -to ~ -trim @@u
    string -to ~ -trim ${ -u }$
    string -to ~ -trim @@p
    string -to ~ -trim ${ -p }$
    put -to @start 0
    put -to @end 10
    DO @start @end $[
        calljava  -class java.lang.reflect.Array -method set -arg @L[String -primarg @start -arg ~
    ]$
    calljava -class com.ibm.as400.util.CommandHelpRetriever -method main -arg @L[String
]$
  
# browseClHelp ( lib cmd s u p ) 
# Generate HTML help for a CL command and open default browser on it.
# args are library commandname system user password
# example: clHelp ( QSYS WRKUSRPRF MY400 MYID ******** )
# a file is generated in the current directory QSYS_WRKUSRPRF.html and it
# is loaded into the user's default browser.
FUNC browseClHelp ( lib cmd s u p ) $[
    LOCAL @filename
    put -n -to ~ ${ file:// }$
    ~ -to ~ -trim
    system -to ~ ${ pwd }$
    \\ ${ system returns an ublu.util.SystemHelper.ProcessClosure object.
    This calljava gets the output to concatenate to for the absolute URI. }$
    calljava -to ~ -- ~ -method getOutput
    lifo -swap
    ~ -to ~ -cat ~
    ~ -to ~ -cat /
    ~ -to ~ -cat @@lib
    ~ -to ~ -cat _
    ~ -to ~ -cat @@cmd
    ~ -to ~ -cat ${ .html }$
    ~ -to @filename -trim
    put @filename
    clHelp ( @@lib @@cmd @@s @@u @@p )
    desktop -browse @filename
]$
# end

cim

/0 [-to datasink] [--,-cim @ciminstance] [-keys ~@propertyKeyArray] [-namespace ~@{namespace}] [-objectname ~@{objectname}] [-url ~@{https://server:port}] [-xmlschema ~@{xmlschemaname}] [-new | -close | -path | -cred ~@{user} ~@{password} | -init ~@cimobjectpath | -ei ~@cimobjectpath] : CIM client

The cim command creates and employs Common Information Model clients. To do so it uses the SBLIM JSR-48 CimClient library available under the Eclipse Public License.

(See also cimi)

To use the CIM client:

  1. Create an instance.
  2. Assign credentials to the instance.
  3. Create a CIM path.
  4. Use the path to initialize the instance.
  5. Create a path to the CIM object to be accessed.
  6. Use the path as an argument of the desired operation.

This is illustrated as follows:

Example (examples/test/cimtest.ublu)

# cimtest.ublu ... exercise Common Information Model support in Ublu
# Example from Ublu Midrange and Mainframe Life Cycle Extension language
# https://github.com/jwoehr/ublu
# Copyright (C) 2017 Jack J. Woehr http://www.softwoehr.com
# See the Ublu license (BSD-2 open source)

include /opt/ublu/extensions/ux.cim.property.ublu

# cimtest ( url uid passwd namespc )
# ... url is something like https://myserver.foo.org:5989
# ... namespc is something like root/cimv2
FUNC cimtest ( url uid passwd namespc ) $[
    LOCAL @client LOCAL @path
    
    cim -to @client
    @client -cred @@uid @@passwd
    
    cim -to @path -url @@url -path
    @client -init @path

    cim -to @path -namespace @@namespc -objectname CIM_LogicalIdentity -path
    put -n -s ${ Enumerate Instances for }$ put @path
    @client -ei @path

    string -to ~ -new
    cim -to @path -namespace @@namespc -objectname ~ -path
    put -n -s ${ Enumerate Classes for }$ put @path
    @client -ec @path @true

    cim -to @path -namespace @@namespc -objectname IBMOS400_NetworkPort -path 
    put -n -s ${ Get Instances for }$ put @path
    @client -to @instances -ei @path
    
    FOR @i in @instances $[
        put -n -s ${ (( Instance looks like this ))  }$ put -from @i
        @client -to ~ -gi @i @false @false
        lifo -dup lifo -dup lifo -dup
        put ~
        put ${ *** Putting path for instance *** }$
        ~ -path
        put ${ *** Putting keys for instance *** }$
        ~ -keys
        put ${ *** Putting properties for instance *** }$
        ~ -to ~ -properties
        lifo -dup put ~
        FOR @i in  ~ $[
            put -n -s ${ ***** property is }$ put @i
            ux.cim.property.getName ( @i )
            put -n -s ${ ***** property name is }$ put ~
            ux.cim.property.getValue ( @i )
            put -n -s ${ ***** property value is }$ put ~
            ux.cim.property.hashCode ( @i )
            put -n -s ${ ***** property hashcode is }$ put ~
            ux.cim.property.isKey ( @i )
            put -n -s ${ ***** property is key? }$ put ~
            ux.cim.property.isPropagated ( @i )
            put -n -s ${ ***** property is propagated? }$ put ~
        ]$
    ]$
    @client -close
]$
# end

cimi

/0 [-to datasink] [--,-cimi @ciminstance] [-class | -classname | -hashcode | -keys | -key ~@{keyname} | -properties | -propint ~@{intindex} | -propname ~@{name} | -path] : manipulate CIM Instances

The cimi command manipulates CIM instances returned by the cim client. CIM instances are autonomic or can be provided to the cimi command via --,-cimi @ciminstance.

The dash-commands are as follows:

Note: There are extensions to Ublu CIM support in the file(s) extensions/ux.cim.*.ublu

See also: cim
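
A hedged sketch, assuming @instances holds CIM instances returned by a cim -ei enumeration as in the cimtest.ublu example above:

FOR @i in @instances $[
    cimi -- @i -classname
    cimi -- @i -path
]$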

collection

/0? [--,-collection ~@collection] [-to datasink] [-show | -size] : manipulate collections of objects

The collection command provides support for Java collections. The collection (created elsewhere) is provided to the command in a tuple argument to the -- or -collection dash-command.

The operations are as follows:
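
A hedged sketch, assuming @mycollection already holds a Java collection produced elsewhere (e.g., by calljava):

collection -- @mycollection -size # put the collection's size
collection -- @mycollection -show # show the collection's contents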

commandcall

/4? [-as400 ~@as400] [-to datasink] ~@{system} ~@{userid} ~@{passwd} ~@{commandstring} : execute a CL command
executes a CL command represented by commandstring on IBM i system system on behalf of userid with password. The commandstring is a quoted string representing the entire CL command and its arguments. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

If the command results in one or more AS400 Messages, the Message List is put.

Note that CL display and menu commands do not work! Also, you may have to juggle shell quoting to issue such command lines; it's definitely easier to enter this sort of thing in interpretive mode or as an Ublu program or function.

Example

commandcall -to foo.txt somehost bluto ******** ${ SNDMSG MSG('Hello my friend') TOUSR(fjdkls) }$
# Performs a SNDMSG on SOMEHOST on behalf of user bluto sending the text "Hello my friend" to a user FJDKLS. Any output returned by the command is redirected to a local file foo.txt.
as400 -to @myas400 somehost bluto ******** # creates an as400 instance and stores it in the tuple variable @myas400
commandcall -as400 @myas400 -to foo.txt ${ SNDMSG MSG('Hello my friend') TOUSR(fjdkls) }$
# Performs a SNDMSG just as above using the as400 instance in place of a repetition of credentials.

const

/2? [-to datasink] [-list | -create | -clear | -defined ~@{*constname} | -drop ~@{*constname} | -save | -restore | -merge ] ~@{*constname} ~@{value} : create a constant value

The const command:

The default action is -create.

The name of the constant must start with an asterisk character * e.g., *someconstant.

A constant can be used as the argument to a command or dash-command where the syntax notation represents the argument as ~@{something} and only in such position.

If an operation expects a number, the constant when used may need to be converted via the num command, e.g., num -to ~ *someconstant

The operations are as follows:

See also: savesys

Example (examples/test/consttest.ublu)

# consttest.ublu
# Example from Ublu Midrange and Mainframe Life Cycle Extension language
# https://github.com/jwoehr/ublu
# Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
# See the Ublu license (BSD-2 open source)

const *foo ${ this is a test }$
put *foo
const *arf 111
num -to @n -long *arf
put @n
interpret -block $[ put *arf ]$
put -to @i 1
put -to @limit 10
DO @i to @limit $[ put -n -s *arf put @i ]$
FUNC woo ( toput ) $[ put @@toput ]$
woo ( *arf )
woo ( *foo )
const -list
const -to @bar -list
put -from @bar
# end

cs

/4? [-to @var ] [--,-cs ~@cs] [-db,-dbconnected ~@db] [[[-new] -sql ~@{ SQL code ... }] | [-call] | [-in ~@{index} ~@object ~@{sqltypename}] | [-innull ~@{index} ~@{sqltypename}] | [-out ~@{index} ~@{sql_type} [-scale ~@{scale}] [-typedescription ~@{user_typename}]] | [-rs] | [-nextrs] | [-uc]] : instance and execute callable statements which JDBC uses to execute SQL stored procedures

The cs command creates and executes a JDBC CallableStatement used primarily for invoking SQL stored procedures. You need an instance returned by the db command to use cs.

The dash-commands are as follows:

Examples

I. Creating a schema and a table

# Connect naming arbitrary library, it doesn't matter which
db -to @db -dbtype as400 -connect MYSYS.FOO SOMELIB MYUID ********
# Creates a new library with journals etc.
# See IBM's _Database SQL programming_ (rbafypdf.pdf)
cs -db @db -to @cs -sql ${ CREATE SCHEMA JAXDTEMP }$
@cs -call
put -to @sql ${ CREATE TABLE JAXDTEMP.INVENTORY (PARTNO SMALLINT NOT NULL, DESCR VARCHAR(24), QONHAND INT,  PRIMARY KEY(PARTNO)) }$
# Creates new table
cs -db @db -to @cs -sql @sql
@cs -call

II. Executing arbitrary code in a callable statement

> db -to @myDb -dbtype as400 -connect PUB400.COM qsys2 me MYPASSWD
> cs -to @myCs -dbconnected @myDb -new -sql ${ CALL QCMDEXC('SNDMSG MSG(''Hi there'') TOUSR(JAX)') }$
> @myCs -call
false

[image of WRKMSG screen]

db

/4? [--,-dbconnected ~@dbconnected] [-as400 ~@as400] -dbtype,-db ~@{type} [-charsetname ~@{charsetname}] [-qopt ~@{close|hold|ro|update|forward|insensitive|sensitive}] [-rdb ~@{rdbname}] [-destqopt ~@{close|hold|ro|update|forward|insensitive|sensitive}] [-catalog | -columnnames ~@{tablename} | -columntypes ~@{tablename} | -connect | -csv ~@{tablename} [-separator ~@{separator} ] | -json ~@{tablename} | -disconnect | -metadata | -primarykeys ~@{tablename} | -query ~@{SQL string} | -query_nors ~@{SQL string} | -replicate ~@{tableName} ~@{destDbName} ~@{destDbType} ~@{destDatabaseName} ~@{destUser} ~@{destPassword} | -star ~@{tablename}] [-pklist ~@{ space separated primary keys }] [-port ~@{portnum}] [-destport ~@{destportnum}] [-property ~@{key} ~@{value} [-property ~@{key} ~@{value}] ..] [-ssl @tf | -usessl] ~@{system} ~@{schema} ~@{userid} ~@{password} : perform various operations on databases

The db command performs database operations on system's database schema for userid with password.

Alternatively, a connected database object can be provided via the -dbconnected ~@dbconnected dash-command in which case the system database userid password arguments must be omitted.

Note: For historic reasons, the dash-commands for this (early) Ublu command are stylistically inconsistent with the other commands: -db is now deprecated in favor of -dbtype and -dbconnected should really have been called -db. In any case, the eponymous dash-command -- @dbconnected can be used instead of -dbconnected

See also: cs rs

JDBC to the IBM i over SSL

Set your local environment up for SSL as described in the Note on SSL.

Operations

Note that table names are implicitly quoted in most db operations and thus should be regarded as case sensitive.

Options

db is very much a work in progress and thus features and refinements will follow.

Note: As configured, db supports the IBM i database and PostgreSQL. Support for Microsoft SQL ® is also provided if you download the JDBC driver yourself: the driver is not included with the distribution, but the distribution is compiled to support it if present. Follow the directions in the README.DbMSSQL.txt file in the directory share/mssql .

Example (examples/dbexample.ublu)
# dbexample.ublu
# Example from Ublu Midrange and Mainframe Life Cycle Extension language
# https://github.com/jwoehr/ublu
# Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
# See the Ublu license (BSD-2 open source)

# NOTE: The 'db' command is among the oldest code in Ublu and is currently
# being updated / reworked.
 
# Example session:
# > createPersons ( @dbinst )
# > put -to @address ${ 1313 Mockingbird Lane }$
# > addRowToPersons ( @dbinst 1 woehr jack @address golden )
# Query: INSERT INTO Persons VALUES ( '1','woehr','jack','1313 Mockingbird Lane ','golden')
# > addRowToPersons ( @dbinst 2 sillyperson monty 123_skidoo_street gotham )
# Query: INSERT INTO Persons VALUES ( '2','sillyperson','monty','123_skidoo_street','gotham')
# > starFromPersons ( @dbinst )
# PERSONID LASTNAME FIRSTNAME ADDRESS CITY
# woehr jack 1313 Mockingbird Lane }  golden
# sillyperson monty 123_skidoo_street gotham
# > starCSVFromPersons ( @dbinst )
# PERSONID,LASTNAME,FIRSTNAME,ADDRESS,CITY
# INTEGER,VARCHAR,VARCHAR,VARCHAR,VARCHAR
# jdbc type 4,jdbc type 12,jdbc type 12,jdbc type 12,jdbc type 12
# 1,woehr,jack,1313 Mockingbird Lane ,golden
# 2,sillyperson,monty,123_skidoo_street,gotham

# Get a DB instance
# E.g., myDb ( pub400.com as400 MYLIB1 myuserid mypassword @dbinst )
FUNC myDb ( sys type coll uid passwd dbinst ) $[
    db -to @@dbinst -db @@type -connect @@sys @@coll @@uid @@passwd
]$

# Create the table ... this will fail if table exists  
FUNC createPersons ( dbinst ) $[
    db -- @@dbinst -query_nors ${ CREATE TABLE Persons
    (
    PersonID int,
    LastName varchar(255),
    FirstName varchar(255),
    Address varchar(255),
    City varchar(255)
    ) }$
]$

# Create as many entries as you like one at a time, e.g.,
# put -to @addr ${ 23 Skidoo St. }$
# addRowToPersons ( @dbinst 2 Farfel Freddy @addr Gotham )
FUNC addRowToPersons ( dbinst id_int lastname firstname addr city ) $[
    LOCAL @query
    put -to @query ${ INSERT INTO Persons VALUES ( }$
    string -to @query -cat @query '
    string -to @query -cat @query @@id_int
    string -to @query -cat @query ','
    string -to @query -cat @query @@lastname
    string -to @query -cat @query ','
    string -to @query -cat @query @@firstname
    string -to @query -cat @query ','
    string -to @query -cat @query @@addr
    string -to @query -cat @query ','
    string -to @query -cat @query @@city
    string -to @query -cat @query ')
    put -n -s Query: put @query
    db -- @@dbinst -query_nors @query
]$
  
# List the table
FUNC starFromPersons ( dbinst ) $[
    db -- @@dbinst -star PERSONS
]$

# Get a CSV of the table
FUNC starCSVFromPersons ( dbinst ) $[
    db -- @@dbinst -csv PERSONS
]$

# Get a JSON of the table  
FUNC starJSONfromPersons ( dbinst ) $[
    db -- @@dbinst -json PERSONS
]$

# Get a JSON of one row of the table  
FUNC oneJSONfromPersons ( dbinst lastname ) $[
    put -to ~ ${ SELECT * FROM PERSONS WHERE LastName = }$
    string -to ~ -cat ~ '
    string -to ~ -cat ~ @@lastname
    string -to ~ -cat ~ '
    db -- @@dbinst -to ~ -query ~
    rs -- ~ -json @@dbinst PERSONS
]$
  
# Delete the PERSONS table
FUNC deletePersons ( dbinst ) $[
    db -- @@dbinst -query_nors ${ drop table PERSONS }$
]$

# End

dbug

/0 [-init] [-info] [-brk ~@opname] [-clr ~@opname] [[-step]|[-go]] [--,-dbug $[ execution block ]$] : dbug your program

The dbug command runs a stepping program debugger on an execution block you provide to the command.

dbug works in "plain" Ublu and under Goublu, but does not work in windowing Ublu.

The operations are as follows:

Before dbug -dbug $[ execution block ]$ executes its execution block, it discards the rest of the input following it. Commands issued after a dbug command which includes the -dbug $[ execution block ]$ dash-command are thus ignored (though any other dbug dash-commands to the same instance of the dbug command are still parsed before execution of the debug block begins).

The dash-commands controlling dbug operations can be provided on the command line in any order. The debugger runs the execution block immediately after the other dash-commands have been processed.

Issuing the dbug command does not require an execution block to be provided via --,-dbug $[ execution block ]$. Its other dash-commands may be issued at any time to set the state of dbug prior to actually running it on an execution block. At any time the state of dbug can be reset via -init .

When dbug reaches a breakpoint, or reaches its next step when single-stepping, a prompt indicating the breakpoint is displayed along with a command line. To continue in single-step mode, simply press the return key.

The valid commands at a breakpoint are as follows:

For other debugging aids, see the props command.

Example

> dbug -step -brk tuple -dbug $[ put -to @foo ${ foo bar woof }$ tuple -exists @foo put -from @foo ]$

at: put [-to, @foo, ${, foo, bar, woof, }$, tuple, -exists, @foo, put, -from, @foo]
brk>i
ublu.util.DBug@727e86b8
stepping : true
breaking : true
breakpoints:
------------
 tuple


at: put [-to, @foo, ${, foo, bar, woof, }$, tuple, -exists, @foo, put, -from, @foo]
brk>g

at: tuple [-exists, @foo, put, -from, @foo]
brk>
true

at: put [-from, @foo]
brk>
foo bar woof
>

defun

/2? [[-define] | [-list] | [-show name]] name tuplename : define a function from a name and a functor

defun takes a name and the name of a tuple variable which refers to a functor and associates the functor with the name in the function dictionary, thus creating a function findable by command lookup.

The default operation is -define.

The operations are:
If the -list or -show name dash-command is present no other arguments should be present, and no arguments are consumed.

See also FUN FUNC
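
Example

A minimal sketch combining FUN and defun; the function name sayhi and the tuple names are illustrative:

FUN -to @hifunctor ( who ) $[ put -n -s ${ hello }$ put -from @@who ]$
defun sayhi @hifunctor
put -to @name fred
sayhi ( @name )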

desktop

/0 [-browse ~@{uri} | -mail | -mailto ~@{uri} | -supported] : desktop browser or mail

The desktop command launches the user's default mailer or browser.

The dash-commands are as follows:

The default operation is -supported.
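
Example

A minimal sketch based on the usage line above; the URI is illustrative:

desktop -supported
desktop -browse https://github.com/jwoehr/ublu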

dict

/0 [-to datasink] [-from datasink] [-list | -save | -peek | -restore | -merge] : save and restore function dictionary

The dictionary of functions created by the action of FUNC and/or defun can be listed, saved, restored, or merged with the current dictionary.

The default operation is to list the dictionary of functions.

Note: Saved dictionaries are not compatible across individual builds of Ublu.

The operations are as follows:

See also
FUNC FUN defun savesys
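
Example

A minimal sketch based on the usage line above; the file name mydict.dict is illustrative, and it is assumed that -save writes to the -to datasink and -restore reads from the -from datasink:

dict -list
dict -to mydict.dict -save
dict -from mydict.dict -restore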

DO

/5 [-undo] ~@iterator [to|TO] ~@limit $[ cmd .. ]$ : DO iterative from @iterator to @limit exclusive of limit incrementing/decrementing @iterator

The DO command iterates over a block reinstancing the @iterator with the current loop index post-incremented or post-decremented (as directed) until the @iterator reaches the value of @limit, exclusive of the actual limit, or until a BREAK is encountered.

The limit never becomes the value of the @iterator, which is only reinstanced within the loop. Upon reaching the limit, whether incrementing or decrementing, the block is not executed again and the loop ends.

The -undo dash-command indicates a decrementing, rather than incrementing loop.

Since @iterator is reinstanced each loop, it can be examined inside the block. The value of the @iterator after the loop remains whatever it was last set to in a loop, so it can be used to determine what iteration a BREAK occurred on.

The TO between tuple variable arguments to the DO command is optional.

Example

put -to @start 1
put -to @end 10
DO @start @end $[ put -n -s ${ index is }$ put -from @start test -to @break -eq @start 7 IF @break THEN $[ put ${ leaving ... }$ BREAK ]$ ]$
index is  1
index is  2
index is  3
index is  4
index is  5
index is  6
index is  7
leaving ...

dpoint

/0? [--,-dpoint @dpoint] [-to datasink] [[-dup] | [-dkey ~@{keytext}] [-addkey ~@{keytext}] [-type ~@{int|long|float} [-value ~@{value}] [-alertlevel ~@{alertlevel}] [-compare ~@{gt|gte|lt|lte|info|warn|crit}] [-msg ~@{msg}]] : create and manipulate monitoring datapoints

The dpoint command puts a correctly formed SystemShepherd ® datapoint. The datapoint can be incrementally or fully created and subsequently modified (when an extant datapoint is referenced via the --,-dpoint dash-command).

The dash-commands that define the datapoint are as follows:

dq

/4? [-as400 @as400] [--,-dq ~@dq] [-wait ~@{intwaitseconds}] [-authority *ALL|*CHANGE|*EXCLUDE|*USE|*LIBCRTAUT] [-saveSenderInformation ~@tf] [-FIFO ~@tf] [-forceToAuxiliaryStorage ~@tf] [-desc ~@{description}] [-keyed ~@tf] [-keylen ~@{intlength}] [-key ~@{key}] [-bkey ~@bytekey] [-searchtype EQ|NE|LT|LE|GT|GE] [-clear | -create ~@{maxentrylength} | -delete | -exists | -new,-instance | -peek | -query [ ccsid | description | fifo | forceauxstorage | keylen | maxentrylength | name | path | savesender | system ] | -read | -write ~@{data to write} | -writeb ~@bytedata] ~@{dataqueuepath} ~@{system} ~@{userid} ~@{password} : manipulate a data queue on the host

The dq command manipulates a data queue or keyed data queue on the host. Strings or byte arrays can be written, peeked and read. When read or peeked, a DataQueueEntry or KeyedDataQueueEntry is returned. There are extensions in extensions/ux.dqentry.ublu to manipulate these entries.

If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

The -dq @dq dash-command is used to reference an existing dataqueue instance previously stored in some tuple variable of the user's choice such as @dq. If the -dq dash-command is used, then the arguments dataqueuepath system userid password must be omitted.

The dataqueuepath is a fully-qualified HFS path, e.g., /qsys.lib/qgpl.lib/events.dtaq

You must indicate at instancing time whether a keyed or sequential data queue is intended. If -keyed is @false (the default), a sequential data queue is instanced.

The various operations are as follows:

The default operation is -new .

Example

# Instance a session with the target system
as400 -to @oss -usessl MYSYS MyProfile ********
# Instance a DQ
dq -to @q -as400 @oss /QSYS.LIB/UBLUTEST.LIB/TESTDQ.DTAQ
# Create the DQ on the system with a max entry len of 1111
@q -create 1111
# Reference a file which happens to have record len of 1111
file -to @f -as400 @oss -keyed /QSYS.LIB/UBLUTEST.LIB/TEST0008.FILE
# Open file (R)ead
@f -open R
# Position the file to (B)efore First Record
@f -pos B
# Get NEXT record
@f -to @rec -read NEXT
# Get record's data as bytes
@rec -to @data -getcontents
# Write record to dataqueue
@q -writeb @data
# Etc ...

dta

/0 [-as400 ~@as400] [-to datasink] [--,-dataarea ~@dataarea] [-path ~@{ifspath}] [-bytes] [-biditype ~@{biditype}] [-buffoffset ~@{buffoffset}] [-offset ~@{offset}] [-length ~@{length}] [-initlen ~@{initlen}] [-initdecpos ~@{initdecpos}] [-initval ~@{initval}] [-initauth ~@{initval}] [-initdesc ~@{initdesc}] [ -new,-instance CHAR|DEC|LOC|LOG | -create | -delete | -refresh | -query ~@{query(name|sys|length|path|decpos)} | -write ~@data | -read | -clear] : create and use data areas

The dta command manipulates IBM i data areas. Initialize an instance by providing a system via the -as400 ~@as400 and -path ~@{ifspath} dash-commands in conjunction with -new CHAR|DEC|LOC|LOG. This instance can refer to an extant data area or to one you intend to create. Thereafter operations proceed by providing the instance to the command via the --,-dataarea ~@dataarea dash-command.
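
Example

A minimal sketch based on the description above; the system, library and data area names are illustrative, and it is assumed that -create with none of the -init... dash-commands creates the data area with default attributes:

as400 -to @sys MYSYS MYUSER ********
# Instance a character data area reference
dta -to @da -as400 @sys -path /QSYS.LIB/MYLIB.LIB/MYDATA.DTAARA -new CHAR
# Create the data area on the host
dta -- @da -create
# Write a value and read it back
put -to ~ ${ hello from Ublu }$
dta -- @da -write ~
dta -- @da -read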

eval

/2/3 [-to @var] ~@[inc dec max min + - * / % << >> ! & | ^ && || == > < <= >= != pct] ~@operand [~@operand] : arithmetic

eval performs arithmetic and logical operations on numbers. eval evaluates its operator and then exercises the operator on its one or more operands. The operator and all operands may each be a plain word or may be a tuple referencing the desired operator or operand. The operands are treated as Long. The result is put to the destination datasink, often a tuple variable.

The operators and their operands are:
Use test for string comparisons.

Example

eval >> 0xff 4
15

exit

/0 [-rc ~@int] : perform System.exit()

Exits the Java runtime immediately, terminating all threads and doing no special cleanup other than that provided by the Java runtime itself.

The exit will return to the host system with an exit code of 0 unless the -rc ~@int dash-command is used to provide an integer exit code.

Unless you need to force exit and/or return a special exit code, e.g., to a calling script, use bye instead.
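
Example

A minimal sketch: exit Ublu immediately, returning exit code 2 to the calling shell:

exit -rc 2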

file

/4? [-to @var ] [--,-file ~@file] [-as400 ~@as400] [-blocking ~@{numrecs}] [-keyed | -sequential] [-new | -add ~@{membername} ~@{description} | -create ~@{recordLength} ~@{fileType([*DATA|*SOURCE])} ~@{textDescription} | -createdds ~@{ddsPath} ~@{textDescription} | -createfmt ~@recFormat ~@{textDescription} | -commitstart ~@{lockLevel([ALL|CHANGE|STABLE])} | -commit | -rollback | -commitend | -lock ~@{locktype(RX|RSR|RSW|WX|WSR|WSW)} | -unlock | -del | -delmemb | -delrec | -getfmt | -setfmt ~@format | -open ~@{R|W|RW} | -close | -list | -pos ~@{B|F|P|N|L|A} | -recfmtnum ~@{int} | -read ~@{CURR|FIRST|LAST|NEXT|PREV|ALL} | -update ~@record | -write ~@record | -writeall ~@recordarray | -refresh] [-to datasink] ~@{/fully/qualified/ifspathname} ~@{system} ~@{user} ~@{password} : record file access

The file command provides IBM i record file access.

The /fully/qualified/ifspathname indicates a file or a member (see the examples below).

If the -- @file or -file @file dash-commands are present, the ~@{/fully/qualified/ifspathname} ~@{system} ~@{user} ~@{password} arguments must be omitted.

If the -- @file or -file @file dash-commands are absent but the -as400 ~@as400 dash-command is present, the ~@{system} ~@{user} ~@{password} arguments must be omitted.

The basic regimen of the file command is as follows:

  1. Instance a file object via -new providing the -keyed or -sequential dash-command.
  2. Open the file object with -open optionally indicating which record format to use via the -recfmtnum ~@{int} dash-command and a blocking factor via -blocking.
    • Alternatively, create the file first and then open it. Use -create{dds,fmt} to create the file whose object instance you have already instanced via -new.
  3. Perform operations with the various dash-command -list -pos -read -write.
  4. -close the file.
The operations of the dash-commands are as follows:

There are Ublu-coded extensions for this command in the extensions directory of the distribution.

Examples

Creating a file from DDS

as400 -to @sys mysys.com myid mypasswd
file -to @test1 -as400 @sys -keyed -new /QSYS.LIB/MYLIB.LIB/TEST1.FILE
@test1 -createdds /QSYS.LIB/MYLIB.LIB/QDDSSRC.FILE/TEST1.MBR ${ Physical file testing createdds }$


Opening an extant file

as400 -to @s mysys myid mypasswd
file -as400 @s -to @f -keyed /QSYS.LIB/MYLIB.LIB/QCLSRC.FILE/MYPROG.MBR
file -- @f -open RW
file -- @f -to @r -read CURR
put -from @r
1.00 160803 MAIN: PGM PARM(&P1 &P2)
num -to @i -bigdec 99
record -to @fmt -- @r -getfmt
record -to @newrec
record -- @newrec -setfmt @fmt
record -- @newrec -setfield 0 @i
file -- @f -pos A
file -- @f -write @newrec
file -- @f -close
# listsrc.ublu ... type out a program source member of a record file
# Example from Ublu https://github.com/jwoehr/ublu
# Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
# See the Ublu license (BSD-2 open source)

# Given a server and a fully qualified IFS path to a source member
# type the contents of the file to standard out, e.g.,
#   listsrc ( @mysys /QSYS.LIB/MYLIB.LIB/QCLSRC.FILE/MYPROG.MBR )
# where @mysys is an as400 object.
FUNC listsrc ( sys fqp ) $[
    LOCAL @f LOCAL @lines LOCAL @recordarray
    file -as400 @@sys -to @f -keyed @@fqp
    file -- @f -open  R
    file -- @f -to @recordarray -read ALL
    file -- @f -close
    FOR @i in @recordarray $[
        put @i
    ]$    
]$

FOR

/5 @iteratorvar @valuevar $[ cmd .. ]$ : FOR @iteratorvar (IN | in) @valuevar execute block instancing @iteratorvar

The FOR command iterates over a block instancing the @iteratorvar with the next element from the enumerable @valuevar until the enumeration is exhausted. Only certain object types in @valuevar are supported. The IN between tuple variable arguments to the FOR command is optional. The list of supported types is currently:

If the value of @valuevar is null then no iterations will occur and execution will continue normally beyond the FOR block.

The @iteratorvar name provided to a FOR command is created by the FOR command in a more local context than the local context of any function in which the FOR appears, as the following session snippet illustrates:

> FUNC testForItTup ( list ) $[
$[) LOCAL @i
$[) put -to @i local@i
$[) FOR @i in @@list $[
$[$[) put -from @i
$[$[)  ]$
$[)  put -n -s ${ after loop LOCAL @i is : }$ put -from @i
$[) ]$
> testForItTup ( @mylist )
list item one
list item two
after loop LOCAL @i is :  local@i
>
In early versions of Ublu the @iteratorvar had to be declared explicitly and its value persisted beyond the FOR block. Nowadays in Ublu the FOR command instances a new more-local tuple variable in a context which no longer exists after the FOR block.

Example

The following will end all active jobs in the QINTER subsystem:

as400 -to @as400 MYSERVER qsecofr ******* # create an as400 instance
joblist -to @joblist -jobtype INTERACTIVE -active -as400 @as400 # get the filtered job list
FOR @j @joblist $[ job -job @j -end -1 ]$ # iterate through the list and controlled end (-1) each job

ftp

/3? [ -to datasink ] [ -tofile ~@destfile ] [ -pushrc ] -new [ -as400 ] [ -mode ~@{act|pas} ] [ -port ~@{portnum} ] [ -type ~@{asc|bin} ] | --,-session ~@session [ -cd ~@{path} | -cmd ~@{ command string } | -connect | -disconnect | -get ~@{remotefilepath} [-target ~@{localfilepath}] | -dir | -ls ~@{filespec} | -put ~@{localfilepath} [-target ~@{remotefilepath}] | -pwd ] ~@{system} ~@{userid} ~@{password} : FTP client with AS400-specific extensions

The ftp command provides a persistent FTP client. You instance a new session and issue ftp with any dash-command against that session.

FTP sessions stored in tuples are autonomic.

The ftp dash-commands are:

Paths on the server should be specified either starting with / (absolute) or ./ (relative).

All filenames and pathnames should be treated as case-sensitive.
Cautions
Example
>  ftp -to @iftp -as400 -mode pas -new pub400.com FOOBERG ********
>  @iftp -pwd
257 "/" is current directory.
>  @iftp  -cd /QSYS.LIB/FOOBERG1.LIB
250 "/QSYS.LIB/FOOBERG1.LIB" is current library.
> @iftp -dir
drwx------   1 FOOBERG      0          57344 Dec 15 03:47 FOO.FILE
-rwx------   1 FOOBERG      0          65536 Dec 10 01:18 MYTEST.SAVF
drwx------   1 FOOBERG      0          77824 Dec 08 04:43 TEST1.FILE
drwx------   1 FOOBERG      0          77824 Dec 08 05:05 TEST2.FILE

Ublu:1:UbluInterpreter.Thread[main,5,main]:INFO:ublu.command.CmdFTPNu.ftp():250 List completed.
> @iftp -cd /home/FOOBERG
250 "/home/FOOBERG" is current directory.
> @iftp -ls ublu/*.jar
-rwx------   1 FOOBERG      0        1253099 Oct 11 19:13 ublu.jar

Ublu:1:UbluInterpreter.Thread[main,5,main]:INFO:ublu.command.CmdFTPNu.ftp():250 List completed.
> @iftp -cd /home/FOOBERG -to @ftp_messages
> put @ftp_messages
250 "/home/FOOBERG" is current directory.
>  @iftp  -cd /QSYS.LIB/FOOBERG1.LIB
250 "/QSYS.LIB/FOOBERG1.LIB" is current library.
> @iftp -type bin -get MYTEST.SAVF
226 File transfer completed successfully.
> as400 -to @pubcom pub400.com FOOBERG ******** 
> savef -create -as400 @pubcom FOOBERG1 MYTEST2
> @iftp -type bin -target /QSYS.LIB/FOOBERG1.LIB/MYTEST2.SAVF -put  MYTEST.SAVF
226 File transfer completed successfully.
> savef -as400 @pubcom -list FOOBERG1 MYTEST2
FOOBERG1/FOO.FILE 36864
Sat Dec 10 01:18:20 MST 2016
PF
*SYSBAS:1

 


> @iftp -disconnect
> @pubcom -disconnect

oldftp

Deprecated, use ftp instead.

Note: This is the old ftp command. If your code used that earlier ftp command and you want that functionality, simply change the command to oldftp.

/3? -cd path | -cmd ${ command string }$ | -disconnect | -get filepath | -list | -put filepath | -pwd | --,-session @session [ -as400 ] [ -from datasink ] [ -to datasink ] [ -mode act/pas ] [ -port portnum ] [ -type asc/bin ] [ -tofile destfile ] ~@{system} ~@{userid} ~@{password} : FTP client with AS400-specific extensions

The ftp command provides FTP client capability with a persistent session or with one-off capability. You may issue ftp with any dash-command and provide system userid password each time, or you can create and preserve a session to use with the various dash-commands.

The main ftp dash-commands are:
Paths on the server should be specified either starting with / (absolute) or ./ (relative).

All filenames and pathnames should be treated as case-sensitive.

Destination for file transfers is by default the source filename in the current directory (server for -put, local for -get). Use either the -tofile filepath or the -to filepath dash-commands to indicate a different destination (on the server for -put, local for -get).

Additionally, settings are changed using the following dash-commands (alongside the main commands, or independently):
Cautions
Persistent sessions

Persistent sessions are provided by the --,-session dash-command. -session @session references a tuple var to hold the persistent session. Referencing a nonexistent tuple var in conjunction with providing the system userid password arguments to the ftp command causes the referenced tuple var to come into existence instanced with the FTP session. Thereafter, the system userid password arguments are not used: instead, each time the ftp command is issued, accompany it with the -session @session reference established at the start, e.g.:

ftp -session @mysess -port 2121 someserver myuserid mypasswd
ftp -session @mysess -cd someserverpath
ftp -session @mysess -mode pas -type bin -get somefile -to localfilepath
ftp -session @mysess -disconnect

FUN

/6.. [-to @tuplename] ( parameter name list ) $[ an execution block possibly spanning lines ]$ : create a functor

A functor is created and put to the destination datasink, most usefully with the -to dash-command to store the new functor in @tuplename.

When the functor is defined, a list of formal parameter names is provided. The names are plain in the list, but if they appear decorated with @@ (e.g., @@mySubstitution) in the execution block which follows the parameter list, then when the functor is invoked via CALL, the arguments, usually tuple names (though plain words are permitted), provided to the CALL command will be substituted for the formal parameter names.

The parameter name list is surrounded by parentheses and separated by one space from the parentheses.

The execution block is surrounded by $[ and ]$ and separated by one space from them.

The entire parameter name list and opening $[ bracket of the execution block must all appear on the same line immediately following the FUN command, though otherwise the execution block can span multiple lines.

Example

FUN -to @fun ( a b c ) $[ put -from @@a put -from @@b put -from @@c ]$
put -from @fun
ublu.util.Functor@5f40727a ( a b c ) $[ put -from @@a put -from @@b put -from @@c  ]$
put -to @aleph ${ this is a }$

put -to @beth ${ and here is b }$
put -to @cinzano ${ la dee dad }$
CALL @fun ( @aleph @beth @cinzano )
this is a
and here is b
la dee dad
FUN -to @fun ( ) $[ put ${ zero param functor }$ ]$
CALL @fun ( )
zero param functor
put -from @fun
ublu.util.Functor@67ba0609 ( ) $[ put ${ zero param functor }$  ]$

FUNC

/7?.. [-to datasink] [ -delete ~@{name} | -get ~@{name} | -list | -show ~@{name}] | name ( parameter name list ) $[ an execution block possibly spanning lines ]$ : define, display, delete, fetch functor for a named function

A function is created and added to the command lookup so that it will be found in normal command processing.

The default operation is -define. This dash-command need not be present.

The operations are:
If the -list or -delete name or -get name or -show name dash-command is present no other arguments should be present, and no arguments are consumed.

When the function is defined, a list of formal parameter names is provided. The names are plain in the list, but if they appear decorated with @@ (e.g., @@mySubstitution) in the execution block which follows the parameter list, then when the function is invoked by command processing, the members of the list, usually tuple names (though plain words are permitted), provided to the function at execution time will be substituted for the matching formal parameter names as they appear in the function.

The parameter name list is surrounded by parentheses and separated by one space from the parentheses.

The execution block is surrounded by $[ and ]$ and separated by one space from them.

The entire parameter name list and opening $[ bracket of the execution block must all appear on the same line immediately following the FUNC command, though otherwise the execution block can span multiple lines.

See also: savesys

Example (from examples/jobstuff.ublu)
 7 # Show all jobs in a joblist
 8 FUNC showJobs ( joblist ) $[
 9     LOCAL @subsys LOCAL @type
10     FOR @j in @@joblist $[
11         put -n -s ${ job }$
12         put -n -s -from @j
13         job -job @j -get subsystem -to @subsys
14         job -job @j -get type -to @type
15         put -n -s -from @subsys put -n -s -from @type
16         put -n -s ${ is owned by }$ job -job @j -get user
17     ]$
18 ]$

as400 -to @myhost MYHOST SOMEUSERID ********
joblist -to @joblist -jobtype INTERACTIVE -active -as400 @myhost
showJobs ( @joblist )
job 329077/FRED/QPADEV001B /QSYS.LIB/QINTER.SBSD I is owned by FRED
job 328723/CREEP/CRP2 /QSYS.LIB/QINTER.SBSD I is owned by CREEP
job 328726/CREEP/CRP1 /QSYS.LIB/QINTER.SBSD I is owned by CREEP
job 329080/SHREDDER/QPADEV0012 /QSYS.LIB/QINTER.SBSD I is owned by SHREDDER

gensh

/5+ [-to datasink] [-strictPosix] [ [-path ~@{fullyqualifiedjarpath}] [-includepath ~@{searchpath}] [-opt optchar assignment_name tuplename ${ description }$ ..] [-optr optchar assignment_name tuplename ${ description }$ ..] [-opts optchar assignment_name ${ description }$ ..] [-optx optchar multiple_assignment_name tuplename ${ description }$ ..] [-prelude ~@{prelude command string ..}] ] ~@{scriptname} ~@{includename} ~@{ functionCall ( @a @b ... ) } : generate launcher shell script

The gensh command generates a shell script to launch Ublu and call your top-level application program function with specified arguments provided via command-line switches as is customary with shell scripts.

In the generated script, arguments are translated from arguments of user-specified shell script options to tuples on the command line of the invocation. The specified file of Ublu functions is included and the specified function is invoked with arguments bound to the translated tuple definitions.

The script composed by gensh is put to the destination datasink. Typically you will write to a file via the -to filename dash-command.

The shell variable $SCRIPTDIR is set to the gensh script's directory at script runtime before any prelude commands are executed and before the Java invocation of Ublu.

The arguments to gensh which you use to create your script are re-created in a comment line at the head of the script to aid in modifying and regenerating the script or in the composition of similar scripts.

The dash-commands are as follows:

The arguments to the command are as follows:

Autogenerated Reserved Options

The following options are automatically added to the generated script and are reserved for gensh itself:

Globbing

Since the asterisk * is so prevalent in IBM i programming and therefore likely to appear in string arguments to gensh scripts, generated scripts by default disable shell globbing (the automatic expansion/substitution of characters such as * ) of the command strings they create.

This allows the user to pass such characters (quoted against globbing at invocation of the script itself) to a gensh script.

foo.ublu:

FUNC foo ( argument ) $[
    put @@argument
]$

... then ...

# a command to ublu to generate the file
gensh -to foo.sh -path /opt/ublu/ublu.jar -includepath $SCRIPTDIR -optr a ARGUMENT @argument ${ Example argument }$ ${ foo.sh: example script }$ foo.ublu ${ foo ( @argument ) }$

Then at runtime ...

$ ls
alpha  beta  foo.sh  foo.ublu

$ ./foo.sh silent -a 'this is a * test'  
this is a * test

$ ./foo.sh glob silent -a 'this is a * test'
this is a alpha beta foo.sh foo.ublu test
Silent includes

If the resultant script generated by gensh is invoked with the keyword silent before all options except the glob option (if the latter is present) (e.g., myscript.sh silent -a foo -b bar ... etc.) then included files do not echo and prompting is suppressed (via the inserted command string props -set includes.echo false props -set prompting false ).

Example

$ cat myFunc.ublu
FUNC myFunc ( xclude fnd ) $[
       put -n -s ${ excluding }$ put -from @@xclude
       put -n -s ${ finding }$ put -from @@fnd
]$

$ java -jar /opt/ublu/ublu.jar gensh -to foo.sh -optx x EXCLUDE @exclude \${ excludes 0 to many things }$ -opt f FIND @find \${ finds something }$ -path /opt/ublu/ublu.jar \${ foo.sh ... do something }$ myFunc.ublu \${ myFunc \( @exclude @find \) }$
$ ./foo.sh -x this -x that -f foo
:: FUNC myFunc ( xclude fnd ) $[
($[)        put -n -s ${ excluding }$ put -from @@xclude
($[)        put -n -s ${ finding }$ put -from @@fnd
($[) ]$
::
excluding  that this
finding  foo
$ ./foo.sh -x this -f foo
:: FUNC myFunc ( xclude fnd ) $[
($[)        put -n -s ${ excluding }$ put -from @@xclude
($[)        put -n -s ${ finding }$ put -from @@fnd
($[) ]$
::
excluding  this
finding  foo
$ ./foo.sh -x this
:: FUNC myFunc ( xclude fnd ) $[
($[)        put -n -s ${ excluding }$ put -from @@xclude
($[)        put -n -s ${ finding }$ put -from @@fnd
($[) ]$
::
excluding  this
finding  null
$ ./foo.sh
:: FUNC myFunc ( xclude fnd ) $[
($[)        put -n -s ${ excluding }$ put -from @@xclude
($[)        put -n -s ${ finding }$ put -from @@fnd
($[) ]$
::
excluding  null
finding  null

Here is the script file generated by gensh

# foo.sh ... do something 
# autogenerated Fri Nov 09 16:15:59 MST 2018 by jax using command:
# gensh -to foo.sh -optx x EXCLUDE @exclude ${ excludes 0 to many things }$ -opt f FIND @find ${ finds something }$ -path /opt/ublu/ublu.jar ${ foo.sh ... do something }$ myFunc.ublu ${ myFunc ( @exclude @find ) }$

# Usage message
function usage {
echo "foo.sh ... do something "
echo "This shell script was autogenerated Fri Nov 09 16:15:59 MST 2018 by jax."
echo "Usage: $0 [glob] [silent] [-h] [-X...] [-Dprop=val] [-x EXCLUDE [-x EXCLUDE ..]] [-f FIND]"
echo "  where"
echo "  -h      display this help message and exit 0"
echo "  -X xOpt     pass a -X option to the JVM (can be used many times)"
echo "  -D some.property=\"some value\" pass a property to the JVM (can be used many times)"
echo "  -x EXCLUDE [-x EXCLUDE ..]  excludes 0 to many things "
echo "  -f FIND finds something "
echo "---"
echo "If the keyword 'glob' appears ahead of all other options and arguments, only then will arguments be globbed by the executing shell (noglob default)."
echo "If the keyword 'silent' appears ahead of all options except 'glob' (if the latter is present), then included files will not echo and prompting is suppressed."
echo "Exit code is the result of execution, or 0 for -h or 2 if there is an error in processing options."
echo "This script sets \$SCRIPTDIR to the script's directory prior to executing prelude commands and Ublu invocation."
}

#Test if user wants arguments globbed - default noglob
if [ "$1" == "glob" ]
then
    set +o noglob # POSIX
    shift
else
    set -o noglob # POSIX
fi

#Test if user wants silent includes
if [ "$1" == "silent" ]
then
    SILENT="-silent "
    shift
else
    SILENT=""
fi

# Process options
while getopts x:f:D:X:h the_opt
do
    case "$the_opt" in
        x)  EXCLUDE="$OPTARG ${EXCLUDE}";;
        f)  FIND="$OPTARG";;
        h)  usage;exit 0;;
        D)  JVMPROPS="${JVMPROPS} -D${OPTARG}";;
        X)  JVMOPTS="${JVMOPTS} -X${OPTARG}";;
        [?])    usage;exit 2;;

    esac
done
shift `expr ${OPTIND} - 1`
if [ $# -ne 0 ]
then
    echo "Superfluous argument(s) $*"
    usage
    exit 2
fi

# Translate options to tuple assignments
if [ "${EXCLUDE}" != "" ]
then
    gensh_runtime_opts="${gensh_runtime_opts}string -to @exclude -trim \${ ${EXCLUDE} }$ "
else
    gensh_runtime_opts="${gensh_runtime_opts}tuple -null @exclude "
fi
if [ "${FIND}" != "" ]
then
    gensh_runtime_opts="${gensh_runtime_opts}string -to @find -trim \${ ${FIND} }$ "
fi

SCRIPTDIR=$(CDPATH= cd "$(dirname "$0")" && pwd)

# Prelude commands to execute before invocation
# No prelude commands

# Invocation
java${JVMOPTS}${JVMPROPS} -Dublu.includepath="" -jar /opt/ublu/ublu.jar ${gensh_runtime_opts} include ${SILENT}myFunc.ublu myFunc \( @exclude @find \) 
exit $?

help

/0 [-to datasink] [[--,-cmd ~@{commandname}] | [-all] | [-version]] [-linelen ~@{optional_line_length}] : display usage and help message

Displays formatted help for commands.

With no dash-command, displays a usage message and lists all built-in commands.

Same as usage.
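
Example

A minimal sketch based on the usage line above:

help -cmd ftp
help -version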

histlog

/0 [-to datasink] [--,-histlog @histlog] [-as400 ~@as400 ] [-new,-instance] [-close] [-get] [-examine] [-jobs ~@listofjobs] [-severity 0-99] [-startdate yyyy/mm/dd] [-enddate yyyy/mm/dd] [-msgids ~@list]  [-msgidsinc omit|select] [-msgtypes ~@list] [-msgtypesinc omit|select] : get (filtered) history log

The histlog command retrieves, converts to a list and puts the host's history log. The log is filtered by the dash-commands specifying inclusions or exclusions.
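
Example

A minimal sketch based on the usage line above; the system name is illustrative, and it is an assumption that filtering dash-commands (here -severity) accompany -get on the same invocation:

as400 -to @sys MYSYS MYUSER ********
histlog -to @hl -as400 @sys -new
histlog -- @hl -to ~ -severity 30 -get
FOR @m in ~ $[ put @m ]$
histlog -- @hl -close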

history -or- h

/0 [-on | -off | -onfile filename | -do linenum [-change expr1 expr2] | -show [-to datasink] | -head numlines | -tail numlines | -name | -range firstline lastline]

The history command manages history files of command lines entered into Ublu. The default history file name is ./Ublu.history.ublu .

History recording is off by default. If turned on, the history file will be created if it does not exist, and appended to if it does already exist.

Ublu remembers the last history file name set, so turning on history recording resumes with the last file name used for history.

When displaying history, the history display ends with the command line before the current one.

If no operation is selected using a dash-command, the default operation is -show.

To turn history recording on in a write-protected directory, be sure to use the -onfile dash-command in place of the -on dash-command.

Note that history can be saved in a tuple and later be the subject of an include.

Example:

> put ${ a a a }$
a a a
> put ${ B B B }$
B B B
> put ${ C C C }$
C C C
> history
1 put ${ a a a }$
2 put ${ B B B }$
3 put ${ C C C }$
> history -show -range 1 3 -to @my_program
> put -from @my_program
put ${ a a a }$
put ${ B B B }$
put ${ C C C }$
> include -from @my_program
:: put ${ a a a }$
a a a
:: put ${ B B B }$
B B B
:: put ${ C C C }$
C C C

host

/3 [-to @var] [-new,-instance] [-port ~@{portnum}] [-ssl ~@tf] [-usessl] ~@{hostname} ~@{user} ~@{password} : instance a smapi host, default port 44444

The host command creates an instance of a TCP/IP reference to a z/VM SMAPI host which can then be used as an argument to the smapi command. This is analogous to the use of an as400 object to access an IBM i server.

The dash-commands are as follows:
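
Example

A minimal sketch based on the usage line above; the host name and credentials are illustrative:

# Instance a z/VM SMAPI host reference for later use with the smapi command
host -to @zvm -new -port 44444 MYZVMHOST zvmuser ********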

IF THEN ELSE

/1 [-!] ~@var : IF tests boolean @var (inverted by -!) and executes THEN $[ cmd cmd .. ]$ if true, ELSE  $[ cmd cmd .. ]$ if false

IF is a conditional execution command. IF tests the tuple variable provided, which is usually set by the test command. The variable contains the boolean value true or false. If true, the THEN block is executed. Otherwise, the ELSE block is executed.

Since the value null can indicate a tuple that has been created but not yet set to any value, the value null causes an error condition when encountered by an IF. Only true or false are valid values.

The execution blocks of THEN and ELSE can span multiple lines, however:

Example
put -to @foo ${ a }$
put -to @bar ${ a }$
test -to @woof -eq @foo @bar
IF @woof THEN $[ put ${ they are equal }$ ]$ ELSE $[ put ${ they are not equal }$ ]$
put -to @foo ${ all the beautiful horses }$
test -to @woof -jcls @foo java.lang.String
IF @woof THEN $[ put ${ tuple is indeed a java String }$ ]$ ELSE $[ put ${ tuple is not a java String }$ ]$

ifs

/4? [-ifs,-- ~@ifsfile] [-as400 @as400] [-to datasink] [-tofile ~@filepath] [-from datasink] [-fromfile ~@{filepath}] [-length ~@{length}] [-offset ~@{offset}] [-pattern ~@{pattern}] [-b] [-t] [-create | -delete | -exists | -file | -list | -mkdirs | -query ~@{[ccsid|name|ownername|owneruid|path|r|w|x]} | -read | -rename ~@{/fully/qualified/path/name} | -set ~@{[ccsid|readonly]} ~@{value} | -size | -write [~@{string }] | -writebin ] ~@{/fully/qualified/pathname} ~@{system} ~@{user} ~@{password} : integrated file system access

The ifs command performs operations on the OS/400 integrated file system, operating on files and directories equally. Use the -file dash-command to create a reference to the IFS object which can be stored in a tuple variable for subsequent use with the -to and -from dash-commands as described below. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

Note: The recommended way to pass an IFS file object already created to the ifs command is via the eponymous dash-command -ifs @ifsfile or -- @ifsfile ; the use of -from and -to to denote the IFSFile object in the syntax noted below is largely historical and now deprecated.

Note that if the -from datasink dash-command is used with the -write dash-command, the -from dash-command must appear in the command invocation before the -write dash-command. The reason is that the -write dash-command will expect to be followed by a quoted string for text to write unless it has already been established that the text to write will come from another datasink.

The operations specified by the dash-commands are as follows. -file is the default operation.
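
Example

A minimal sketch based on the usage line above; the system name and IFS path are illustrative:

as400 -to @sys MYSYS MYUSER ********
# Instance an IFS file reference (-file is the default operation)
ifs -as400 @sys -to @note -file /home/MYUSER/note.txt
# Write text to the file, then query its size and read it back
ifs -- @note -write ${ hello from Ublu }$
ifs -- @note -size
ifs -- @note -read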

interpret

/0 [-block [~@{ block ... }$ | $[ block ...]$]] : run the interpreter, possibly on a provided block

The interpret command runs a nested instance of the interpreter. This command can come at the end of a commandline mode operation to launch the interpreter. If issued in interpretive mode, it stacks another interpreter. One layer of interpreter unnests with each bye command (Ctl-D being the same as bye). All interpreter instances exit together when the exit command is encountered.

An execution block can be provided via the -block dash-command. In this case, the interpreter instance unnests at the end of the block.

Note that the execution block can be passed in $[ block ]$ syntax or in ${ quoted string }$ syntax, or simply as text in tuple variable or on the tuple stack.

Some state is passed back from nested interpreter sublevels to the previous interpreter level. See the material above on nested interpreter sublevels.

Example

> interpret
1> interpret
2> interpret
3> bye
2> bye
interpret -block $[ LOCAL @foo put -to @foo ${ bar }$ tuple -map put -from @foo ]$
@foo
bar
> tuple -map

>
1> bye
> bye

interpreter

/0 [-all | -getlocale | -setlocale ~@{lang} ~@{country} | -getmessage ~@{key} | -args | -opts | -arg ~@{nth} | -opt ~@{nth} | -optarg ~@{nth} | -allargs | -geterr | -getout | -seterr ~@printstream | -setout ~@printstream | -q,-query framedepth|instancedepth|forblock|breakissued|goublu|window] : info on and control of Ublu and the interpreter at the level this command is invoked

The interpreter command relates to the current level of the Ublu interpreter.

Example (examples/redir.ublu)

# FUNC redir  ( fqp block )
# ... fqp is a tuple variable with a filepath
# ... block is an execution block whose standard out is redirected to the file
# E.g., 
# put -to @fpath /foo/bar.txt
# put -to @block $[ put ${ this is a test }$ ]$
# redir ( @fpath @block )
FUNC redir  ( fqp block ) $[
    LOCAL @stdout LOCAL @ps
    interpreter -to @stdout -getout
    calljava -to @ps -new java.io.PrintStream -arg @@fqp
    interpreter -setout @ps
    interpret -block @@block    
    interpreter -setout @stdout
    calljava -method close -obj @ps
]$

include

/1 [-from datasink] [-s,-silent] [-if ~@tf | -!if ~@tf] ~@{filepath} : include commands from a text file or from another datasink for interpretation
include loads all lines from the datasink, or, in the absence of the -from dash-command, from the file specified by filepath and interprets each line of commands as if they were entered interpretively. Multiple commands with their dash-commands and arguments can be entered on a single line. Like any other command, if execution is successful, the system continues to interpret commands found after the include command. There is no particular limit to the length of a line.

If any command in the included file fails, the include aborts with an error message and any other input which was found on the line which held the include command is discarded.

While reading the included file, each input line is echoed preceded by :: and any command output to standard out follows on the next line.

If -silent ( -s) is set, then echoing of input lines and prompting is suppressed until the end of the include.

If a file is included via the command line invocation of Ublu and the output is being captured and it is desirable to remove the input line echo, a pipe to a grep filter can eliminate the input lines, e.g.:

java -jar ublu.jar include somefile.ublu | grep -v '^\:\:' >output.txt

Inclusion of an absolute filepath is done literally as written. Relative paths are done with a special set of rules as follows:

Conditional inclusion can be effected via the -if and -!if dash-commands.

The source of the flag for conditional inclusion can be either a tuple variable or the lifo stack.
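
Example

A minimal sketch of conditional inclusion; the file names and the @env tuple are illustrative:

put -to @env prod
test -to @isprod -eq @env prod
include -if @isprod setup_prod.ublu
include -!if @isprod setup_dev.ublu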

jmx

/0 [-from datasink] [-to datasink] [--,-jmx @jmx_instance] [-obj @obj_instance] [-protocol ~@rmi|iop|?] [-host ~@hostname|hostip] [-port ~@portnum] [-url ~@/remainder_of_url] [-role ~@${ rolename }$ ] [-password ~@${ password }$] [-connect | -close | -new,-instance | -get ~@${}domain ~@${}type ~@${}name | -attrib ~@${ attribute }$ | -attribs ~@${ attrib attrib ... }$ | -cdi ~@attribute | -datakey ~@attribute ~@key | -mbeaninfo |-query [ names | mbeans | class classname]] : perform JMX access to a JVM

The jmx command provides JMX access to a JVM, either remote or local, which has been started with java command line switches allowing such access, e.g., via a java invocation such as:
java -Dcom.sun.management.jmxremote.port=9999 \
-Dcom.sun.management.jmxremote.authenticate=false \
-Dcom.sun.management.jmxremote.ssl=false -jar /opt/ublu/ublu.jar
If the -jmx @jmx dash-command with an instanced tuple variable is provided the -host -port -protocol and -url dash-commands are ignored.

Certain dash-commands require also a JMX ObjectInstance (fetched prior via the jmx -get dash-command). This ObjectInstance is passed to the jmx command in an instanced tuple variable via the -obj @obj_instance dash-command.

The operations are as follows:
Example

To launch a JVM with JMX without authentication running Ublu :
java -Dcom.sun.management.jmxremote.port=9999 \
    -Dcom.sun.management.jmxremote.authenticate=false \
    -Dcom.sun.management.jmxremote.ssl=false \
    -jar /opt/ublu/ublu.jar
To launch a JVM with JMX with plaintext role/password authentication running Ublu :
java -Dcom.sun.management.jmxremote.port=9999 \
    -Dcom.sun.management.jmxremote.authenticate=true \
    -Dcom.sun.management.jmxremote.password.file=/Some/fullpath/chmod700dir/pfile.txt \
    -Dcom.sun.management.jmxremote.ssl=false \
    -jar /opt/ublu/ublu.jar
pfile.txt must be chmod 600 or chmod 400 and should be (if JRE properties are the installed defaults) of the form:
monitorRole s0m3pa55w0rd
controlRole n0th3rpa55wd
Then from anywhere (another machine or the same machine) use Ublu and try the following sequence:

jmx -to @jmx -instance -host example -port 9999
jmx -jmx @jmx -connect
(or jmx -jmx @jmx -connect -role monitorRole -password s0m3pa55w0rd if authentication is used as above)
jmx -jmx @jmx -query names
jmx -jmx @jmx -query mbeans
jmx -jmx @jmx -query class sun.management.MemoryPoolImpl
jmx -jmx @jmx -to @oi -get java.lang MemoryPool ${ Perm Gen }$
put -from @oi
jmx -jmx @jmx -obj @oi -mbeaninfo
jmx -jmx @jmx -obj @oi -attribs ${ CollectionUsageThresholdCount }$
jmx -jmx @jmx -obj @oi -attribs ${ Usage }$

job

/6? [-as400 ~@as400] [--,-job ~@job] [-to datasink] [-refresh] [-end ~@{delaytime} (*CNTRLD delaytime in seconds or -1 for default timeout, 0 means *IMMED) | -get ~@{property([name|number|system|user|description|type])} | -getsys | -hold ~@tf_holdspooledfiles | -info | -new,-instance | -noop | -query ~@{property ([user|curlibname|number|subsystem|status|activejobstatus|user|description|type|auxioreq|breakmsghandling|cachechanges|callstack|ccsid|completionstatus|countryid|cpuused|curlib|date|defaultwait|endseverity|funcname|functype|inqmsgreply|internaljobident|jobactivedate|jobdate|jobenddate|jobentersysdate|joblog|msgqfullaction|msgqmaxsize|jobqueuedate|statusinjobq|switches|outqpriority|poolident|prtdevname|purge|q|qpriority|routingdata|runpriority|scheddate|timeslice|workidunit])} | -release | -spec] ~@{jobName} ~@{userName} ~@{jobNumber} ~@{system} ~@{userid} ~@{password} : manipulate jobs on the host

Manipulates jobs on the host. If the -job @job dash-command with an instanced tuple variable is provided, the jobName, userName and jobNumber must be omitted.   If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

Job objects for use with the job command are typically obtained via the joblist command, e.g.,

as400 -to @as400 myserver myid mypassword
joblist -to @jlist -as400 @as400
FOR @i in @jlist $[ job -- @i -to @u -get number job -- @i -to @s -query status put -n -from @u put -from @s ]$
542365*ACTIVE
542366*ACTIVE
951470*OUTQ
844017*OUTQ
164172*OUTQ

The operations are as follows:

joblist

/3? [-as400 ~@as400] [-to datasink] [-username ~@{userfilter}] [-jobname ~@{jobfilter}] [-jobnumber ~@{jobnumfilter}] [-jobtype ~@{jobtype}] [-active [-disconnected]] system userid passwd : retrieve a (filtered) joblist

joblist retrieves a joblist from system on behalf of userid with password filtered by the filters indicated by the dash-commands -username -jobname and/or -jobnumber. Output is to standard out unless the -to dash-command is present to specify a filename or @variable. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

Additional filters may be applied:

The -jobtype dash-command takes one of the following for its jobtype argument:

The jobtype filter defaults to ALL in the absence of the -jobtype dash-command.

The -active dash-command filters for only those jobs which are active.
The -disconnected dash-command appears only in conjunction with the -active dash-command and, if present, filters for only those active jobs which are disconnected.

Example

The following will end all active jobs in the QINTER subsystem:

as400 -to @as400 mybox qsecofr ******* # create an as400 instance
joblist -to @joblist -jobtype INTERACTIVE -active -as400 @as400 # get the filtered job list
FOR @j @joblist $[ job -job @j -end -1 ]$ # iterate through the list and controlled end (-1) each job

joblog

/0 [-as400 ~@as400] [--,-joblog ~@joblog] [-to datasink] [-msgfile ~@{/full/ifs/path/}] [-onthread ~@tf] [-subst ~@{message_substitution}] [ -add ~@{int_attrib} | -clear | -close | -dir ~@tf | -length | -new ~@{jobname} ~@{jobuser} ~@{jobnumber} | -qm ~@{offset} ~@{number} | -query ~@{dir|name|user|number|sys} | -write ~@{message_id} ~@{COMPLETION|DIAGNOSTIC|INFORMATIONAL|ESCAPE} ] : manipulate job logs on the host

The joblog command provides access to joblogs on the server.

The operations are as follows:

Example (examples/joblogstuff.ublu)

   1 # joblogstuff.ublu
   2 # Example from Ublu Midrange and Mainframe Life Cycle Extension language
   3 # https://github.com/jwoehr/ublu
   4 # Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
   5 # See the Ublu license (BSD-2 open source)
   6 
   7 # Give an as400 a jobname, a jobuser and a jobnumber (the latter 3 all strings)
   8 # print all messages in the joblog for the job. If 'verbose' is true, extended
   9 # message info is displayed.
  10 # E.g.,
  11 #   as400 -to @mysys mysys myuid mypassword
  12 #   tuple -true @tf
  13 #   catJobLog ( @mysys MYDISPDEV MYUID 123654 @tf )
  14 #   tuple -false @tf
  15 #   catJobLog ( @mysys MYDISPDEV MYUID 123654 @tf )
  16 FUNC catJobLog ( sys jobname jobuser jobnumber verbose ) $[
  17     LOCAL @jl
  18     joblog -to @jl -as400 @@sys -new @@jobname @@jobuser @@jobnumber
  19     joblog -to ~ -- @jl -length
  20     joblog -- @jl -to ~ -qm 0 ~
  21     FOR @i in ~ $[ 
  22         IF @@verbose THEN $[
  23             put -from @i
  24         ]$ ELSE $[
  25             msg -- @i -message
  26         ]$
  27     ]$
  28 ]$
  29 

jrnl

This command has been removed along with all other dependencies on jtopenlite. Its functionality can be surpassed using Db2 for i Services QSYS2.JOURNAL_INFO() and QSYS2.DISPLAY_JOURNAL().

json

/0 [-from datasink] [-to datasink] [--,-json ~@json] [ [-add ~@object] | [-addkey ~@key ~@object] | [ -at ~@{index} ~@object ] | [-array] | [-cdl ~@{cdl}] | [-get ~@{index}] | [-key ~@{key}] | [-keys] | [-length] | [-list] | [-object] | [-remove ~@{key}] ] : create and unpack JSON

The json command creates and unpacks JSON interchange format.

The operations are as follows: Example (examples/test/testjson.ublu)
   1 # testjson.ublu
   2 # Example from Ublu Midrange and Mainframe Life Cycle Extension language
   3 # https://github.com/jwoehr/ublu
   4 # Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
   5 # See the Ublu license (BSD-2 open source)
   6 
   7 # Test the json command
   8 
   9 json -to @jsonarray -cdl ${ a , b,    c     ,  d, e, f }$
  10 put @jsonarray
  11 put -to ~  ${ { woof : 1.23e5, arf : "elephants are grey" } }$
  12 json -to @jsonobj -from ~ -object
  13 put @jsonobj
  14 put -to ~ 9.99
  15 json -- @jsonarray -add ~
  16 put @jsonarray
  17 json -- @jsonarray -to ~ -length
  18 lifo -dup
  19 put -n -s ${ length of JSON array: }$ put ~
  20 json -- @jsonarray -at ~ @jsonobj
  21 put @jsonarray
  22 
  23 FUNC listValues ( jsonObj ) $[
  24     LOCAL @keys
  25     json -- @@jsonObj -to ~ -keys
  26     json -- ~ -to ~ -list
  27     FOR @i in ~ $[
  28         json -- @@jsonObj -to ~ -key @i 
  29         put -n -s key: put -n -s @i put -n -s value: put ~
  30     ]$
  31 ]$
  32 listValues ( @jsonobj )
  33 put -to ~ ${ some elephants are pink }$
  34 json -- @jsonobj -addkey more_on_elephants ~
  35 put @jsonobj
  36 listValues ( @jsonobj )
  37 put ${ here is a list created from the JSON array: }$
  38 json -- @jsonarray -list
  39 json -- @jsonobj -remove arf
  40 put @jsonobj

jvm

/0 [-to @datasink] [ -new | -gc | -set ~@{key} ~@{val} | -get ~@{key}] : manipulate or report on the JVM on which this program is executing

license

/0 [-to (@)datasink] : show software license

lifo

/0 [-to datasink] -push @tuplevar | -pop | -popval | -dup | -swap | -over | -pick ~@{0index} | -rot | -depth | -clear | -drop | -show | -true | -false | -null : operate on the tuple stack

The system maintains a Last In, First Out tuple stack. lifo operates on this stack. Many of the operations are upon the top of stack (TOS) item.

Except for -true -false and -null, lifo can push only extant tuples to the tuple stack. To automatically wrapper a non-tuple item in an anonymous tuple and push that tuple to the tuple stack, use put -to ~ item.

The operations are:

Example

> put -to ~ 1
> put -to ~ ${ foo bar woof }$
> calljava -to ~ -forname java.sql.DriverManager
> lifo -null
> num -to @fargle 42
> lifo -push @fargle
> lifo -show
top <== 
        @fargle=[42](class java.lang.Integer)
        null=[null](null)
        null=[class java.sql.DriverManager](class java.lang.Class)
        null=[foo bar woof ](class java.lang.String)
        null=[1](class java.lang.String)

list

/0 [-to datasink] [--,-list ~@list] [[-new,-instance] | [-source,-src ~@enumeration|~@collection|~@string|~@array] | [-add ~@object ] | [-addstr ~@{ some string }] | [-clear] | [-get ~@{intindex}] | [-set ~@{intindex} ~@object] | [-remove ~@object] | [-removeat ~@{intindex}] | [-size,-len] | [-toarray]]: create and manage lists of objects

The list command creates and manages lists of objects, any objects which you add with the -source -add or -set dash-commands. The list can be kept in a tuple variable, further manipulated by the list command when referenced via the -list @list dash-command, and iterated over by FOR.

The operations are as follows:

Example (examples/test/testlist.ublu)

   1 # testlist.ublu ... quick test of the list command
   2 # Example from Ublu Midrange and Mainframe Life Cycle Extension language
   3 # https://github.com/jwoehr/ublu
   4 # Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
   5 # See the Ublu license (BSD-2 open source)
   6 list -to @mylist 
   7 put @mylist 
   8 list -- @mylist -addstr ${ this is a string }$
   9 put @mylist
  10 as400 -to @mysys mysys me abcd1234 
  11 list -- @mylist -add @mysys
  12 put @mylist 
  13 list -- @mylist -set 0 @mysys 
  14 put -from @mylist
  15 list -- @mylist -remove @mysys
  16 put @mylist
  17 list -- @mylist -addstr ${ this is another string }$
  18 put @mylist
  19 list -- @mylist -addstr ${ pardon me asking what's new }$
  20 put @mylist
  21 list -- @mylist -size
  22 list -- @mylist -removeat 2
  23 put @mylist
  24 list -- @mylist -size
  25 list -- @mylist -get 0
  26 list -- @mylist -get 1
  27 put @mylist
  28 list -- @mylist -size
  29 list -- @mylist -clear
  30 put @mylist
  31 list -- @mylist -size

LOCAL

/1 @localvar : declare local variable

LOCAL declares tuple variables local to a block. Any variable declared this way hides any global variable of the same name. Any variable declared this way hides any variable of the same name which is local to a block enclosing the current block. This declaration neither replaces nor alters any variable of the same name which is local to the current block.

A LOCAL variable disappears at the end of the current block, unhiding any enclosing LOCAL declarations or a global declaration.

Declaring a LOCAL outside a block generates an error.

LOCAL variables can be used as arguments to functions called from within the LOCAL-declaring block, but they cannot be used as arguments passed to other threads (see thread and TASK). The LOCAL variable does not exist in the context of the new thread.

Example (examples/test/testlocal.ublu)

   1 # testlocal.ublu
   2 # test LOCAL variable operation
   3 FUNC woof ( a b ) $[
   4     tuple -map
   5     LOCAL @i
   6     FOR @i in @@a $[
   7         tuple -map
   8         LOCAL @q
   9         put -n -s -to @q ${ @q will disappear }$
  10         put -n -s -from @q
  11         put -n -s ${ with a woof upon }$
  12         put -n -s -from @i put -from @@b
  13     ]$
  14       put -from @q
  15       tuple -map
  16       put -from @q
  17 ]$ 
  18 
  19 put -to @z ${ 1 2 3 }$ put -to @x ${ doggie }$ 
  20 woof ( @z @x )
  21 tuple -map
  22 

map

/0? [--,-map ~@map] [-to datasink] [-new | -> ~@tuple | -clear | -add ~@{key} ~@tuple | -~,-push ~@{key} | -.,-get ~@{key} | -drop ~@{key} | -keys | -size] : create and manipulate maps of tuples

The map command creates and operates on key-value maps of tuples.

map objects are autonomic.

The dash-commands are as follows:

Example

> map -to @m
> put -to @foo bar
> put -to @bar woof
> @m -> @foo
> @m -> @bar
> put @m
{foo=ublu.util.Tuple@4e515669 key="@foo" value="bar", bar=ublu.util.Tuple@17d10166 key="@bar" value="woof"}
> @m -~ bar
> put ~
woof
> @m -to @tupletuple -. bar
> put @tupletuple
ublu.util.Tuple@17d10166 key="@bar" value="woof"
> put -to ~ @tupletuple
> put ~
ublu.util.Tuple@17d10166 key="@bar" value="woof"
> put -to ~ @tupletuple
> tuple -value ~
ublu.util.Tuple@17d10166 key="@bar" value="woof"
> tuple -to ~ -value @tupletuple
> put ~
woof
> @m -drop foo
> put -to @xyz ${ this is a test }$
> @m -add zotz @xyz
> put @m
{zotz=ublu.util.Tuple@ee7d9f1 key="@xyz" value="this is a test ", bar=ublu.util.Tuple@17d10166 key="@bar" value="woof"}
    

monitor

/3? [-as400 ~@as400] [-none|-status|-version|-all] system userid passwd : fetch system monitor data and create System Shepherd [TM API] datapoints

monitor performs one pass of gathering a very limited subset of monitoring data on system on behalf of userid with password, putting the data as System Shepherd [TM Absolute Performance, Inc.] datapoints.

Future Ublu monitoring tools will be coded as extensions leveraging Db2 IBM i Services (QSYS2).

The default is -none.

Note: It is a good idea to -disconnect the as400 object used by the monitor command after the command is finished.

Note: The -diskstatus operation is broken in JTOpenLite and has been removed. Monitoring of diskstatus should be done via the Ublu extension extensions/sysshep/sysshep.qsys2.system_status.ublu
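
Example

A minimal sketch, not taken from the original examples: mysys, myuser, and mypasswd are hypothetical placeholders, and the final step assumes the as400 command accepts -disconnect via the -- dash-command as it does its other operations.

# hypothetical system and credentials
as400 -to @mon mysys myuser mypasswd
monitor -as400 @mon -all
# disconnect the as400 object when finished, as recommended above
as400 -- @mon -disconnect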

msg

/0 --,-msg ~@message [-to datasink] [-sender | -user | -key | -fromjob | -fromjobnumber | -fromprogram | -message | -queue] : examine queued messages

The msg command puts components of a queued message to the destination datasink. The -msg ~@message  dash-command must be present. The operations are as follows:
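
Example

A minimal sketch (hypothetical system, credentials, and message queue path): retrieve all messages from a queue via msgq, then use msg to put the sender and text of each.

# hypothetical system, credentials and message queue path
as400 -to @sys mysys myuser mypasswd
msgq -to @mq -as400 @sys -instance /QSYS.LIB/QUSRSYS.LIB/MYUSER.MSGQ
msgq -- @mq -query -all -to @msgs
FOR @m in @msgs $[
    msg -- @m -sender
    msg -- @m -message
]$
msgq -- @mq -close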

msgq

/4? [[-as400 ~@as400] [--,-msgq ~@messagequeue]] [-to datasink] [-new,-instance | -close | -query | -remove messagekey | -removeall | -sendinfo ~@${ message text ... }$ | -sendinquiry ~@${ message text} ~@replyqueueIFSpath | -sendreply messagekey ~@{reply text} | -sendreplybinkey ~@bytearraykey ~@{reply text}] [[-all ] | [[-none] [-reply] [-noreply] [-copyreply]]] ~@system ~@fullyqualifiedifspath ~@userid ~@passwd : send, retrieve, remove or reply messages

The msgq command sends, retrieves, removes or replies to messages in the message queue instance represented by the message queue object supplied to the -msgq dash-command, or in system's queue fullyqualifiedifspath on behalf of userid with password. Output is to standard out unless the -to dash-command is present to specify a filename or @variable. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted. If the -msgq dash-command is provided, then the arguments system fullyqualifiedifspath userid and password must be omitted.

It is recommended that you use -close to close the message queue when you are done with it.

Reading messages

Selectors may be applied to the message retrieval. They are processed in the order read on the command line and complement or override each other as read sequentially. The default is -all. If any selector other than -all is chosen, only that selector and any other specified selectors are included.

Replying to an inquiry message

Examples

msgq mysys /QSYS.LIB/QGPL.LIB/JUST4BCKUP.MSGQ qsecofr ******** (Retrieves all messages)
msgq mysys -reply -copyreply /QSYS.LIB/QSYSOPR.MSGQ qsecofr ******** (Retrieves only messages needing a reply and copies of messages needing a reply)
as400 -to @a mysystem myid mypassword
msgq -as400 @a /QSYS.LIB/QUSRSYS.LIB/MYID.MSGQ # get all messages
ID:  | Key: 00000180 | Severity: 99 | From Job: SOMEJOB | From Job Number: 072176 | From Program: QUICMD | Date: Sun Jan 12 20:35:23 MST 2014 | Sending User: SOMEUSER | Message Help:  | Text: this is a foo  message
msgq -as400 @a -sendreply 180 ${ how's by you }$ /QSYS.LIB/QUSRSYS.LIB/JWOEHR.MSGQ # the message is now replied

Here is a more sophisticated example which retrieves all messages requiring a reply and offers the user a chance to reply to each.

   1 # autoreply.ublu ... Find and reply to all *INQ & *NOTIFY messages in a given MSGQ
   2 # jack j. woehr jwoehr@absolute-performance.com jwoehr@softwoehr.com
   3 # 2015-03-10
   4 
   5 # instance message queue
   6 FUNC getMsgQ ( system user password ifspath msgq ) $[
   7     LOCAL @as400
   8     as400 -to @as400 @@system @@user @@password
   9     msgq -to @@msgq -as400 @as400 -instance @@ifspath
  10 ]$
  11 
  12 # get list of messages needing reply
  13 FUNC getReplyMsgs ( msgq replylist ) $[
  14     msgq -- @@msgq -query -reply -to @@replylist
  15 ]$
  16 
  17 # get messages, walk list and offer user chance to reply to each
  18 FUNC autoreply ( system user password ifspath ) $[
  19     LOCAL @msgq LOCAL @replylist LOCAL @key
  20     LOCAL @answer LOCAL @tf LOCAL @reply
  21     getMsgQ ( @@system @@user @@password @@ifspath @msgq )
  22     getReplyMsgs ( @msgq @replylist )
  23     FOR @msg in @replylist $[
  24         msg -- @msg -to @key -key
  25         put -from @msg
  26         ask -to @answer -say ${ Do you wish to reply to this message? (y/n) }$
  27         test -to @tf -eq @answer y
  28         IF @tf THEN $[
  29             ask -to @reply -say ${ Please enter your reply }$
  30             msgq -- @msgq -sendreplybinkey @key @reply
  31             put ${ Reply sent. }$
  32         ]$
  33     ]$
  34 ]$
  35 

num

/1 [-to (@)datasink] [[-bin] | [-byte] | [-int] | [-short] | [-double] | [-long] | [-float] | [-bigdec] [-radix ~@{radix}] ~@{numstring} : convert string to number class instance

The num command converts a string or number stored in a tuple variable to the specific Java class wrapper type in the specified radix. It is not an error to convert a number to itself.

The -bin conversion converts the first byte of its argument to an unsigned int.

If conversion fails, null is put.

num is used to create typed numeric arguments for commands such as record and calljava.

Example (examples/test/testnum.ublu)
   1 # testnum.ublu ... test the num command
   2 # Example from Ublu https://github.com/jwoehr/ublu
   3 # Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
   4 # See the Ublu license (BSD-2 open source)
   5 
   6 num -to @anInt -int 1234
   7 put -from @anInt
   8 tuple -typename @anInt
   9 num -to @aBigDec -bigdec 1234.56
  10 put -from @aBigDec
  11 tuple -typename @aBigDec
  12 put ${ num -bin will convert "abcd" to the unsigned int value of the letter 'a' }$
  13 num -bin ${ abcd }$ 
  14 

objdesc

/0 [-as400 ~@as400] [-to datasink] [--,-objdesc ~@objdesc] [-path ~@{ifspath}] [-new,-instance] | [-refresh] | [-query exists | library | name | path | type] | [-valuestring ~@{attribute}] | -locks] : examine an object description

The objdesc command retrieves the object description for the object indicated by the -path ~@{ifspath} dash-command on the system indicated by the -as400 ~@as400 dash-command.

The operations are as follows:

There are extensions for this command dealing with the locks in the object lock list returned by objdesc -locks coded in Ublu in extensions/ux.user.ublu
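
Example

A minimal sketch (hypothetical system, credentials, and object path; assumes -new instances the description object, which is then referenced via --):

# hypothetical system, credentials and object path
as400 -to @sys mysys myuser mypasswd
objdesc -as400 @sys -to @od -path /QSYS.LIB/QGPL.LIB/SOMEOBJ.PGM -new
objdesc -- @od -query type
objdesc -- @od -query library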

objlist

/0 [-as400 ~@as400] [-to datasink] [--,-objlist ~@objlist] [-lib libspec ] [-name objname] [-type objtype] [-asp ~@{ALL|ALLAVL|CURASPGRP|SYSBAS}] [-new,-instance] [-list] : retrieve a (filtered) object list

The objlist command retrieves an ObjectList of OS400 objects from the host.

The operations are as follows:

Example

   1 # listobjs.ublu ... list objects
   2 # jwoehr@absolute-performance.com jwoehr@softwoehr.com
   3 # 2015-03-12
   4 # example:
   5 # listObjs ( s0mesys qsecofr xyz1234 *ALL FRED *ALL )
   6 
   7 # list specified objects on specified system to a datasink
   8 # example:
   9 # listObjs ( @sys *ALL FRED *ALL @mylist)
  10 FUNC listObjsTo ( sys lib name type sink ) $[
  11     objlist -to @objlist -as400 @@sys -lib @@lib -name @@name -type @@type
  12     objlist -to @txtlist -- @objlist -list
  13     string -to @stringbuff -new
  14     FOR @obj in @txtlist $[
  15         string -to @stringbuff -cat @stringbuff @obj
  16         string -to @stringbuff -nl @stringbuff
  17     ]$
  18     put -from @stringbuff -to @@sink  
  19 ]$
  20   
  21 # list specified objects on specified system
  22 # example:
  23 # listObjs ( s0mesys qsecofr xyz1234 *ALL FRED *ALL )
  24 FUNC listObjs ( system user password lib name type ) $[
  25     LOCAL @sys LOCAL @objlist LOCAL @txtlist
  26     as400 -to @sys @@system @@user @@password
  27     listObjsTo ( @sys @@lib @@name @@type STD: )
  28 ]$
  29 # end
  30 

outq

/4? [-as400 @as400] [--,-outq ~@outqueue] [-to @var] [-from @qnamevar] [-clear [[user jobuser] | [form formtype] | all]] | [-get ~@{attributename}] | [-getfloat ~@{attr_int}] | [-getint ~@{attr_int}] | [-getstring ~@{attr_int}] | [-hold] | [-new,-instance] | [-noop] | [-release]] outputqueuename system user password : operate on output queues

The outq command operates on output queues. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

If instead the Output Queue is specified by the -outq @outq dash-command, then the outputqueuename system user and password arguments must be omitted, as well as the -as400 @as400 dash-command.

The outputqueuename is a fully-qualified IFS pathname for the queue, e.g., /QSYS.LIB/QUSRSYS.LIB/MYQUEUE.OUTQ

If the -from @qnamevar or -from qnamefile dash-command is provided, the queue name comes from the String in the tuple variable or the first line of the file, respectively. In either case, omit the queue name argument.

The default operation ( represented by the -instance dash-command) is to put the OutputQueue object to the destination datasink, so the -to @varname dash-command can be used to store the object to a tuple variable for later re-use with the -outq dash-command.

The supported operations represented by the dash-commands are as follows:
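
Example

A minimal sketch (hypothetical system and credentials; the queue path is the sample path shown above): instance an output queue, hold it, then release it.

# hypothetical system and credentials
as400 -to @sys mysys myuser mypasswd
outq -as400 @sys -to @q /QSYS.LIB/QUSRSYS.LIB/MYQUEUE.OUTQ
outq -- @q -hold
outq -- @q -release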

ppl

/0 [--,-ppl @ppl] [-new,-instance] [-get[int|float|string] ~@{paramid}] | -set[int|float|string] ~@{paramid} ~@{value} : create and manipulate print parameter list

The ppl command creates a Print Parameter List suitable for use with spoolf -create.

Create an instance with -new (or the deprecated -instance) and then use the instance via the -ppl dash-command to set or get parameters by ID via the -set[int|float|string] and -get[int|float|string] dash-commands.

-instance is deprecated; use -new instead.

printer

/4? [-as400 @as400] [--,-printer ~@printer] [-to @var] [[-get ~@{attr_name}] | [-getfloat ~@{attr_int}] | [-getint ~@{attr_int}] | [-getstring ~@{attr_int}] | [-new,-instance] | [-set ~@{attribute} ~@{value}] [-wtrjob]] ~@{printername} ~@{system} ~@{user} ~@{password} : instance as400 printer and get/set attributes

The printer command allows you to instance a printer and get or set printer attributes. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

If instead the printer is specified by the -printer ~@{printer} dash-command, then the printername system user and password arguments must be omitted, as well as the -as400 @as400 dash-command.

The operations are as follows:
Example

>as400 -to @s mybox myname mypwd
>printer -to @p -as400 @s SOME_LPR
>printer -- @p -get OUTPUT_QUEUE
/QSYS.LIB/QUSRSYS.LIB/SOME_LPR.OUTQ

programcall

/3? [-as400 ~@as400] [-to datasink] -program fullyqualifiedprogrampath [-in ~@tuple ~@{length} ~@{vartypename} [-in ..]] [-inout ~@tuple ~@{length} ~@{vartypename} [-inout] ..] [-msgopt ~@{all|none|10}] [-out ~@tuple ~@{length} ~@{vartypename} [-out ..]] ~@system ~@userid ~@passwd : invoke a program with parameters on the host

The programcall command invokes a program to run on the server and puts an object representing the result of the call.

To use programcall you must understand precisely the program you are invoking, its input and output parameters.

Parameters are added to the array of parameters passed to the program in the order in which you specify them to the command. Any number of parameters used by the program can be specified.

When a variable type is called for, that value is one of:

The dash-commands are as follows:

props

/0 [-to datasink] -set ~@${ name }$ ~@${ value }$ | -get ~@${ name }$ | -list | -read ~@${filepath}$ | -write ~@${filepath}$ ~@${comment}$ : manage properties

The props command sets properties, lists them, and reads or writes properties files.

The interpreter recognizes certain properties and alters its behavior accordingly. The properties currently recognized are as follows:

Each property is listed with its description, default value, and alternative value(s).

includes.echo
  Do included files echo to ERR: as they load?
  Default: true. Alternative: false.

prompting
  Does the system prompt and echo lines of blocks to ERR: as they are interpreted?
  Default: true. Alternative: false.

dbug.tuple.map
  true enables debug messages when the tuple map is manipulated.
  Default: false. Alternative: true.

dbug.tuple
  true enables debug messages when tuples are manipulated.
  Default: false. Alternative: true.

signon.handler.type
  Controls how as400 signon errors (bad userid, bad password) are handled. Valid values are:
  • BUILTIN means signon errors are handled by JTOpen, which attempts to open an AWT window prompting the user and fails mysteriously if no GUI is active.
  • CUSTOM means signon errors are handled by Ublu's handler, which prompts the user in text mode.
  • NULL means signon errors cause the erroneous command to fail without prompting.
  Default: CUSTOM. Alternatives: BUILTIN, NULL.

signon.security.type
  Controls whether as400 instances created subsequently without the explicit dash-commands -ssl or -usessl are secured by the Secure Sockets Layer (SSL). Valid values are:
  • SSL means SSL-secured as400 instances will be created until this property is changed.
  • NONE means as400 instances will be created without SSL security until this property is changed.
  Default: NONE. Alternative: SSL.

ublu.includepath
  Consists of colon-separated path elements indicating the path of included files which are specified as relative paths. This property is imported at startup from the System properties provided by Java if it is set. See include.
  Default: loaded from the system Java properties or the environment, otherwise none. Alternative: colon-separated paths such as /opt/ublu/examples:/opt/ublu/extensions.

ublu.usage.linelength
  A value such as 80 which indicates the length at which to break lines delivered by usage/help descriptions of Ublu commands. This property is imported at startup from the System properties provided by Java if it is set. See usage.
  Default: loaded from the system Java properties, otherwise 80. Alternative: a reasonable value of character width, e.g., 100.
Other properties can be set and retrieved arbitrarily for program usage.

Properties can be loaded from a file or written out to a file at any time. Users may thus save properties files and load them on the command line at invocation to control the behavior of the interpreter.

The operations are as follows:
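
Example

A minimal sketch using an arbitrary user property (the property name, file path, and comment are hypothetical): set and read back a property, list all properties, then save them to a file and reload them.

props -set ${ myapp.greeting }$ ${ hello world }$
props -get ${ myapp.greeting }$
props -list
# hypothetical properties file path
put -to @pf /tmp/my.props
props -write @pf ${ saved by my program }$
props -read @pf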

put

/1? [-to datasink] [-tofile ~@filepath] [-from datasink ] [-fromfile ~@filepath] [-append] [ -toascii ] [ -charset srccharsetname ] [-n] [-s] [ -# ~@{numberstring} | ~@{object or a string} | a single lex ] : put data from datasink to datasink, typically in string form (with some exceptions), optionally translating charset if -toascii or -charset are set

The put command puts data from the source datasink ( Standard Input ( called STD: ), filename, @variable, the Tuple stack signified by ~, or NULL: ) specified by the -from dash-command to the destination datasink ( Standard Output ( called STD: ), filename, @variable, the Tuple stack signified by ~, or NULL: ) specified by the -to dash-command, optionally translating charset if -toascii or -charset are set.

In the absence of -to and/or -from (or -tofile and/or -fromfile) the respective datasinks are by default Standard Input and Standard Output.

The -tofile dash-command takes a tuple name or a tuple from the tuple stack and uses its string value as a filename to set the destination datasink.

The -fromfile dash-command takes a tuple name or a tuple from the tuple stack and uses its string value as a filename to set the source datasink.

If the data is from the command line (STD:), it can be a quoted string, an object in a tuple either named inline (@sometuple) or residing on the tuple stack (~), or a single lex (a single word). In this case its simplest string value is what is put to the destination datasink.

Example

The following example uses the file command to fetch info about the members of an IBM i record file into a tuple @ml then first does a put @ml then a put -from @ml illustrating the difference between the two semantics of put.

> file -to @f -as400 @s -keyed /QSYS.LIB/MYSRC.LIB/QCLSRC.FILE
> file -- @f -to @ml -list
> put @ml
com.ibm.as400.access.MemberList@47c72280
> put -from @ml
ACCESS_PATH_MAINTENANCE:        0
ACCESS_PATH_SIZE:       0
ACCESS_PATH_SIZE_MULTIPLIER:    1
ALLOW_DELETE_OPERATION: false
... etc.

record

/0 [-to @var] [--,-record ~@record] [ -getfmt | -getcontents | -getfield ~@{index} | -getfieldbyname ~@{fieldname} | -getfields | -new | -setcontents ~@contents | -setfield ~@{index} ~@object | -setfieldbyname ~@{fieldname} ~@object | -setfmt ~@format | -tostring ] : manipulate record file records.

The record command manipulates records in classic record files operated on by the file command.

The record itself is either retrieved via file -read or created via record -instance.

Once instanced, the record is referenced via the -- or -record dash-command.

record objects are autonomic.

Ublu provides extensions to handle record formats in the file extensions/ux.format.ublu

The dash-command operations are as follows:
Example

as400 -to @s mysys myid mypasswd
file -as400 @s -to @f -keyed /QSYS.LIB/MYLIB.LIB/QCLSRC.FILE/MYPROG.MBR
file -- @f -open RW
file -- @f -to @r -read CURR
put -from @r
1.00 160803 MAIN: PGM PARM(&P1 &P2)
num -to @i -bigdec 99
record -to @fmt -- @r -getfmt
record -to @newrec
record -- @newrec -setfmt @fmt
record -- @newrec -setfield 0 @i
file -- @f -pos A
file -- @f -write @newrec
file -- @f -close

rs

/0 [--,-rs ~@rs] [-to datasink] [-tofile ~@filepath] [-from datasink] [[-abs ~@{row}] | [-rel ~@{rows}] | [-before] | [-after] | [-first] | [-last] | [-rownum] | [-rawrs] | [-autocommit 0|1] | [-bytes ~@{index}] | [-close{|db|st} [tuplename]] | [-commit ~@resultSet] | [-fetchsize numrows] | [-get ~@{index}] | [-lget ~@{label}] | [-getblob ~@{index}] | [-lgetblob ~@{label}] | -insert | [-json ~@db ~@{tablename}] | [-next] | [-split split_specification] | [-toascii numindices index index ..] | [-metadata]] : operate on result sets

The rs command operates on result sets.

Examples

Inserting the source result set into the destination result set updating the actual destination table while converting from EBCDIC to ASCII:

db -db as400 -query ${ SELECT * FROM ARF.WOOF }$ -to @srctable onesys farfel fred ********
db -db postgres -query ${ SELECT * FROM ARF.WOOF }$ -to @desttable anothersys test fred ********
rs -insert -toascii 3 1 2 3 -from @srctable -to @desttable
rs -close @srctable
rs -close @desttable

Inserting the source result set into the (empty) destination result set of a new table created on the spot updating the actual destination table while converting from EBCDIC to ASCII and splitting the third column of a three (3)-column table into three columns for a total of five (5) columns:

db -db postgres -query_nors ${ create table woof (a varchar, b varchar, c varchar, d varchar, e varchar, primary key (a,b,c,d,e)) }$ onesys test fred ********
db -db as400 -query ${ SELECT * FROM ARF.WOOF }$ -to @srctable anothersys farfel bluto ********
db -db postgres -query ${ SELECT * FROM woof }$ -to @desttable onesys test fred ********
rs -split 3 3 10 40 259 -toascii 3 1 2 3 -from @srctable -to @desttable

savf deprecated, use savef instead

/5 -create | -delete | -exists | -list | -restore | -save [ -lib libname ] [ -obj objectname [ -obj objname ...]] [ -path pathname [ -path pathname ...]] system library savefilename userid password : perform various savefile operations

savf operates on the save file specified by system library savefilename on behalf of userid password. All filenames and pathnames should be treated as case-sensitive.

savef

/2? [-as400 ~@as400] [-to datasink] [--,-savef ~@savef] [ -lib ~@libname ] [ -obj ~@objectname [ -obj ~@objname ...]] [ -path ~@pathname [ -path ~@pathname ...]] [-tolib ~@{libname}] [-create | -delete | -exists | -list | -new | -restore | -save ] ~@{libraryname} ~@{savefilename} : instance and perform various savefile operations

savef instances and operates on a save file. To instance one, specify the IBM i save file via an as400 object together with the libraryname and savefilename arguments. -new is the default operation.

All filenames and pathnames should be treated as case-sensitive.
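
Example

A minimal sketch (hypothetical system, credentials, library, and save file names; assumes that once the instance is supplied via -- the library and save file name arguments are omitted):

# hypothetical system, credentials, library and save file
as400 -to @sys mysys myuser mypasswd
savef -as400 @sys -to @sf MYLIB MYSAVF
savef -- @sf -exists
savef -- @sf -list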

savesys

/0 [-to datasink] [-from datasink] [-merge] [-save | -restore] : save and restore compiled code

The savesys command saves your compiled Ublu code to a file which can subsequently be restored, but only to the exact same level of the Ublu system. The dash commands are as follows:
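
Example

A minimal sketch (the file name is hypothetical; assumes -save writes the compiled code to the -to datasink and -restore reads it back from the -from datasink):

savesys -to mysaved.ublu.sys -save
savesys -from mysaved.ublu.sys -restore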

server

/0 [-to datasink] [-- @listener] [-inetaddr ~@{inetaddr}] [-port ~@{portnum}] [-backlog ~@{backlog}] [-usessl] [-ssl @~t/f] [ -block ~@{executionBlock} | $[execution block]$ ] -getip | -getport | -start | -status | -stop : start, stop or monitor status of a thread server

Starts, stops, queries settings, or displays status of a TCP/IP listener over portnum (defaulting to port 43860) which spawns server threads. The server threads spawned by the listener will either:

  • interpret commands sent over the connection by the client, or
  • execute the block supplied via the -block dash-command and then disconnect.

Whichever is chosen, all server threads spawned by one instance do the same thing.

The -start dash-command puts the listener created, which should be kept in a variable, e.g., @listener, for later use with server -- @listener -status or server -- @listener -stop.

The default operation is -status.

There are extensions to the server command in extensions/ux.server.ublu

There is currently no authentication paradigm. Using the server grants any caller without authentication the full rights of the user who launched Ublu.

Therefore, the safest available way to use the server is to write a FUNC to service the user and call that FUNC in a block provided with the -block $[execution block]$ dash-command. In this mode of operation, when the execution block completes, or when an error is encountered, the server disconnects from the client, so the client is prevented from executing arbitrary commands.

Using SSL with the server
You can instance an Ublu server to accept only SSL connections. Serving up only a specified server program via -block $[execution block]$ and using SSL is the most secure path to creating an Ublu server.

Using SSL requires that you:

  • provide a keystore containing a server certificate and point Java at it via the javax.net.ssl.keyStore and javax.net.ssl.keyStorePassword system properties when invoking Ublu, and
  • start the server with the -ssl @true (or -usessl) dash-command.

Here we will illustrate creating a self-signed certificate and invoking Ublu correctly.
mkdir /opt/ublu/keystores
cd /opt/ublu/keystores
keytool -genkey -trustcacerts -alias ubluservercert1 -keypass aaa4321 -keystore ubluserverstore -storepass aaa4321
What is your first and last name?
  [Unknown]:  Jack
What is the name of your organizational unit?
  [Unknown]:  Ublu HQ
What is the name of your organization?
  [Unknown]:  SoftWoehrLLC
What is the name of your City or Locality?
  [Unknown]:  Fairmount
What is the name of your State or Province?
  [Unknown]:  Colorado
What is the two-letter country code for this unit?
  [Unknown]:  US
Is CN=Jack, OU=Ublu HQ, O=SoftWoehrLLC, L=Fairmount, ST=Colorado, C=US correct?
  [no]:  yes
  
Warning:
The JKS keystore uses a proprietary format. It is recommended to migrate to PKCS12 which is an industry standard format using "keytool -importkeystore -srckeystore ubluserverstore -destkeystore ubluserverstore -deststoretype pkcs12".

java -Djavax.net.ssl.keyStore=/opt/ublu/keystores/ubluserverstore\
  -Djavax.net.ssl.keyStorePassword=aaa4321\
  -Dublu.includepath=/opt/ublu/examples:/opt/ublu/extensions\
  -cp /opt/ublu/ublu.jar:$CLASSPATH\
  ublu.Ublu
  ...
  Type help for help. Type license for license. Type bye to exit.
> server -to @listener -ssl @true -start
>
In another window we can use the openssl command to test our connection.
$ openssl s_client -connect localhost:43860
CONNECTED(00000003)
depth=0 C = US, ST = Colorado, L = Fairmount, O = SoftWoehrLLC, OU = Ublu HQ, CN = Jack
verify error:num=18:self signed certificate
verify return:1
... etc. ....

help
Usage: java ublu.Ublu cmd [arg [arg ..]] [cmd [arg [arg ..]] cmd ..]
        Executes commands left to right.
        If no command is present, interprets input until EOF or the 'bye' command is encountered.

Commands:
        as400           
        ask             
        BREAK           
        bye             
... etc.

Examples using no SSL and connecting via telnet for simplicity:

Ublu main interpretive session
Another shell window
> server -to @listener -start
jwoehr$ telnet localhost 43860
Trying ::1...
Connected to localhost.
Escape character is '^]'.
spoolist mysys qsecofr ******** fred

Ublu:12:UbluInterpreter.Thread[main,5,main]:SEVERE:ublu.util.Interpreter.loop():Command "spoolist" not found.


spoolflist mysys qsecofr ******** fred
XYZ123 85 PROCES FRED 162586 1130628 181030
XYS124 86 PROCES FRED 162586 1130628 181031
XYQ222 86 PROCES FRED 172685 1130712 200655
XYR345 87 PROCES FRED 172685 1130712 200655
...

bye
Connection closed by foreign host.
> server -- @listener -stop

> server -to @listener -block $[ ask -say ${ how are you? }$ -to @foo put -n ${ you are }$  put @foo ]$ -start
$ telnet localhost 43860
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
how are you? : as well as  might be expected
you are as well as  might be expected
Connection closed by foreign host.

session or sess

/0 --,-sess @sess [-to datasink] [-from datasink] [-nt] [[-? ~@${question}$] | [-close] | [-disconnect] | [-dump] | [-getcursor] | [-setcursor x y] | [-send (@)${ send string including [tab] metakeys etc. }$ ]  : interact with a tn5250 session

The session command manipulates a tn5250 session object provided by the tn5250 command. The -sess @sess dash-command must always be present.

The operations are as follows:

sleep

/0 [-m ~@{milliseconds}] [-n ~@{nanoseconds}] Sleep milliseconds (default 0) plus nanoseconds (default 0)

The sleep command sleeps for milliseconds plus nanoseconds. The default for both values, if not set via the -m ~@{milliseconds} and -n ~@{nanoseconds} dash-commands is zero (0).
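
Example

For instance, to pause for half a second plus 250 nanoseconds:

sleep -m 500 -n 250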

smapi

/2? [-to @var] ~@host ~@{funcname} [~@{parm} [~@{parm} ..]] : make a smapi call to a host

The smapi command uses the provided PigIron library to execute a z/VM ® SMAPI function on the host designated by ~@host .

The easiest way to make SMAPI calls is to use the functions mapped one-to-one to the supported PigIron SMAPI calls in the .ublu files found in the directory examples/pigiron and included en masse by the file all_pigiron.ublu found in that directory. These functions instance a PigIron output parameter array in the final argument @paramarray with the returns from SMAPI.

Examples of testing are found in examples/pigiron/test/ ... Copy test.defaults.samp to a filename of your choice (say, my.test.defaults), edit the values for your target system, and execute the shell script test_pigiron.sh -f my.test.defaults to see how this all works.

sock

/0 [-to datasink] [--,-sock ~@sock] [-host ~@{host_or_ip_addr}] [-port ~@{portnum}] [-locaddr ~@{local_addr}] [-locport ~@{local_portnum}] [-usessl] [-ssl @tf] [-instance | -close | -avail | -read ~@{count} | -write ~@bytes] : create and manipulate sockets

The sock command creates and manipulates TCP/IP sockets. Instancing the socket connects it. You can then read, write, examine, and close the socket. Operations are as follows:

Example

> put -to @msg HELO
> string -to @msg -nl @msg
> sock -to @s -host www.well.com -port 80
> sock -- @s -write @msg
> sock -- @s -avail
330
> sock -- @s -to @resp -read 330
> string -frombytes @resp
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="http://www.well.com/">here</a>.</p>
<hr>
<address>Apache Server at <a href="mailto:webmaster@well.com">www.well.com</a> Port 80</address>
</body></html>
> sock -- @s -close

splfol

/0 [-as400 ~@as400] [-to datasink] [--,-splfol ~@splfol] [-addsort ~@{COPIES_LEFT_TO_PRINT | CURRENT_PAGE | DATE_OPENED | DEVICE_TYPE | FORM_TYPE | JOB_NAME | JOB_NUMBER | JOB_SYSTEM | JOB_USER | NAME | NUMBER | OUTPUT_QUEUE_LIBRARY | OUTPUT_QUEUE_NAME | PRINTER_ASSIGNED | PRINTER_NAME | PRIORITY | SCHEDULE | SIZE | STATUS | TIME_OPENED | TOTAL_PAGES | USER_DATA} @tf | -blocksize ~@{ numentries } | -clearsort | -close | -fdate ~@{sy} ~@{sm} ~@{sd} ~@{ey} ~@{em} ~@{ed} | -fdevs ~@list_of_devs | -fform ~@{formType} | -fjob ~@{jobName} ~@{jobUser} ~@{jobNumber} | -fjobsys ~@{sysname} | -foutq ~@{list_of_ifsOutQs} | -fstat ~@list_of[*CLOSED | *DEFERRED | *SENDING | *FINISHED | *HELD | *MESSAGE | *OPEN | *PENDING | *PRINTER | *READY | *SAVED | *WRITING] | -fudata ~@{userData} | -fusers ~@list_of_users | -format ~@{100 | 200 | 300 | 400} | -get | -getsome ~@{offset} ~@{length} | -length | -new | -open | -qblocksize | -qformat | -qsystem] : open list of the spooled files on system sorted and filtered

The splfol command ("Spooled File Open List", based on the QGY* APIs of the IBM i operating system) returns a filtered, sorted list of spooled file list objects. These objects may be examined and the spooled files they represent may subsequently be fetched from the system via the spoolf command.

In contrast to the spoolflist command, which returns a list of spooled files filtered only by a user profile pattern, splfol supports sorting and a rich set of filters: creation date, devices, form type, job, job system, output queues, status, user data, and users.

The Ublu-coded extension extensions/ux.sfol.ublu provides functions pertaining to the splfol object. They are mostly query methods which could have been coded directly into the splfol command but instead are provided as extensions. There is also a function to set start and end of the creation date filter down to the second, as splfol's -fdate dash-command only accepts the dates and automatically sets the start and end times to 00:00:00.

The Ublu-coded extension extensions/ux.sfli.ublu provides constants and functions pertaining to the list items in the list returned by splfol, including deriving a spoolf object from a list item.

The regimen is to instance the open list, set any filters and sort order, open it, fetch its entries, and close it when done.

The dash-commands are as follows:
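
Example

A minimal sketch (hypothetical system, credentials, and user data; assumes filters are applied before -open and that entries are fetched with -getsome):

# hypothetical system, credentials and user data filter
as400 -to @sys mysys myuser mypasswd
splfol -as400 @sys -to @sfol -new
splfol -- @sfol -fudata ${ MYDATA }$
splfol -- @sfol -open
splfol -- @sfol -length
splfol -- @sfol -to @items -getsome 0 10
put -from @items
splfol -- @sfol -close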

See also: spoolf spoolflist

spoolf

/8? [-as400 ~@as400] [--,-spoolf ~@spoolf] [-to datasink] [-tofile ~@filename ] [-pure ~@tf] [[-answermsg ~@{ some text }] | [-copy] | [-copyto ~@remote_as400] | [-copyq ~@outq] | [-create] | [-delete] | -fetch | [-get createdate | createtime | jobname | jobnumber | jobsysname | jobuser | message | name | number] | [-hold [-immed|-pageend]] | [-new,-instance ] | [-move ~@spoolf_before_me] | [-moveq ~@{outq_on_same_system}] | [-release] | [-sendtcp ~@remotesysname ~@remoteprintqueuepath] [-top] [printerfile ~@printerfile] [-ppl ~@ppl] [-outq ~@outq]] ~@{system} ~@{user} ~@{password} ~@{name} ~@{number} ~@{jobname} ~@{jobuser} ~@{jobnumber} : operate on an individual spooled file

The spoolf command operates on an individual spooled file object. You can create the spooled file object directly from a full specification as shown above, or you can use the -spoolf dash-command to provide a tuple referencing a spooled file object either created directly or obtained via the spoolflist command.

The various operations are as follows:

See also: splfol spoolflist

There are extensions for getting attributes in extensions/ux.printobj.ublu.

Example

> as400 -to @local localbox qsecofr *******
> as400 -to @remote remotebox qsecofr *******
> spoolflist -to @src -as400 @local jwoehr
> put -from @src
QSYSPRT 1 XBXUHP2 JWOEHR 418388 1141009 113633
QPJOBLOG 2 XBXUHP2 JWOEHR 418388 1141009 113650

> FOR @i in @src $[ spoolf -spoolf @i -copyto @remote ]$

QPRINT 1 QPRTJOB QSECOFR 019607 1171008 174859
QPRINT 2 QPRTJOB QSECOFR 019607 1171008 174900

> spoolflist -as400 @remote qsecofr

...
QPRINT 1 QPRTJOB QSECOFR 019607 1171008 174859
QPRINT 2 QPRTJOB QSECOFR 019607 1171008 174900

...

as400 -to @server localbox jwoehr *******
outq -to @q -as400 @server /QSYS.LIB/QUSRSYS.LIB/P6.OUTQ
ifs -to @data -as400 @server -read 0 100 /home/jwoehr/test.txt
spoolf -as400 @server -from @data -create -outq @q
QPNPSPRTF 1 QPRTJOB JWOEHR 053887 1160629 194531

spoolflist

/4? [-as400 ~@as400] [-to datasink] ~@{system} ~@{userid} ~@{passwd} ~@{spoolfileowner} : fetch a list of the given user's spooled files as objects

The spoolflist command retrieves a collection of spooled file objects from the host. These objects can then be processed in a FOR loop or put to a datasink.

If the -as400 ~@as400 dash-command is used then the  ~@system ~@userid ~@passwd arguments must be omitted and only the ~@spoolfileowner should be provided.

Example

The following deletes all spooled files for user someuser on system sys1.

as400 -to @sys1 sys1 qsecofr ********
spoolflist -to @list -as400 @sys1 someuser
FOR @i in @list $[ spoolf -spoolf @i -delete ]$

See also: splfol

streamf

/0 [-to datasink] [-from datasink] [--,-streamf @streamfileinstance] [ -list | -new ~@{fqp} | -open ~@{mode RB|RC|W} | -close | -create | -delete | -file | -rename ~@streamf | -mkdirs | -rball | -rcall | -rline | -read ~@{offset} ~@{length} | -write ~@{data} ~@{offset} ~@{length} | -q,-query ~@{qstring [af|ap|c|d|e|f|length|n|p|r|w|x]}] : manipulate stream files

The streamf command manipulates local file system stream files and directories.

The model is to create a streamf instance via -new and pass it via --,-streamf @streamfileinstance to subsequent invocations of the streamf command.

streamf instances stored in tuples are autonomic.

Note that streamf as currently implemented may not behave as expected with other than the default charset. File an issue if you have problems in this regard and I will attempt to address it promptly.

The dash-commands are as follows:

Example

> streamf -to @sd -new /usr/bin
> @sd -list
grub-mkfont
sha1sum
diff
pbmtoatk
ciptool
... 

string

/0 [-to datasink] [--,-string ~@{lopr}] [-limit ~@{ntimes}] [-uchar ~@{ 0x???? 0x???? ...} | -bl ~@{string} | -bls ~@{string} ~@{n} | -cat ~@{string1} ~@{string2} | -charat ~@{intoffset} | -eq ~@{string1} ~@{string2} | -escape ~@{string} | -frombytes ~@byte_array | -len ~@{string} | -new | -nl ~@{string} | -pad ~@{string} ~@{fillchar} ~@{fillcount} | -repl ~@{string} ~@{target} ~@{replacement} | -repl1 ~@{string} ~@{target} ~@{replacement} | -replregx ~@{string} ~@{regex} ~@{replacement} | -split ~@{string} ~@{regex} | -startswith ~@{string} ~@{substr} | -substr ~@{string} ~@{intoffset} ~@{intend} | -tobytes ~@{string} | -toas400 ~@as400 ~@{string} ~@{ccsid} | -toascii ~@as400 ~@bytes ~@{ccsid} | -trim ~@{string} | -unescape ~@{string} | -upcase ~@{string} | -lcase ~@{string} ] : string operations

The string command performs string operations and puts the result to the destination datasink. Its operands are provided with the individual dash-commands specifying the operation. Any and all string operands may be provided as quoted strings, plain words, tuple variables, or tuples popped from the tuple stack.

If the eponymous dash-command [--,-string ~@{lopr}] is present, it takes the place of the sole or left operand of the dash-command denoting the operation. This only applies to sole or left operands which have a string value and does not apply to the dash-commands which operate on byte array arguments.

Strings are autonomic.

There are extensions for strings in extensions/ux.string.ublu .

The default operation is -new .

The operations are:

Examples

> string -uchar 0x09 -to @tab
> put -to @a ${ a string and some tabs }$
> string -to @z -cat @a @tab
> string -to @z -cat @z @tab
> string -to @z -cat @z arf_arf
> put @z
a string and some tabs          arf_arf

> string -to @string -unescape ${ This\tis\na\ test \ . }$
> put @string
This is
a test .
> @string -to @newstring -escape
> put @newstring
This\tis\na\ test\ \ .\
> @newstring -unescape
This is
a test .
>
> string -unescape ${ this\Tisatest }$
Ublu:1:UbluInterpreter.Thread[main,5,main]:SEVERE:ublu.util.Interpreter.loop():Command "string" threw exception
character T not allowed for an escape
java.lang.UnsupportedOperationException: character T not allowed for an escape
> string -unescape ${ This is a unicode escape: "\u2665" }$
This is a unicode escape: "♥"

subsys

/3? [-as400 ~@as400] [--,-subsys ~@subsys] [-to datasink] [-subsyspath ~@{subsysIFSpath}] [-authoritystring ~@{authoritystring}] [-timelimit ~@{intval}] [-assignprivate ~@{sequencenumber} ~@{size} ~@{activityLevel} | -assignshared ~@{sequencenumber} ~@{poolname} | -change [description ~@{text} | displayfile ~@{path} | languagelibrary ~@{lib} | maxactivejobs ~@${int}] | -create | -delete | -end | -endall | -new,-instance | -list | -query [description | activejobs | displayfilepath | languagelibrary | library | maxactivejobs | monitorjob | name | objectdescription | path | pool | pools ~@{sequencenumber} | status | system] | -refresh | -remove ~@{sequencenumber} | -start ] system userid password : manipulate subsystems

The subsys command manipulates OS/400 subsystems. Use the -instance dash-command to instance the subsystem specifying the system via arguments or via -as400 @as400 and the IFS path to the subsystem description via -subsyspath ~@{subsysIFSpath}. Store the resultant object in a tuple variable and use it with the -subsys ~@subsys dash-command to perform operations on the subsystem.

If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

If the -subsys ~@subsys dash-command is provided, then the  -as400 dash-command and the arguments system userid and password must be omitted.

The operations are as follows:
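
Example

A minimal sketch (hypothetical system, credentials, and subsystem description path): instance a subsystem object, then query its status and active job count.

# hypothetical system, credentials and subsystem description path
as400 -to @sys mysys myuser mypasswd
subsys -as400 @sys -to @sbs -subsyspath /QSYS.LIB/QGPL.LIB/MYSBS.SBSD -new
subsys -- @sbs -query status
subsys -- @sbs -query activejobs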

SWITCH

 ~@stringselector [-case ~@{string} $[ block ]$ [[-case ~@{string} $[ block ]$] ...] [-default $[ block ]$] : language switch statement

SWITCH provided with a string selector proceeds to walk thru a series of -case dash-commands looking for a match. If the match is found, the execution block referenced by the -case is executed.

Each -case is accompanied by a case string and a block. If the case string matches the selector, the case is taken, its block is executed, and no other -case or -default will be taken.

Each -default is accompanied only by a block.  If a -default dash-command is encountered before a -case matches, the -default is executed and no following -case will match.

The -case dash-commands and the -default, if any, can be in any order and are processed in the order they appear. The -default once encountered is always taken if no match has been found prior and no other -case or -default will be taken.

Nothing prevents one -case dash-command from having the same case string as another -case .

SWITCH is only easy to use inside another block (e.g., in a FUNC) because a block is parsed as a single line, allowing the -case dash-commands to appear on individual lines, indented. Interactively entered, all the -case dash-commands must appear on one line.

Example (examples/test/testswitch.ublu)
   1 # testswitch.ublu
   2 # test SWITCH statement
   3 #
   4 # Example session:
   5 # > put -to @sel red
   6 # > testSwitch ( @sel )
   7 # you chose red 
   8 # > put -to @sel green 
   9 # > testSwitch ( @sel )
  10 # you chose green 
  11 # > put -to @sel blue
  12 # > testSwitch ( @sel )
  13 # What!?  blue
  14 
  15 FUNC testSwitch ( selector ) $[
  16     SWITCH @@selector
  17         -case red
  18             $[ put ${ you chose red }$ ]$
  19         -case green
  20             $[ put ${ you chose green }$ ]$
  21         -default
  22             $[ put -s -n ${ What!? }$ put @@selector ]$
  23 ]$
  24 # End
  25 

system

/1 [-to datasink] [-from datasink] [-s] ~@{ system command } : execute a system command

Execute a system command.

The command string can be provided as a quoted string, tuple variable, pop from the tuple stack, or from a file if so indicated by the -from dash-command.

If input is from a file, only the first line of the file is read.

The result of the command is a process closure which is put to a datasink specified by the -to dash-command. If the destination datasink is standard out (STD:, the default), the textual result of the command and its return code are displayed.

If the datasink is a tuple, the process closure itself is put to the tuple. The output of the command and its return code can then be obtained from such a tuple @pc via the calljava command, e.g., calljava -method getRc -obj @pc for the return code, as shown in the example below.

If the destination is a tuple, an info logging line is put indicating the return code of the command execution unless the -s dash-command ("silent") was provided.

Note: system is only good for one-time commands, not interactive programs such as telnet.

Example

> system ${ cat syscmd.txt }$
ls -al /Users/jwoehr

return code: 0
> system -from syscmd.txt
total 240
drwxr-xr-x+ 46 jwoehr  staff   1564 Aug  1 00:44 .
drwxr-xr-x   6 root    admin    204 Jun 11 09:30 ..
-rw-------   1 jwoehr  staff      3 Jun 11 09:30 .CFUserTextEncoding
-rw-r--r--@  1 jwoehr  staff  21508 Jul 24 22:15 .DS_Store
drwx------   2 jwoehr  staff     68 Jul 29 10:49 .Trash
... etc.
return code: 0
> put -to @myCommand -from syscmd.txt
> system -from @myCommand -to @myOutput
Ublu:1:UbluInterpreter.Thread[main,5,main]:INFO:ublu.command.CmdSystem.executeCommand():return code: 0
> put -from @myOutput
total 240
drwxr-xr-x+ 46 jwoehr  staff   1564 Aug  1 00:44 .
drwxr-xr-x   6 root    admin    204 Jun 11 09:30 ..
-rw-------   1 jwoehr  staff      3 Jun 11 09:30 .CFUserTextEncoding
-rw-r--r--@  1 jwoehr  staff  21508 Jul 24 22:15 .DS_Store
drwx------   2 jwoehr  staff     68 Jul 29 10:49 .Trash
... etc.
return code: 0
> calljava -method getRc -obj @myOutput
0

sysval

/0 [-as400 ~@as400] [-to datasink] [--,-sysval ~@sysval] [[-new,-instance alc|all|dattim|edt|libl|msg|net|sec|stg|sysctl] | [haskey ~@{ key }] | [-value ~@{ key }] | -set ~@{ key } ~@value] | [-systemvalue] | [-list] | [-map]] : retrieve a system value list

The sysval command gets and sets system values. When instancing, an as400 instance must be provided via the -as400 dash-command. Thereafter the instance is passed to sysval for the other operations via the -sysval dash-command.

There is no default operation. The operations are as follows:

Example

as400 -to @a mysystem myuserid ********
sysval -as400 @a -to @svl -instance all
sysval -- @svl -to @keys -keys
FOR @i in @keys $[ LOCAL @myval put -n -from @i -s put -n ${ : }$ sysval -- @svl -value @i -to @myval put -from @myval ]$
QMODEL : E4D
QDAYOFWEEK : *WED
ALRBCKFP : [*NONE   ]
MAXINTSSN : 200
DDMACC : *OBJAUT
QMINUTE : 48

TASK

/1? [-from ~@datasink] [-to ~@datasink] [-local @tuplename ~@tuple [-local ..]] [-start] $[ BLOCK TO EXECUTE ]$ : create a background thread to execute a block, putting the thread and starting the thread if specified

The TASK command creates a thread to execute a block and optionally launches the thread immediately. TASK puts the thread object it creates, which can be stored in the destination datasink via the -to dash-command for later manipulation by the thread command which can start and stop the thread.

Instead of providing a block as an argument to the TASK command, you may provide a series of commands stored as a string in a file or tuple variable indicated as the source datasink via the -from dash-command.

The new thread inherits the current tuple variable space but will not possess new entries added to other threads subsequent to the launch of this TASK's thread.

-local @tuplename ~@tuple may be used any number of times to assign the value of ~@tuple locally to the thread under the name @tuplename.

It is best to use the tuple stack when providing a tuple value for the task-local variable thus created.

Thus,

# TASK to exit Ublu after a number of milliseconds
FUNC killme ( ms ) $[
    TASK -to NULL: -local @ms @@ms -start $[
        put -n ${ value of @ms is }$ put @ms
        sleep -m @ms exit
    ]$
]$

# This works
FUNC killinseconds ( seconds ) $[ eval -to ~ * @@seconds 1000 killme ( ~ ) ]$

# This works too
FUNC killinseconds2 ( seconds ) $[ LOCAL @ms eval -to @ms * @@seconds 1000 killme ( @ms ) ]$

The new thread starts with the system dictionary. Any additional functions must be added by include or dict -restore.

The -start dash-command starts the thread immediately. The default is to simply put the thread to be later started via the thread command.

test

/0 [-to @var] [[[-eq | -ne] @var|lex @var|lex] | [-null @var ] | [-nnull @var ] [-jcls @var full.javaclass.name]]: compare and return boolean result

test performs object and string tests and comparisons. It can be used to set a tuple variable for use by conditionals such as the IF command or to put the result to the default datasink.

Each of the two objects of the binary tests (-eq, -ne) is either the value of a @tuplevar or a single lex (single word). The following test will return true:

put -to @foo 7
test -eq 7 @foo

The other tests are unary and their object is always a @tuplevar.

The operations known to test each return true if the test succeeds, false otherwise, and are as follows:

If the operands to test are Strings, the Strings are first trimmed of leading and trailing whitespace.

test is primarily for string and object comparison. Use eval for arithmetic and logical operations.

To test for the existence of a tuple variable, use tuple -exists @varname

See the IF  and DO commands for examples.

thread

/0 [-from datasink] [-to datasink ] [--,-thread ~@thread] -new,-instance | -start | -stop : interpret in a background thread

The thread command creates and/or manipulates threads of execution.  When the thread command instances a thread, it puts the thread object it creates, which can be stored in the destination datasink via the -to dash-command for later manipulation. The thread command can also manipulate threads created by the TASK command.

The operations are as follows:
The thread object will display information about itself if put to standard output.
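
Example

A minimal sketch: TASK puts a thread without starting it, and the thread command starts it later.

TASK -to @t $[ put ${ hello from a background thread }$ ]$
thread -- @t -start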

THROW

/1  ~@{log string} : THROW from a TRY block

THROW creates an error condition and a log message.

The error condition will cause exit from a TRY block and entry to the CATCH block.

If there is no enclosing TRY block the THROW will return to the interpreter and the error condition will be reported by the interpreter.

Example

FUNC testTry ( ) $[
    TRY $[
        put ${ outer try block }$
        TRY $[
            put ${ inner TRY block }$
            THROW oops
        ]$ CATCH $[
            put ${ caught the oops }$
        ]$
        put ${ continuing outer TRY }$
    ]$ CATCH $[
        put ${ caught error in outer try block }$
    ]$
]$
testTry ( )
outer try block
inner TRY block
Ublu:1:UbluInterpreter.Thread[main,5,main]:SEVERE:ublu.command.CmdThrow.doThrow():oops
caught the oops
continuing outer TRY

tn5250

/0 [-tn5250 @tn5250] [-to datasink] [[-new,-instance] | [-my5250] | [-run] | [-session] | [-sessionlist]]  [-args ~@${ arg string }$] ~@system : instance a programmable or interactive tn5250j
   
tn5250j is an open source terminal emulator coded in Java providing 5250 display terminal emulation over TCP/IP. This command creates and manipulates instances of tn5250j.

If the -tn5250 @tn5250 dash-command is provided, the ~@system argument must be omitted. Some operations require the -tn5250 @tn5250 dash-command.

Note that at present, when a running tn5250j instance exits, it calls System.exit(), shutting down Ublu as well.

The operations are as follows:

trace

/0   [-tofile ~@{filename}] [-on] [-off] [-set ~@{all|conversion|datastream|diagnostic|error|info|jdbc|pcml|proxy|thread|warning} ~@{on|off}]: set JTOpen tracing

The trace command controls JTOpen tracing. Diagnostic info is streamed to standard out or to a file, as controlled by various means including JTOpen properties files. A full discussion of the configuration and usage of JTOpen tracing is beyond the scope of this document. Consult the JTOpen javadocs for com.ibm.as400.access.Trace ... currently com.ibm.jtopenlite.Trace is not supported by Ublu.

The category settings persist until explicitly changed by the trace command even when tracing is deactivated by -off.

Example

# activate all categories of tracing and turn tracing on
trace -set all on -on
# deactivate tracing without changing categories for the next run
trace -off

TRY

/3 $[ try block ]$ CATCH $[ catch block ]$ : TRY and CATCH on error or THROW

TRY ... CATCH provides a simple error-handling mechanism. If the system encounters an error during execution of the TRY block then the CATCH block is entered. Otherwise, execution continues after the CATCH block.

Ublu itself catches Java exceptions and a large number of error conditions. Any of these conditions or exceptions will cause entry to the CATCH block.

No exception classes are provided and no bookkeeping is performed. Your program itself must place intermediate results in tuple variables to be tested in the CATCH block to determine what happened.

TRY .. CATCH can be nested to any practical depth.

You can also THROW an error in your program.

Example

as400 -to @as400 badname baduser badpasswd
> TRY $[ put -to @foo ${ not connected }$ as400 -- @as400 -connectsvc CENTRAL put -to @foo ${ now connected }$ ]$ CATCH $[ put -from @foo ]$ ...
Ublu:1:UbluInterpreter.Thread[main,5,main]:SEVERE:ublu.command.CmdAS400.as400():Exception connecting to service in as400/3? ...
java.net.UnknownHostException: badname
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:178)
    ...
not connected

tuple

/0 [-assign ~@targetname ~@valuesource | -delete @tuplename | -exists @tuplename | -istuplename @tuplename | -null @tuplename | -true @tuplename | -false @tuplename | -name @tuplename | -realname @tuplename | -value ~@tuplename | -sub @subname ~@tuple | -type ~@tuple | -typename ~@tuple | -map | -autonome ~@tuple | -autonomic ~@tuple | -autonomes ] : operations on tuple variables

Operations on tuple variables.

Example

tuple -true @woof
IF @woof THEN $[ put ${ woof is true }$ ]$ ELSE $[ put ${ woof is false }$ ]$
woof is true
tuple -false @woof
IF @woof THEN $[ put ${ woof is true }$ ]$ ELSE $[ put ${ woof is false }$ ]$
woof is false

usage

See help.

user

/3? [-as400 @as400] [--,-user ~@user] [-to datasink]  [-userprofile ~@{username}] [-enable | -disable | -new,-instance | -query ~@{property} | -refresh ] system userid password : manipulate user profile

The user command instances a User object to manipulate a user profile. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted. If an instanced User object is supplied via the -user ~@user dash-command then the -userprofile dash-command should be omitted.

One way to get one or more User objects is via the userlist command.

The operations are as follows:

The default operation in the absence of any operation dash-command is -instance.

There are extensions for this command coded in Ublu in extensions/ux.user.ublu
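
Example

A minimal sketch (hypothetical system, credentials, and user profile name): instance a User object, then query its name as in the userlist example below.

# hypothetical system, credentials and user profile
as400 -to @sys mysys myuser mypasswd
user -as400 @sys -to @u -userprofile FRED
user -- @u -query name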

userlist

/3? [-as400 ~@as400] [-to datasink] [-userinfo ~@{ALL|USER|GROUP|MEMBER}] [-groupinfo ~@{NONE|NOGROUP|profilename}] [-userprofile ~@{username|*ALL}] ~@{system} ~@{userid} ~@{password} : return a list of users
   


The userlist command puts a UserList object to the destination datasink. If the -as400 dash-command is provided, then the arguments system userid and password must be omitted.

The UserList can be iterated for its individual User objects.

Example

FUNC testUserList ( as400 ) $[
    LOCAL @users
    userlist -to @users -as400 @@as400
    FOR @user in @users $[ user -- @user -query name ]$
]$

as400 -to @as400 mysys myuser mypassword
testUserList ( @as400 )
FRED
JIM
SUZY
QFOO
...

See also: user

watson

/0 [-to datasink] [-h ~@{host}] -s ~@{service-url-part} [-p ~@{watson-parameter [p ~@{watson-parameter ..]] [-m ~@{http_method}] [-req_type ~@{request_content_type}] [-resp_type ~@{response_content_type}] [-req ~@{request_content}] : invoke IBM Watson microservice

The watson command contacts a URL offering IBM Watson ® microservices. By default if no host is specified, the command chooses an IBM public URL offering Watson microservices.

Only HTTPS GET and POST are currently supported.

The dash-commands are as follows:

Examples:

watson -s language-translator/api/v2/translate -p text=hello -p source=en -p target=fr
{
  "translations" : [ {
    "translation" : "Bonjour"
  } ],
  "word_count" : 1,
  "character_count" : 5
}
watson -s language-translator/api/v2/translate -m POST -req ${ { "text": [ "elephant" ], "source": "en", "target": "fr" } }$
{
  "translations" : [ {
    "translation" : "L'éléphant"
  } ],
  "word_count" : 1,
  "character_count" : 8
}

WHILE

/4 ~@whiletrue $[ cmd .. ]$ : WHILE @whiletrue iterate over block

WHILE iterates over its block until the tuple @whiletrue (provided by name on the command line or popped from the tuple stack using the ~ symbol) evaluates false or a BREAK is encountered.

Example

tuple -true @continue
lifo -push @continue
put -to @index -# -3
WHILE ~ $[
        LOCAL @i
        eval -to @index + @index 1
        put -to @i -from @index
        lifo -push @i
        put -s -n ${ index is }$ put -from ~
        eval -to @exit == @index 7
        IF @exit THEN $[ BREAK ]$
]$
put -n -s ${ after WHILE, the index is }$ put -from @index
put -n -s ${ tuple stack depth is }$ lifo -depth

# - or - #!

introduces a comment which continues to the end of the input line, either in interpretive mode or in reading a file via the include command. Like all commands, it must be separated by at least one space from any following non-whitespace in order to be recognized.

Note that line comments introduced by # should not be used inside an execution block, which is treated as a single line so the comment character devours everything to the end of the block. Inside of functions, use the \\ command instead.

The form #! is used as the first line of an executable shell script. The interpreter in such a shell script should be indicated as follows:

#! /fully/qualified/path/to/java -jar /fully/qualified/path/to/ublu.jar include
(Note the blank space after the #! and before the path to java.)

This assumes that ublu.jar resides in a directory containing a lib subdirectory with the several support .jar files used by Ublu (see Invocation).

Note that not all versions of the shell support the invocation syntax just as shown, e.g., it works on Mac Darwin but doesn't with some Linux shells.

The script should be made executable (chmod u+x, optionally adding go+x). At runtime, the include directive at the end of the #! line ends up preceding the name of the script as provided by the shell, thus loading Ublu and executing the command include filename.

Example

$ cat hello.ublu
#! /usr/bin/java -jar /Users/jwoehr/NetBeansProjects/Ublu/dist/ublu.jar include
put ${ Hello, world! }$
$ chmod u+x hello.ublu
$ ./hello.ublu
:: #! /usr/bin/java -jar /Users/jwoehr/NetBeansProjects/Ublu/dist/ublu.jar include
:: put ${ Hello, world! }$
Hello, world!
::

!

/1 history_linenumber

The "bang" command is a convenient shortcut for history -do used to repeat a previous line. History must be on for this command to work.

! 386

is short for

history -do 386

Note that, unlike shell history, a space must separate the "bang" and the history line number.

If ! appears with the argument !, then the most recent command which was not a ! is repeated.

Note that if the above ! ! abbreviation appears embedded in a recalled history line to be executed by ! !, the command line fails and issues an error message for recursive usage.

\\

/1 [-to datasink] ${ some text, usually a comment }$  : redirect a quotation to a datasink or simply make an enclosed comment
The \\ ("comment-quote") command allows you to use a ${ quoted string }$ or single plainword for an enclosed comment. This is useful in defining functions (e.g., with FUNC) because the line comment command # can't be used in a function definition's execution block since it has the effect of consuming the entire remainder of the block.

The string can actually be redirected to a datasink by the -to dash-command, if desired. The default datasink for \\ is set to NULL:

Example (examples/test/testcommentquote.ublu)
   1 # test comment quote
   2 # Example from Ublu https://github.com/jwoehr/ublu
   3 # Copyright (C) 2016 Jack J. Woehr http://www.softwoehr.com
   4 # See the Ublu license (BSD-2 open source)
   5 
   6 FUNC cq ( param ) $[
   7   \\ -to @foo zog  
   8   put -from @@param
   9   \\ -to @bar ${ this is an extended comment }$
  10   put -from @@param
  11   \\ ${ this is also an extended comment }$
  12   put -from @@param
  13   put ${ done function }$
  14 ]$
  15 put -to @myparam ${ a test param }$
  16 cq ( @myparam )
  17 put -from @foo
  18 put -from @bar
  19 # done
  20 

Tips and Tricks

Expert Tool

Ublu is an expert tool for expert users. It is assumed the user is capable of shell scripting and shell one-liners. Examples are given below.

Shortening invocation

An easy way to shorten the Ublu invocation is to create a shell function in .bash_profile, e.g.,

function ublu () {
    java -jar /opt/ublu/ublu.jar "$@"
}

Thereafter you can simply type ublu arg arg ... instead of java -jar /opt/ublu/ublu.jar arg arg ... (Quoting "$@" rather than using a bare $* preserves arguments that contain spaces.)

Making your Ublu script an executable shell script

See the  #! command example.

Recording a session

Either interpretive or simple command-line sessions can be recorded via shell i/o redirection.

java -jar ublu.jar command args command args 2>&1 | tee mysession.txt

Piping input to Ublu

You can pipe input into Ublu:
echo "include somefile.ublu" | java -jar ublu.jar
When the input runs out from the redirection, Ublu will exit normally.
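For multi-line input, a shell here-document also works. This is a minimal sketch, with the delimiter quoted so the shell does not attempt its own expansion of the ${ ... }$ quoting:

java -jar ublu.jar <<'EOF'
put ${ Hello from a here-document }$
put ${ Goodbye }$
EOF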

Note that the interpret command will simply exit in this situation, because when input is redirected in this fashion there is no additional input available for a nested interpreter session.

Delete all spool files created before 2016 for all users

This simple example can easily be modified to operate on the files for a given user, or for another date. The key concept here is that AS400 dates consist of:

1-digit century (0 = 19xx, 1 = 20xx) | 2-digit year | 2-digit month | 2-digit day
as400 -to @mysys mysys myid mypasswd
spoolflist -to @l -as400 @mysys *ALL
list -to @ll -source @l
FOR @i in @ll $[
    LOCAL @date LOCAL @tf
    @i -to @date -get createdate
    eval -to @tf < @date 1160000
    IF @tf THEN $[ @i -delete ]$
]$
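For example, 1151231 encodes 2015-12-31 (century digit 1 = 20xx, year 15, month 12, day 31), which is numerically less than 1160000, so the eval -to @tf < @date 1160000 comparison is true and that spool file is deleted.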

Defluff a CSV

Currently, the ASCII translation of EBCDIC character data is inserted as additional columns alongside the original columns in the output of db -csv if the db type is as400 and the column datatype is BINARY. Want to drop the hex output and keep just the character columns?

java -jar ublu.jar db -db as400 -csv SOMETABLE somehost someschema someuser ******** | awk 'BEGIN {FS=","} {printf("%s,%s,%s\n",$2,$4,$6)}' >SOMETABLE.txt

Splitting a result set from an extant table into a new table

The following program creates a new Postgres table, fetches a result set from an extant DB400 table, and inserts it into the new Postgres table, splitting one column of the extant table into several columns. In the example, the original table has three (3) columns, the last of which contains several different data entities stuffed into the one column.

rssplitapavend.ublu

db -db postgres -query_nors ${ create table ap_avend (active bytea, id bytea, name bytea, addr_1 bytea, addr_2 bytea, addr_3 bytea, city bytea, state bytea, weird_bitmap bytea, buncha_ns bytea, buncha_nonsense bytea, primary key (active,id,name,addr_1,addr_2,addr_3,city,state,weird_bitmap,buncha_ns,buncha_nonsense)) }$ onesys test fred ********
db -db as400 -star ARF.WOOF -to @srctable anothersys farfel fred ********
db -db postgres -star ap_avend -to @desttable onesys test fred ********
rs -split 3 9 35 35 35 35 20 2 44 9 94 -toascii 3 1 2 3 -from @srctable -to @desttable
rs -close @srctable
rs -close @desttable

Replicating a table and copying the data

To replicate table structure and copy in the data, the steps are:

  1. Replicate the empty table structure.
  2. Get a result set from the source table.
  3. Get an (empty) result set from the destination table.
  4. Use the rs command to copy the source result set to the destination result set.
  5. Close the result sets.

Two examples follow: in the first, the primary keys are generated automatically; in the second, the primary keys are provided to the command.

testreplicate.ublu - the primary keys are automatically generated

#! /usr/bin/java -jar /Users/jwoehr/NetBeansProjects/Ublu/dist/ublu.jar include
db -db as400 -replicate ARF.WOOF onesys postgres test fred ********* testsojac farfel fred *********
db -db as400 -query ${ SELECT * FROM ARF.WOOF }$ -to @srctable anothersys farfel fred *********
db -db postgres -query ${ SELECT * FROM ARF.WOOF }$ -to @desttable onesys test fred *********
rs -insert -from @srctable -to @desttable
rs -closedb @desttable
rs -closedb @srctable
put ${ Done replication and data copy. }$

Here is the above script running:

$ ./testreplicate.ublu
:: #! /usr/bin/java -jar /Users/jwoehr/NetBeansProjects/Ublu/dist/ublu.jar include
:: db -db as400 -replicate ARF.WOOF sojac postgres test fred ********* testsojac farfel fred *********
Aug 04, 2013 5:57:46 PM ublu.db.TableReplicator replicate
INFO: Table creation SQL is CREATE TABLE ARF.WOOF ( F00001 varchar NOT NULL, K00001 varchar NOT NULL, F00002 varchar NOT NULL, PRIMARY KEY (F00001, K00001, F00002) )
:: db -db as400 -query ${ SELECT * FROM ARF.WOOF }$ -to @srctable testsojac farfel fred *********
:: db -db postgres -query ${ SELECT * FROM ARF.WOOF }$ -to @desttable sojac test fred *********
:: rs -insert -from @srctable -to @desttable
:: rs -closedb @desttable
:: rs -closedb @srctable
:: put ${ Done replication and data copy. }$
Done replication and data copy.
$

Even simpler is to replace the two query lines above with the following:

db -db as400 -star ARF.WOOF -to @srctable testsojac farfel fred *********
db -db postgres -star ARF.WOOF -to @desttable sojac test fred *********

using db -star to formulate the SELECT * FROM query automatically.

Afterwards of course you can view your handiwork:

java -jar ublu.jar db -db postgres -star ARF.WOOF sojac test fred ********

replicate - the primary keys are provided to the command

Here is an example of replication where it is desired to provide the primary key list manually to the command:

db -db as400 -pklist ${ FOO BAR WOOF ARF ZOTZ }$ -replicate SNORK 1.2.3.4 postgres test fred ******** 5.6.7.8 farfel fred ********
db -db as400 -property ${ block size }$ 512 -star PPPOLICY -to @srctable 5.6.7.8 farfel fred ********
db -db postgres -star SNORK -to @desttable 1.2.3.4 test fred ********
rs -insert -from @srctable -to @desttable
rs -closedb @desttable
rs -closedb @srctable

Saving history as a program

You can save command history as a program, either temporarily in a tuple variable or by writing history lines to a file.
> history -range 214 215
put ${ this is a first line }$
put ${ this is a second line }$

> history -range 214 215 -to @myProgram
> put -from @myProgram
put ${ this is a first line }$
put ${ this is a second line }$

> include -from @myProgram
:: put ${ this is a first line }$
this is a first line
:: put ${ this is a second line }$
this is a second line
> put -from @myProgram -to myProgram.ublu
> put -from myProgram.ublu
put ${ this is a first line }$
put ${ this is a second line }$

> include myProgram.ublu
:: put ${ this is a first line }$
this is a first line
:: put ${ this is a second line }$
this is a second line
>

Searching for tables in the IBM i database

Submitted by Kevin McGinnis

To obtain information on all tables in the IBM i database, use the command:
db -db as400 -query ${ select * from SYSTABLES }$ myhost QSYS2 qsecofr ********
To obtain a result set listing all the tables in a library SOMELIB, the command can be modified to:
db -db as400 -query ${ select * from SYSTABLES where SYSTEM_TABLE_SCHEMA = 'SOMELIB' }$ myhost QSYS2 qsecofr *********
To obtain an ordered list of the tables in a library SOMELIB, the command can be modified to:
db -db as400 -query ${ select TABLE_NAME from SYSTABLES where SYSTEM_TABLE_SCHEMA = 'SOMELIB' order by 1 }$ myhost QSYS2 qsecofr xx

Get a list of active interactive jobs and search that list for specific jobs

as400 -to @as400 MYHOST qsecofr ********
joblist -to @joblist -jobtype INTERACTIVE -active -as400 @as400
put -to @namelist ${ THISUSER THATUSER ANOTHER FRED }$
FOR @j in @joblist $[
    job -job @j -get user -to @user
    FOR @name in @namelist $[
        test -to @match -eq @name @user
        IF @match THEN $[
           put ${ !! }$ put -n ${ We found user }$ put -from @user
           ]$ ELSE $[
           put -n ${ . }$
           ]$
        ]$
]$

Syntax coloring edit modes

There are two syntax coloring edit modes provided with Ublu, one for Vim and one for jEdit. They should be self-explanatory to users of those editors.

Appendix A: Printer Attributes

The following list of printer attributes is used with printer -get. Not all attributes will be recognized for all printers.

End of document