Tuesday, December 20, 2016

Frequently Used DB2 Functions and Expressions

Get the number of days since 0001-01-01
SELECT DAYS('2016-12-25')

Convert a date string to a DATE value
SELECT date(to_date(string_date_column,'YYYY-MM-DD'))

Calculate difference in number of days between two dates
SELECT DAYS('2016-01-01') - DAYS('2015-01-01')

Get day of the month, month, and year
SELECT DAY('2016-12-25'), MONTH('2016-12-25'), YEAR('2016-12-25')

From DB2 v10, dates can be formatted with a desired format string
SELECT VARCHAR_FORMAT(date-column, 'YYYY-MM-DD HH:MI:SS')

Tuesday, December 6, 2016

Frequent Regular Expressions

Here are some frequently used regular expressions.

IP Address:
^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$

Social Security Number:
^\d{3}-?\d{2}-?\d{4}$

Credit Card: [Visa, MC, AmEx, and Discover]
^(?:4[0-9]{12}(?:[0-9]{3})?|5[1-5][0-9]{14}|6(?:011|5[0-9][0-9])[0-9]{12}|3[47][0-9]{13}|3(?:0[0-5]|[68][0-9])[0-9]{11}|(?:2131|1800|35\d{3})\d{11})$

Use the Luhn algorithm to check whether a credit card number is well formed. Of course, the algorithm obviously doesn't check whether the card is actually any good; that must be checked with the card issuer.
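
Here is a minimal sketch of the Luhn check in PHP (the function name and the sample number are only for illustration):

<?php
// Luhn checksum: returns true if the digit string passes the check.
function luhn_ok($number) {
    $digits = strrev(preg_replace('/\D/', '', $number));  // keep digits only, reversed
    $sum = 0;
    for ($i = 0; $i < strlen($digits); $i++) {
        $d = (int)$digits[$i];
        if ($i % 2 == 1) {        // double every second digit from the right
            $d *= 2;
            if ($d > 9) $d -= 9;  // same as adding the two digits together
        }
        $sum += $d;
    }
    return strlen($digits) > 0 && $sum % 10 == 0;
}

var_dump(luhn_ok('4111111111111111'));  // bool(true)
?>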


Monday, November 28, 2016

Renting a Car in South Korea

Introduction:
I travel to South Korea often, but this trip was the first one on which I rented a car. Because public transportation in Korea is ubiquitous, it's not really necessary to rent a car, especially when your destinations are major cities like Seoul and Busan. Since there are many sites discussing how vast public transportation is in Korea, I won't cover it here.

I tried to search for information about what it's like to rent a car in Korea but couldn't find any. Hence, it prompted me to write this post.

International Driving Permit:
Before you leave, you must obtain an International Driving Permit. I obtained mine from a local AAA office. It's nothing more than a booklet with the information off of your own driver's license and your picture. AAA charges $20 (in 2016) for it, and it is good for only one year. The validity period is not determined by AAA but by the international body that established the International Driving Permit system. You can go to the AAA web site to obtain the application. Unless you want to pay AAA extra, bring your own passport-style photo.

Rental Companies:
There may be many companies offering rental cars in Korea; however, I tried only AVIS. The price was very reasonable. Note that AVIS is affiliated with AJ Rent-a-Car in Korea. Asking about AVIS at the airport isn't going to work too well as the folks there don't know AVIS. If you're renting a car from Hertz, you will find that Hertz is affiliated with Lotte Rent-a-Car. At the Incheon Airport, the rental car counters are located at both ends of the arrival terminal. However, you may want to go to the one near gate A instead. The one at the opposite end isn't always manned.

Rental Contract:
Note that the folks at the rental car counter will not speak English as fluently as you may wish. Be specific in how you ask questions and speak slowly whenever possible. You will be asked to present your passport and International Driving Permit. Your own driver's license is not of much use here; without an International Driving Permit, you will NOT be able to rent a car. When you rent a car, you will be asked if you wish to get a GPS (Koreans call it "Navi," pronounced more like "nebi," not GPS). If you do not speak Korean, make sure you get the English version. Without a GPS, it will be quite difficult to get around, as the Korean road system has been evolving significantly. Also, Koreans are not really into telling you street or highway names; they give directions like "turn left" or "turn right." Finally, Korea uses the metric system, so you should get some idea of what 100 meters looks like.

My AVIS rental included a basic collision insurance that covers up to 300,000 won, which is slightly less than $300. If you feel you need more, ask for higher coverage. One other thing to note: if you get into an accident and the damage takes a few days to repair, you are directly liable for the days the car is out of service at half of the daily rental rate. For example, if your daily rate is $20 and two days are needed to repair the damage, you owe $20 ($20 * 50% * 2 days).

Driving:
Driving around is fairly easy. Nothing too tricky. The signs are posted both in Korean and English.
The speed limit is 80 to 100 kph on most highways. You should drive at the posted speed limit; speeders are caught by the traffic cameras. The GPS usually warns of upcoming traffic cameras, so pay attention to what it tells you.
There are many toll highways, so keep plenty of Korean currency on hand to pay the tolls.

On highways, the leftmost lane is often reserved for buses. If the leftmost lane is marked with a blue line, do not drive in it; you will get a ticket. This is about the only situation in which the police actually pull you over.

When you make a left turn, wait for a left arrow. You cannot turn left on green unless a sign is posted specifically allowing it; those signs are, unfortunately, only in Korean.
Right turn on red is allowed when it's safe to do so.

If you see diamond signs painted on the road, they simply mean that a traffic signal is ahead; they do not mark carpool lanes.

There are many tunnels in Korea since about 70% of the country is mountainous. Many bridges and tunnels keep the highways relatively flat.

Use your common sense to decide what to do in most cases. The traffic rules and signs are fairly logical.

Accessing DB2 Mainframe (z/OS) from PHP

To access DB2 on Mainframe (z/OS), no special DB2 library for PHP is required. However, a DB2 ODBC driver must be installed. A driver may be downloaded from the IBM web site:

http://www-933.ibm.com/support/fixcentrl/swg/downloadFixes

For example, "v9.7fp11_ntx64_odbc_cli.zip" may be used.

After downloading it, unzip it in C:\Program Files\IBM.

Once unzipped, execute the following on the command line as the system administrator:

C:> cd "\Program Files\IBM\clidriver\bin"
C:> db2cli install -setup

Use the Administrative tool "Data Sources (ODBC)" to ensure that the driver is installed properly. If not, the command can be executed again; since db2cli is not very verbose, it may take more than one try. Also, note that if you run it as a regular user, it will display the success message but the installation does not actually take place.

Ensure that all components use the same bit-level: for 32-bit PHP, all drivers must be 32-bit; for 64-bit PHP, all drivers must be 64-bit. Also, on Windows, ensure that php.ini includes "extension=php_odbc.dll".

Once the installation is done successfully, you can connect to DB2 using the odbc_connect function as shown here:

<?php
$conn = odbc_connect('Driver={IBM DB2 ODBC DRIVER - C_PROGRA~1_IBM_CLIDRI~1};'
      . 'Database=database;Hostname=hostname;Port=port;'
      , 'username', 'password');
if ($conn === False) {
    die("Failed to connect: " . odbc_errormsg() . "\n");
}
$res = odbc_exec($conn, 'SELECT 9 AS NUM FROM SYSIBM.SYSDUMMY1');
if ($res === False) {
    echo "Exec failed: " . odbc_errormsg() . "\n";
}
?>
Number of fields: <?php echo odbc_num_fields($res); ?>  
Field name: <?php echo odbc_field_name($res, 1); ?>   
Fetching...
<?php
odbc_fetch_row($res);
$val = odbc_result($res, 1);
?>

From here, all of the ODBC functions may be used to interact with DB2.

Thursday, August 11, 2016

Invoking External Commands within PowerShell

There are several ways to execute commands within PowerShell. One can use the Invoke-Command cmdlet or the "&" call operator, but these cannot easily be used if the commands are built dynamically. Suppose that you are gathering up file names matching certain criteria with Get-ChildItem and then want to pass the collected names to another command depending on certain options.

Once you have the file list collected into an array, it can be passed as arguments to a command. For example:

$fileList = @()
Get-ChildItem -path "some-path" | ForEach-Object {
    if ($_.Length -gt 10MB) {
        $fileList += $_.FullName
    }
}
# now execute an external command
& zip.exe output.zip $fileList

A very nice bonus of using this method is that when $fileList is passed to the call operator, it is expanded with spaces between items and quotes around any item that contains one or more spaces. For example, if $fileList contains two items

C:\Program Files\Paint\Paint.exe
D:\Temp\abc.txt

$fileList is going to be expanded as:

"C:\Program Files\Paint\Paint.exe" D:\Temp\abc.txt

The other option is the Invoke-Expression cmdlet. However, it's notoriously difficult to build a command string from an array so that the items containing spaces end up quoted.
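
If you do need Invoke-Expression, you end up quoting the items yourself, roughly like the following sketch (zip.exe and output.zip are the same placeholders as above):

$quoted = $fileList | ForEach-Object {
    # wrap items containing whitespace in double quotes
    if ($_ -match '\s') { '"{0}"' -f $_ } else { $_ }
}
Invoke-Expression ('zip.exe output.zip ' + ($quoted -join ' '))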

Monday, August 8, 2016

Sending HTML Email from Perl

Sending email from perl is a relatively easy exercise. To send text email, you just need to use Net::SMTP like the following:

use Net::SMTP;

sub sendmail($$@)
{
    my $subj = shift;
    my $msg  = shift;
    my $smtp = Net::SMTP->new($o_smtpsvr);
    $subj = "Subject: $subj\n\n" unless $subj =~ /^Subject: /; # blank line separates headers from the body
    $smtp->hello('MyDomain');
    $smtp->reset();
    $smtp->help('');
    $smtp->mail('noreply@xyz.com');
    $smtp->to(@_);
    $smtp->data($subj . $msg);
    $smtp->dataend();
    $smtp->quit;
}

sendmail('hello, world', 'my message', ('abc@xyz.com'));


If you wish to send HTML email instead, it gets a bit more complicated because the message must be formatted as MIME. The easiest way is to use the Email::MIME module from CPAN. Note that Email::MIME is not included with the standard perl distribution while Net::SMTP is.

use Email::MIME;
use Net::SMTP;

my $email_body =<<EOM;
<html>
<body>
Hello, world!
</body>
</html>
EOM

my $email = Email::MIME->create(
    header_str => [
         To      => $email_to,
         Cc      => $email_cc,
         From    => $email_from,
         Subject => $email_subject
    ],
    body_str   => $email_body,
    attributes => {
         content_type => 'text/html',
         charset => 'utf8',
         encoding => 'quoted-printable'
    }
);

my $smtp = Net::SMTP->new($SmtpHostName);
$smtp->hello('MyDomain');
$smtp->reset();
$smtp->help('');
$smtp->mail($email_from);
$smtp->to($email_to);
$smtp->data($email->as_string);
$smtp->dataend();
$smtp->quit;


Wednesday, July 27, 2016

Linux: Zipping Directories into Zipped Files Recursively

Here's a perl script to do it. The idea comes from find2perl, which shows how perl would implement the Unix/Linux find command. Using its output as a base, a simple script can be very powerful. Note the use of $prune: it is equivalent to find's -prune option and is used to stop traversing below the current directory.

#!/usr/bin/perl -w
#
# sqzfps.pl -- compress FPS batch files
#
# CJKim, 27-Jul-2016, Created
#

use strict;
use File::Find ();
use vars qw/*name *dir *prune/;
*name   = *File::Find::name;
*dir    = *File::Find::dir;
*prune  = *File::Find::prune;

my $topdir = 'some-top-directory';
my $aging  = 90; # 365;
my $cputhr = 50;
my $cpuchk = 50;
my $dircnt = 0;
$| = 1;

sub cpu_ok
{
    $dircnt++;
    if ($dircnt > $cpuchk) {
        $dircnt = 0;
        while (1) {
            open CPU, 'vmstat 1 2 | awk \'{if (++c == 4) print $15;}\' |';
            my $idle = <CPU>;
            close CPU;
            $idle += 0;
            last if $idle > $cputhr;
            print "Sleeping for 30 seconds since CPU is at $idle% idle...\n";
            sleep 30;
        }
    }
    return 1;
}

sub wanted
{
    # $_ is the file name only
    # $name is the full path
    # $dir is just the directory name
    # $prune may be set to 1 to stop traversing below the current dir

    my ($dev,$ino,$mode,$nlink,$uid,$gid);

    (($dev,$ino,$mode,$nlink,$uid,$gid) = lstat($_))
    && (-d _)  # directory only
    && (int(-M _) > $aging)  # so many days old
    && ($_ =~ /^\d{16}FF$/)  # fits the name pattern
    && ($prune = 1)  # stop traversing after this one
    && cpu_ok()  # check if we have enough cpu available
    && system "(echo 'Processing $_'; cd $name && zip -m $dir/$_.zip *.* && cd / && rmdir $name)"
    ;
}


# traverse the filesystem
File::Find::find({wanted => \&wanted}, $topdir);



Running the Same Command Periodically and Watching the Output

Have you ever found yourself typing the same command over and over again just to see what's been changing? On a standard Linux system, there is a little-known utility called "watch". It runs a command repeatedly every so many seconds and shows you the output, and it goes one step further by highlighting the differences between the last run and the current one. Yes, it's a clunky command line tool, but it does a beautiful job of saving a lot of typing and figuring out what changed. See the man page for watch(1) for the details and the option flags.
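
For instance, to list a directory every 10 seconds with changes highlighted (the path is just an example):

watch -n 10 -d 'ls -l /var/log'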

I often use it with SQL*Plus to see changes in a database. For example, you can have a SQL statement in a file called, say, count.sql. You can invoke it using SQL*Plus, but you will see the output only once. To see the output periodically, invoke it with watch(1), e.g.

watch -d sqlplus scott/tiger@xe @count

Don't forget to have "EXIT" as the last command in count.sql so that sqlplus would exit after executing the SQL statements.

Cloning a Directory Tree in Linux/Unix

Here's an old command that clones an entire directory tree in Unix or Linux. So far, the command appears to work on all variants of Unix (HP-UX, Solaris, AIX, RedHat, Ubuntu, CentOS, and Fedora):

    $ cd to_the_directory_to_be_cloned  
    $ find . -depth -print | cpio -pdv destination_directory  

While the above command clones all files, find(1) offers many options that let you clone only selected files. For example, if you wish to clone only Java source files, add the option "-name '*.java'", as shown below.
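
A sketch using the same placeholder directories as above:

    $ cd to_the_directory_to_be_cloned  
    $ find . -depth -name '*.java' -print | cpio -pdv destination_directory  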

Friday, June 17, 2016

Accessing DB2 Mainframe (z/OS) from PowerShell


To access DB2 on Mainframe (z/OS), a DB2 ODBC driver must be downloaded from IBM and installed.
A driver may be downloaded from the IBM web site:

http://www-933.ibm.com/support/fixcentrl/swg/downloadFixes

For example, "v9.7fp11_ntx64_odbc_cli.zip" may be used.

After downloading it, unzip it in C:\Program Files\IBM.

Once unzipped, execute the following in the command line as the system administrator:

C:> cd "\Program Files\IBM\clidriver\bin"
C:> db2cli install -setup

Use the Administrative tool "Data Sources (ODBC)" to ensure that the driver is installed properly. If not, the command can be executed again; since db2cli is not very verbose, it may take more than one try. Also, note that if you run it as a regular user, it will display the success message but the installation does not actually take place.

Once you have it, you can then connect to DB2 using System.Data.Odbc.OdbcConnection as shown here:

$db2conn = New-Object System.Data.Odbc.OdbcConnection
$cstr = "Driver={IBM DB2 ODBC DRIVER - C_PROGRA~1_IBM_CLIDRI~1};"
$cstr += "Database=DB2Database;Hostname=DB2Server;Port=5000;"
$cstr += "UID=DB2User;PWD=DB2Pswd;"
$db2conn.ConnectionString = $cstr
$db2conn.Open()
From here, you can use all the ODBC functions to interact with DB2.
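
For example, a quick sanity check can be run with OdbcCommand and ExecuteScalar (SYSIBM.SYSDUMMY1 is DB2's one-row dummy table):

$cmd = New-Object System.Data.Odbc.OdbcCommand('SELECT 9 FROM SYSIBM.SYSDUMMY1', $db2conn)
$cmd.ExecuteScalar()   # returns 9 if the connection works
$cmd.Dispose()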

A prepared statement can be used with ODBC. To do so, use OdbcCommand() to submit a query with each bind variable specified as a ?. For strings, there is no need to surround the ? with quotes. Then supply the values with Parameters.Add(): the first Parameters.Add() corresponds to the first ?, the second to the second ?, and so on. The variable types can be specified with the [System.Data.Odbc.OdbcType] enumeration. The supported column types are: BigInt, Binary, Bit, Char, Date, DateTime, Decimal, Double, Image, Int, NChar, NText, Numeric, NVarChar, Real, SmallDateTime, SmallInt, Text, Time, Timestamp, TinyInt, UniqueIdentifier, VarBinary, and VarChar.

    $qry = @'
SELECT col1, col2, ..., coln
  FROM table
 WHERE col1 LIKE ?
'@
    $stmt = New-Object System.Data.Odbc.OdbcCommand($qry, $db2conn)
    $stmt.Parameters.Add('@col1', [System.Data.Odbc.OdbcType]::varchar, 20).Value = "$col1%"
    try {
        $xr = $stmt.ExecuteReader()

        while ($xr.Read()) {
            # do something with the returned data, which is the form of
            # $xr.GetValue(column-number-starting-at-0)
        }
    }
    catch {
        Write-Host "Executing DB2 $qry failed"
        Write-Host $Error[0].Exception
        Exit(-2)
    }
    $stmt.Dispose()
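
When you are done with the results, close the reader and the connection; a small cleanup sketch:

    $xr.Close()
    $db2conn.Close()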


Saturday, May 14, 2016

PHP: Serve up a CSV file from SQL Server

Here's a simple PHP script to build a CSV file to be sent to the users. The script is designed to work with any SQL statement by dynamically determining the column names and the number of columns. Note that datetime values are returned as strings so that no additional post-processing is necessary.

    // connect to sql server
    $connInfo = array('Database' => 'database_name',
                      'UID'      => 'sqlsvr_login',
                      'PWD'      => 'password',
                      'ReturnDatesAsStrings' => true);
    $mssql = sqlsrv_connect('sqlserver', $connInfo);
    if ($mssql === false) {
        die('<pre>' . print_r(sqlsrv_errors(), true) . '</pre>');
    }
    // execute the stored procedure to get the report data
    $sql = 'SELECT xxx FROM xxx WHERE xxx';
    $res = sqlsrv_query($mssql, $sql);
    if ($res === false) {
        die('<pre>' . print_r(sqlsrv_errors(), true) . '</pre>');
    }
    // get the column names
    $cols = sqlsrv_field_metadata($res);
    $ncol = sqlsrv_num_fields($res);
    $content = '';
    for ($i = 0; $i < $ncol; $i++) {
        if ($i > 0) $content .= ',';
        $content .= $cols[$i]['Name'];
    }
    $content .= "\n";
    // get the rows
    while ($row = sqlsrv_fetch($res)) {
        for ($i = 0; $i < $ncol; $i++) {
            if ($i > 0) $content .= ',';
            $content .= sqlsrv_get_field($res, $i);
        }
        $content .= "\n";
    }
    // we are done
    sqlsrv_free_stmt($res);
    sqlsrv_close($mssql);
    // write to a temporary file to be served up
    $file = tempnam(sys_get_temp_dir(), 'csv');
    file_put_contents($file, $content);

    // start sending the results back to the user
    header('Content-Disposition: attachment; filename="some.csv"');
    header('Content-Length: ' . filesize($file));
    header('Content-Type: application/octet-stream;');
    readfile($file);
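
Note that the loops above do not quote fields that contain commas, quotes, or newlines. If your data may contain them, PHP's fputcsv() handles the quoting; here is a variation of the header and row loops, assuming the same $res, $cols, and $file as above:

    // build the CSV with fputcsv() so embedded commas/quotes are escaped
    $fh = fopen($file, 'w');
    fputcsv($fh, array_map(function ($c) { return $c['Name']; }, $cols));
    while ($row = sqlsrv_fetch_array($res, SQLSRV_FETCH_NUMERIC)) {
        fputcsv($fh, $row);
    }
    fclose($fh);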



Friday, May 13, 2016

perl DBI-ODBC for SQL Server on Linux

perl DBI is an amazingly versatile tool to support all sorts of database connectivity including Microsoft SQL Server. However, the driver for SQL Server is not included with any perl distribution. The connection topology for perl to SQL Server is:

perl --> DBI --> DBD::ODBC --> unixODBC --> Microsoft Driver --> SQL Server

First, DBI is required. Fortunately, every modern perl distribution comes with DBI, so nothing needs to be done for this one.

Next, an ODBC driver is required. One can use FreeTDS, but its SQL Server support is not complete. (Note that FreeTDS is needed to connect to Sybase ASE.) It's better to get a driver from Microsoft; thankfully, Microsoft offers a SQL Server driver for Linux, which can be downloaded from:

https://msdn.microsoft.com/en-us/library/hh568451%28v=sql.110%29.aspx

The driver is usually installed in /opt/microsoft but it can be installed just about anywhere.

Next, unixODBC is required. This can be obtained from unixodbc.org. Download the latest version, compile, and install. For illustration purposes, assume that it's installed under /usr/local.

unixODBC creates two files in its directory (e.g. /usr/local/etc): odbc.ini and odbcinst.ini

odbcinst.ini is where the lower driver is specified.
odbc.ini is where DSNs are specified.

unixODBC will first check the user's home directory for .odbc.ini. If not found, it checks for /etc/odbc.ini. If not found, it uses the one in its own directory (e.g. /usr/local/etc).

Since the low-level driver requires the unixODBC libraries, the location of the libraries (e.g. /usr/local/lib) must be specified with the environment variable LD_LIBRARY_PATH.

Without it, your program will die with a "segmentation fault" and no warning or error message. Sometimes you may also get an "SQLGetPrivateProfileStringW" error.
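
For example, in the shell that runs the perl script (assuming unixODBC was installed under /usr/local as above):

    $ export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH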

Next, DBD::ODBC for perl is required. This can be installed using cpan command.

$ cpan
cpan> install DBD::ODBC

If everything is good, it should install successfully. Some of the tests will probably fail or be skipped, but don't worry about them as long as the module gets installed.
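
To verify that the module is available, a quick check is to print its version:

    $ perl -MDBD::ODBC -e 'print "$DBD::ODBC::VERSION\n"'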

A perl script can then be written using the DBI functions. When connecting to SQL Server, use the following:

$mssql = DBI->connect('dbi:ODBC:DSN=dsnName;UID=sqlserver-login;PWD=password');

where dsnName is one of the names in square brackets defined in odbc.ini. An entry in odbc.ini includes the server name and a driver name, and that driver name must be defined in odbcinst.ini.
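
A minimal sketch of the two files, assuming unixODBC lives under /usr/local and the Microsoft driver under /opt/microsoft; the section names, server, database, and driver library path are placeholders to be replaced with what your driver's install notes give you.

/usr/local/etc/odbcinst.ini:

    [SQLServerDriver]
    Description = Microsoft ODBC Driver for SQL Server
    Driver      = /opt/microsoft/path-to/libmsodbcsql.so

/usr/local/etc/odbc.ini:

    [dsnName]
    Driver   = SQLServerDriver
    Server   = sqlserver-host,1433
    Database = mydatabase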

Wednesday, May 11, 2016

Excel: Importing CSV Files Created in Different Locale Settings

When you work with a locale that is different from yours, something as simple as importing CSV files can become a major problem. For example, in the US, you would use the comma as the field separator and the period as the decimal separator. But this is not universal: in much of Europe, the comma is the decimal separator, the semicolon is the field separator, and the period is the thousands separator. You could replace these characters by hand to match your locale, but Excel can do it for you with a simple macro.

There are two settings:
     Application.DecimalSeparator
     Application.UseSystemSeparators

Importing CSV files can be adjusted based on these two settings. Here's a general idea to follow:

  1. Save the current settings to variables
  2. Ask what DecimalSeparator ought to be or determine it from the input file
  3. Set UseSystemSeparators to False after changing DecimalSeparator
  4. Import the CSV file using the ActiveSheet.QueryTables.Add function while setting .TextFileTabDelimiter, .TextFileSemicolonDelimiter, .TextFileCommaDelimiter, .TextFileSpaceDelimiter, and/or .TextFileOtherDelimiter. If the delimiter is not Tab, Semicolon, Comma, or Space, set TextFileOtherDelimiter to the separator
  5. Restore DecimalSeparator to the original value and set UseSystemSeparators back to True

Excel should then magically convert commas to periods and periods to commas, e.g. 2.450,23 becomes 2,450.23.
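
Here is a minimal VBA sketch of the steps above. The file path, the destination range, and the assumption that the source file uses ";" as the field separator and "," as the decimal separator are placeholders:

Sub ImportEuropeanCsv()
    Dim savedSep As String
    Dim savedUse As Boolean

    ' 1. save the current settings
    savedSep = Application.DecimalSeparator
    savedUse = Application.UseSystemSeparators

    ' 2-3. switch to the decimal separator used by the input file
    Application.DecimalSeparator = ","
    Application.UseSystemSeparators = False

    ' 4. import the semicolon-delimited file
    With ActiveSheet.QueryTables.Add(Connection:="TEXT;C:\Temp\data.csv", _
                                     Destination:=Range("A1"))
        .TextFileParseType = xlDelimited
        .TextFileSemicolonDelimiter = True
        .TextFileCommaDelimiter = False
        .Refresh BackgroundQuery:=False
    End With

    ' 5. restore the original settings
    Application.DecimalSeparator = savedSep
    Application.UseSystemSeparators = savedUse
End Sub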

Tuesday, May 10, 2016

VBA: Recognizing Unix-style Files

I was trying to figure out a good way to determine whether a file being read uses CR/LF (Windows/DOS) or LF (Unix/Linux) line terminators. Since VBA's Line Input assumes that the file is in the Windows format (CR/LF line terminator), it reads in the entire file if the file is in the Unix format. Here's a function that determines how many columns are in a CSV file regardless of the format (except for the old Mac format, which terminates each line with CR):
' ----------------------------------------------------------------------------
' get the number of columns by reading the 1st line, which should be the header
'
Private Function GetNumCols(pFile) As Integer
    Dim data As String
    Dim temp As Variant

    Open pFile For Input As #1
    Line Input #1, data
    Close #1
    ' play the following trick in case the CSV is in the Unix format
    ' algorithm:
    ' 1. read one line; if it's a Unix file, Line Input reads the entire file
    ' 2. split what was read on the linefeed character
    ' 3. if the split produced more than one element, it's a Unix file
    ' 4. if Unix, split the first element (the header) on ","
    ' 5. if Windows, split the input line on ","
    temp = Split(data, vbLf)
    If UBound(temp) > 0 Then
        temp = Split(temp(0), ",")
    Else
        temp = Split(data, ",")
    End If
    GetNumCols = UBound(temp) + 1
End Function

SilkPerformer: Comments in Data Files

Often I need to include or exclude certain lines in the input data files. Since SilkPerformer does not support this directly, I use the following pattern:
// get the login info
    FileCSVLoadGlobal(ghFileLogin, gcsFileLogin);
    loop
      FileGetNextUniqueRow(ghFileLogin);
      sUsername := FileGetCol(ghFileLogin, 1, STRING_COMPLETE);
      sPassword := FileGetCol(ghFileLogin, 2, STRING_COMPLETE);
      // skip comment lines; exit the loop at the first real row
      if Substr(sUsername, 1, 1) <> "#" then exit end;
    end;
This way, any line that starts with a hash sign (or pound sign) can safely be ignored.