Saturday, May 14, 2016

PHP: Serve up a CSV file from SQL Server

Here's a simple PHP script that builds a CSV file from a query result and sends it to the user. The script is designed to work with any SQL statement by determining the column names and the number of columns dynamically. Note that the ReturnDatesAsStrings connection option makes datetime values come back as strings, so no additional post-processing is necessary.

    // connect to sql server
    $connInfo = array('Database' => 'database_name',
                      'UID'      => 'sqlsvr_login',
                      'PWD'      => 'password',
                      'ReturnDatesAsStrings' => true);
    $mssql = sqlsrv_connect('sqlserver', $connInfo);
    if ($mssql === false) {
        die('<pre>' . print_r(sqlsrv_errors(), true) . '</pre>');
    }
    // run the query that produces the report data
    $sql = 'SELECT xxx FROM xxx WHERE xxx';
    $res = sqlsrv_query($mssql, $sql);
    if ($res === false) {
        die('<pre>' . print_r(sqlsrv_errors(), true) . '</pre>');
    }
    // get the column names
    $cols = sqlsrv_field_metadata($res);
    $ncol = sqlsrv_num_fields($res);
    $content = '';
    for ($i = 0; $i < $ncol; $i++) {
        if ($i > 0) $content .= ',';
        $content .= $cols[$i]['Name'];
    }
    $content .= "\n";
    // get the rows
    while (sqlsrv_fetch($res)) {
        for ($i = 0; $i < $ncol; $i++) {
            if ($i > 0) $content .= ',';
            $content .= sqlsrv_get_field($res, $i);
        }
        $content .= "\n";
    }
    // we are done
    sqlsrv_free_stmt($res);
    sqlsrv_close($mssql);
    // write to a temporary file
    $file = tempnam(sys_get_temp_dir(), 'csv');
    file_put_contents($file, $content);

    // start sending the results back to the user
    header('Content-Disposition: attachment; filename="some.csv"');
    header('Content-Length: ' . filesize($file));
    header('Content-Type: application/octet-stream');
    readfile($file);
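
Note that the snippet above does not quote field values, so a value containing a comma or a newline would break the CSV. If that is a concern, PHP's built-in fputcsv() handles the quoting and escaping; here is a minimal sketch of an alternative row loop using it (the php://temp stream is just one way to collect the output):

    // alternative: build the CSV with fputcsv() for proper quoting/escaping
    $fh = fopen('php://temp', 'r+');
    // header row from the column metadata
    $names = array();
    foreach ($cols as $col) {
        $names[] = $col['Name'];
    }
    fputcsv($fh, $names);
    // data rows; sqlsrv_fetch_array() returns each row as a numeric array
    while ($row = sqlsrv_fetch_array($res, SQLSRV_FETCH_NUMERIC)) {
        fputcsv($fh, $row);
    }
    rewind($fh);
    $content = stream_get_contents($fh);
    fclose($fh);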



Friday, May 13, 2016

perl DBI-ODBC for SQL Server on Linux

perl DBI is an amazingly versatile tool that supports all sorts of database connectivity, including Microsoft SQL Server. However, the driver for SQL Server is not included with any perl distribution. The connection topology for perl to SQL Server is:

perl --> DBI --> DBD::ODBC --> unixODBC --> Microsoft Driver --> SQL Server

First, DBI is required. Fortunately, every modern perl distribution comes with DBI, so nothing needs to be done for this one.

Next, an ODBC driver for SQL Server is required. One can use FreeTDS, but its SQL Server support is not complete. (Note that FreeTDS is still needed to connect to Sybase ASE.) It's better to use the driver Microsoft provides for SQL Server on Linux, which can be downloaded from:

https://msdn.microsoft.com/en-us/library/hh568451%28v=sql.110%29.aspx

The driver is usually installed in /opt/microsoft but it can be installed just about anywhere.

Next, unixODBC is required. This can be obtained from unixodbc.org. Download the latest version, compile, and install it. For illustration purposes, assume it's installed under /usr/local.

unixODBC creates two files in its directory (e.g. /usr/local/etc): odbc.ini and odbcinst.ini.

odbcinst.ini is where the low-level driver is specified.
odbc.ini is where DSNs are specified.
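
For illustration, the two files might look like the following (the driver name, path, and version are assumptions; check what the Microsoft installer actually put under /opt/microsoft):

odbcinst.ini:

  [ODBC Driver 11 for SQL Server]
  Description = Microsoft ODBC Driver 11 for SQL Server
  Driver      = /opt/microsoft/msodbcsql/lib64/libmsodbcsql-11.0.so.2270.0

odbc.ini:

  [dsnName]
  Driver   = ODBC Driver 11 for SQL Server
  Server   = sqlserver.example.com,1433
  Database = database_name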

unixODBC will first check the user's home directory for .odbc.ini. If not found, it checks for /etc/odbc.ini. If not found, it uses the one in its own directory (e.g. /usr/local/etc).

Since the low-level driver requires the unixODBC libraries, the location of those libraries (e.g. /usr/local/lib) must be included in the LD_LIBRARY_PATH environment variable.

If it is not set, your program will die with a "segmentation fault" and no warning or error messages. Sometimes you may also get an error mentioning "SQLGetPrivateProfileStringW".
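
For example, in a Bourne-style shell, using the example install location above:

$ export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH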

Next, DBD::ODBC for perl is required. This can be installed using the cpan command.

$ cpan
cpan> install DBD::ODBC

If everything is good, it should install successfully. Some of the tests will probably fail or be skipped, but don't worry about them as long as the module gets installed.

A perl script can then be written using the DBI functions. When connecting to SQL Server, use the following:

$mssql = DBI->connect('dbi:ODBC:DSN=dsnName;UID=sqlserver-login;PWD=password');

where dsnName is one of the names in a pair of square brackets defined in odbc.ini. An entry in odbc.ini includes the server name and a driver name; that driver name must be defined in odbcinst.ini.
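
Putting it all together, a minimal test script might look like this (the DSN name, credentials, and query are placeholders):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use DBI;

  # connect through DBD::ODBC/unixODBC using a DSN defined in odbc.ini
  my $mssql = DBI->connect('dbi:ODBC:DSN=dsnName;UID=sqlserver-login;PWD=password',
                           undef, undef, { RaiseError => 1 });

  # run a trivial query to prove the whole chain works
  my $sth = $mssql->prepare('SELECT @@VERSION');
  $sth->execute();
  while (my @row = $sth->fetchrow_array()) {
      print "$row[0]\n";
  }
  $sth->finish();
  $mssql->disconnect();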

Wednesday, May 11, 2016

Excel: Importing CSV Files Created in Different Locale Settings

When you work with a locale that is different from yours, something as simple as importing CSV files can become a major problem. For example, in the US you would use the comma as the field separator and the period as the decimal separator. But this is not universal: in much of Europe, the comma is the decimal separator, the semicolon is the field separator, and the period is the thousands separator. You could replace these characters yourself to match your locale, but Excel can do the conversion for you with a simple macro.

There are two settings:
     Application.DecimalSeparator
     Application.UseSystemSeparators

Importing CSV files can be adjusted based on these two settings. Here's the general idea to follow:

  1. Save the current settings to variables
  2. Ask what DecimalSeparator ought to be, or determine it from the input file
  3. Set UseSystemSeparators to False after changing DecimalSeparator
  4. Import the CSV file using the ActiveSheet.QueryTables.Add function while setting .TextFileTabDelimiter, .TextFileSemicolonDelimiter, .TextFileCommaDelimiter, .TextFileSpaceDelimiter, and/or .TextFileOtherDelimiter. If the delimiter is not Tab, Semicolon, Comma, or Space, set .TextFileOtherDelimiter to the separator character
  5. Restore DecimalSeparator to its original value and set UseSystemSeparators back to True

Excel should then magically convert commas to periods and periods to commas, e.g. 2.450,23 becomes 2,450.23.
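
A minimal VBA sketch of these steps (the file path, target range, and the assumption that the file uses comma decimals and semicolon delimiters are all placeholders):

  Sub ImportEuropeanCsv()
      Dim savedSep As String
      Dim savedUse As Boolean

      ' 1. save the current settings
      savedSep = Application.DecimalSeparator
      savedUse = Application.UseSystemSeparators

      ' 2./3. use the file's decimal separator, then turn off the system separators
      Application.DecimalSeparator = ","
      Application.UseSystemSeparators = False

      ' 4. import the semicolon-delimited file
      With ActiveSheet.QueryTables.Add( _
              Connection:="TEXT;C:\data\report.csv", _
              Destination:=ActiveSheet.Range("A1"))
          .TextFileParseType = xlDelimited
          .TextFileSemicolonDelimiter = True
          .Refresh BackgroundQuery:=False
      End With

      ' 5. restore the original settings
      Application.DecimalSeparator = savedSep
      Application.UseSystemSeparators = savedUse
  End Sub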

Tuesday, May 10, 2016

VBA: Recognizing Unix-style Files

I was trying to figure out a good way to determine whether a file being read uses a CR/LF line terminator (Windows/DOS) or LF (Unix/Linux). Since VBA's Line Input assumes the file is in the Windows format (CR/LF line terminator), it reads in the entire file when the file is in the Unix format. Here's a function that determines how many columns are in a CSV file regardless of the format (except for the Mac format, which terminates each line with CR only):
  ' ----------------------------------------------------------------------------
  ' get the number of columns by reading the 1st line which should be the header
  '
  Private Function GetNumCols(pFile As String) As Integer
      Dim data As String
      Dim temp As Variant

      Open pFile For Input As #1
      Line Input #1, data
      Close #1
      ' play the following trick in case the CSV is in the Unix format
      ' algorithm:
      ' 1. read one line. If it is a Unix file, this reads the entire file
      ' 2. split the line with the linefeed char
      ' 3. if the line contains any linefeed chars, it's Unix
      ' 4. if Unix, split the first element of the array that was split by
      '    linefeed by ","
      ' 5. if Windows, split the input line by ","
      temp = Split(data, vbLf)
      If UBound(temp) > 0 Then
          temp = Split(temp(0), ",")
      Else
          temp = Split(data, ",")
      End If
      GetNumCols = UBound(temp) + 1
  End Function

SilkPerformer: Comments in Data Files

Often I have a need to include or exclude certain lines in the input data files. Since SilkPerformer does not support comments in data files, I use the following pattern:
  // get the login info
  FileCSVLoadGlobal(ghFileLogin, gcsFileLogin);
  loop
    FileGetNextUniqueRow(ghFileLogin);
    sUsername := FileGetCol(ghFileLogin, 1, STRING_COMPLETE);
    sPassword := FileGetCol(ghFileLogin, 2, STRING_COMPLETE);
    if Substr(sUsername, sTemp, 1, 1) <> "#" then exit end;
  end;
This way, any line that starts with a hash sign (or pound sign) can safely be ignored.

Oracle: Logging stored procedure messages

For simple cases, DBMS_OUTPUT.PUT_LINE can be used to log stored procedure messages. However, we all know that it's not useful when the stored procedures are called from applications, since there is no easy way to get at the server output. Furthermore, not "draining" the server output will cause the connection to generate errors once the output buffer is full.

One option is to use UTL_FILE so that all logging messages are written to a file. Since UTL_FILE writes in a buffered mode, when two or more stored procedures are writing to the same file the output can get scrambled. To avoid that, FFLUSH can be invoked after each PUT_LINE. One drawback of UTL_FILE is that its logging cannot be interleaved with the logging of the application that invokes the stored procedures. Since the application framework includes LOG4J, it is perhaps possible to create Java stored procedures that log via LOG4J.
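
A minimal sketch of the UTL_FILE approach (the LOG_DIR directory object and the log file name are assumptions):

  -- helper that appends one line to a shared log file, flushing immediately
  CREATE OR REPLACE PROCEDURE log_msg(p_msg IN VARCHAR2) IS
      f UTL_FILE.FILE_TYPE;
  BEGIN
      f := UTL_FILE.FOPEN('LOG_DIR', 'procs.log', 'a');
      UTL_FILE.PUT_LINE(f, TO_CHAR(SYSDATE, 'YYYY-MM-DD HH24:MI:SS') || ' ' || p_msg);
      UTL_FILE.FFLUSH(f);   -- flush right away so concurrent writers don't scramble the output
      UTL_FILE.FCLOSE(f);
  END;
  /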

SilkPerformer: Capturing Browser Page with RaiseError

It's convenient to use the RaiseError() function to terminate a transaction. But SilkPerformer does not capture the browser screen when RaiseError() is used, even when TrueLog is set to capture on error. To overcome this problem, you have to insert a fake click just before calling RaiseError() to force SilkPerformer to capture the screen into the TrueLog. For example, you can call BrowserClick() on a DOM object that does not exist. Such a click will fail, and SilkPerformer will take a screen shot for the TrueLog so that you can see what happened.

  function MyBrowserFind(
    isXPath   : string;
    inTimeout : number optional
  ) : number
  var
    nRetval   : number;
    nTimeout  : number;
  begin
    if inTimeout = 0 then
      nTimeout := 10000
    else
      nTimeout := inTimeout
    end;
    nRetval := BrowserFind(HANDLE_DESKTOP, isXPath, false, nTimeout, true);
    if (nRetval = HANDLE_INVALID) then
      // this fake click is here to cause TrueLog
      BrowserClick("//IMG[@id='ZZraiseErrorClickZZ']", BUTTON_Left);
      RaiseError(CUSTOM_ERROR,
            GetTransactionName() + ": Failed to find {" + isXPath + "}",
            SEVERITY_TRANS_EXIT)
    end;
    MyBrowserFind := nRetval;
  end MyBrowserFind;

Excel VBA: Convert R1C1 to A1

It's very convenient to use numbers for rows and columns when dealing with Excel cells in VBA. But there doesn't appear to be a built-in way to convert a numerical row/column pair (R1C1) to an alphanumeric cell address (A1). Here's one way to do it:

  Private Function ConvR1C1ToAddr(pRow, pCol) As String
      ConvR1C1ToAddr = Range(Cells(pRow, pCol), Cells(pRow, pCol)).Address
  End Function

Since the column names can go from "A" to something like "CXZ", it's hard to build the letters yourself without a lot of code.

For going the other way, simply use Range("A1").Row and Range("A1").Column.
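
For example, ConvR1C1ToAddr(3, 28) returns "$AB$3", and going back, Range("AB3").Row and Range("AB3").Column return 3 and 28.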

Monday, May 9, 2016

SilkPerformer: String Trim Functions

SilkPerformer does not come with string trim functions. There are a few suggestions out there on how one can write such functions. Here are my versions:

  // -------------------------------------------------------------------------
  function CSCStrLTrim(
    psStr : string
  ) : string
  var
    sTarget  : string;
    sTrimmed : string;
    nLen     : number;
  begin
    if StrRegexImmediate(psStr, "^[ \t\n\r\f]+", sTarget) then
      nLen := Strlen(sTarget);
      if nLen > 0 then
        CSCStrLTrim := Substr(psStr, sTrimmed, nLen + 1, Strlen(psStr) - nLen)
      else
        CSCStrLTrim := psStr
      end;
    else
      CSCStrLTrim := psStr
    end;
  end CSCStrLTrim;

  // -------------------------------------------------------------------------
  function CSCStrRTrim(
    psStr : string
  ) : string
  var
    sTarget  : string;
    sTrimmed : string;
    nLen     : number;
  begin
    if StrRegexImmediate(psStr, "[ \t\n\r\f]+$", sTarget) then
      nLen := Strlen(sTarget);
      if nLen > 0 then
        CSCStrRTrim := Substr(psStr, sTrimmed, 1, Strlen(psStr) - nLen)
      else
        CSCStrRTrim := psStr
      end;
    else
      CSCStrRTrim := psStr
    end;
  end CSCStrRTrim;

The functions use the regular expression parser.
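
A full trim can then be composed from the two, for example (the function name CSCStrTrim is mine):

  // -------------------------------------------------------------------------
  function CSCStrTrim(
    psStr : string
  ) : string
  begin
    CSCStrTrim := CSCStrRTrim(CSCStrLTrim(psStr));
  end CSCStrTrim;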