SQL Macro using UTL_FILE (Oracle 19.0.0.0.0 Windows 64bit)
SQL Macro using UTL_FILE [message #690053] Thu, 26 September 2024 09:37
Duane
Messages: 562
Registered: December 2002
Senior Member
I have this SQL Macro that creates CSV output but I would like to have the Macro also write those results to a file on the OS system.

The problem is that the results need to be transformed before they are written to the file.

Any suggestions?



Desired Output written to the file:  


"ID","NAME"
1,"AMSTERDAM"
2,"UTRECHT"
3,"THE HAGUE"
4,"ROTTERDAM"



Test Case:


CREATE TABLE CITIES
(
  ID    NUMBER(5),
  NAME  VARCHAR2(30 CHAR)
);

SET DEFINE OFF;
Insert into CITIES
   (ID, NAME)
 Values
   (1, 'AMSTERDAM');
Insert into CITIES
   (ID, NAME)
 Values
   (2, 'UTRECHT');
Insert into CITIES
   (ID, NAME)
 Values
   (3, 'THE HAGUE');
Insert into CITIES
   (ID, NAME)
 Values
   (4, 'ROTTERDAM');
COMMIT;


SQL Macro


CREATE OR REPLACE function CSVDataFile (InputData     in dbms_tf.table_t,
                                        DirectoryName in varchar2 default 'DIR1',
                                        FileName      in varchar2 default null,
                                        CreateFile    in varchar2 default 'Y',
                                        Delimiter     in varchar2 default ',',
                                        Quote         in varchar2 default '"',
                                        Escape        in varchar2 default '\') return clob sql_macro as
     
  pragma          autonomous_transaction;
  
  ChunkSize       pls_integer  default 3000;
  DelimiterRecord varchar2(24) default '||Delimiter||';
  ColumnName      varchar2(200);
  ColumnRecord    varchar2(300);
  HeaderRecord    clob;
  DataRecord      clob;
  SQLStatement    clob;
  
  CSVFile         utl_file.file_type;
  
  
  begin
    for i in 1..InputData.column.count 
      loop
        ColumnName := InputData.column(i).description.name;
        
        case
          when InputData.column(i).description.type in (dbms_tf.type_varchar2, 
                                                        dbms_tf.type_char, 
                                                        dbms_tf.type_clob)
            then
              if Quote is not null
                then
                  ColumnRecord := 'replace('||ColumnName||','''||Quote||''','''||Escape||Quote||''')';
                  ColumnRecord := ''''||Quote||'''||'||ColumnRecord||'||'''||Quote||'''';
              
                else
                  ColumnRecord := ColumnName;
              end if;
              
          when InputData.column(i).description.type = dbms_tf.type_number
            then
              ColumnRecord := 'to_char('||ColumnName||')';
              
          when InputData.column(i).description.type = dbms_tf.type_date 
            then
              ColumnRecord := 'to_char('||ColumnName||',''YYYY-MM-DD'')';
              
          when InputData.column(i).description.type = dbms_tf.type_timestamp
            then
              ColumnRecord := 'to_char('||ColumnName||',''YYYY-MM-DD"T"HH24:MI:SS.FF6'')';
              
            else
              null;
        end case;
   
        HeaderRecord := HeaderRecord||'||Delimiter||'||''''||ColumnName||'''';
        DataRecord := DataRecord||DelimiterRecord||ColumnRecord;
   
      end loop;
    
    HeaderRecord := substr(HeaderRecord, length('||Delimiter||') + 1);
    DataRecord := substr(DataRecord, length(DelimiterRecord) + 1);
    
    SQLStatement := q'[
                       select @COLUMN_HEADER@ as csv_row from dual
                       union all
                       select @DATA_RECORD@ as csv_row from CSVData.InputData
                      ]';
   
    SQLStatement := replace(replace(SQLStatement, '@DATA_RECORD@', DataRecord), '@COLUMN_HEADER@', HeaderRecord);
    
    if CreateFile = 'Y'
     then
       CSVFile := utl_file.fopen(nvl(DirectoryName, 'DIR1'), nvl(FileName, 'csv_file_'||to_char(sysdate, 'yyyymmddhh24miss')||'.txt'), 'wb', max_linesize => 32767);

       for i in 1..ceil(length(HeaderRecord)/ChunkSize)
         loop
           utl_file.put_raw(CSVFile, utl_raw.cast_to_raw(substr(HeaderRecord, (i - 1) * ChunkSize + 1, ChunkSize)));
           utl_file.fflush(CSVFile);
         end loop;
         
       ChunkSize := 3000;
       
       for i in 1..ceil(length(DataRecord)/ChunkSize)
         loop
           utl_file.put_raw(CSVFile, utl_raw.cast_to_raw(substr(DataRecord, (i - 1) * ChunkSize + 1, ChunkSize)));
           utl_file.fflush(CSVFile);
         end loop;

       utl_file.fclose(CSVFile);
    end if;

    return SQLStatement;
  
  end;
/


SQL Statement


with Cities1 as (select *
                 from cities)
                 
  select * from dnettles10.CSVDataFile(InputData => Cities1);


File Contents


'"ID"'||Delimiter||'"NAME"'to_char("ID")||Delimiter||'"'||replace("NAME",'"','\"')||'"'

Re: SQL Macro using UTL_FILE [message #690054 is a reply to message #690053] Fri, 27 September 2024 16:40
Solomon Yakobson
Messages: 3285
Registered: January 2010
Location: Connecticut, USA
Senior Member
Very bad idea. And not really possible without exploiting loopholes that can be closed in newer releases. A SQL macro doesn't execute the SQL statement; it just produces the SQL statement text to be executed. So obviously it is text:

'"ID"'||Delimiter||'"NAME"'to_char("ID")||Delimiter||'"'||replace("NAME",'"','\"')||'"'
that is written to the file. So the SQL macro would need to produce the SQL text and then execute it. The issue is that the SQL macro creates the SQL text with placeholders. If you add:

dbms_output.put_line(SQLStatement);
You'll see:

select '"ID"'||Delimiter||'"NAME"' as csv_row from dual
union all
select
to_char("ID")||Delimiter||'"'||replace("NAME",'"','\"')||'"' as csv_row from
CSVData.InputData
And then Delimiter is replaced with the parameter Delimiter value, and CSVData.InputData with the table name. And here is a very nasty thing about SQL macros - string parameter values are not available outside the RETURN statement. We will get NULL, not a comma, if we reference the parameter Delimiter. Fortunately (and this can change in future releases) UDT values are not masked, so we can change the Delimiter type to sys.OdciVarchar2List and add a parameter TblName of the same sys.OdciVarchar2List type. Now we can construct the SQL without placeholders:

CREATE OR REPLACE function CSVDataFile (
                                        TblName       in sys.OdciVarchar2List,
                                        InputData     in dbms_tf.table_t,
                                        DirectoryName in varchar2 default 'TEMP',
                                        FileName      in varchar2 default null,
                                        CreateFile    in varchar2 default 'Y',
                                        Delimiter     in sys.OdciVarchar2List default sys.OdciVarchar2List(','),
                                        Quote         in varchar2 default '"',
                                        Escape        in varchar2 default '\'
                                      ) return clob sql_macro as
     
  pragma          autonomous_transaction;
  
  ChunkSize       pls_integer  default 3000;
  DelimiterRecord varchar2(24) default '||''' || Delimiter(1) || '''||';
  ColumnName      varchar2(200);
  ColumnRecord    varchar2(300);
  HeaderRecord    clob;
  DataRecord      clob;
  SQLStatement    clob;
  v_cur           sys_refcursor;
  v_rec           clob;
  
  CSVFile         utl_file.file_type;
  
  
  begin
    for i in 1..InputData.column.count 
      loop
        ColumnName := InputData.column(i).description.name;
        
        case
          when InputData.column(i).description.type in (dbms_tf.type_varchar2, 
                                                        dbms_tf.type_char, 
                                                        dbms_tf.type_clob)
            then
              if Quote is not null
                then
                  ColumnRecord := 'replace('||ColumnName||','''||Quote||''','''||Escape||Quote||''')';
                  ColumnRecord := ''''||Quote||'''||'||ColumnRecord||'||'''||Quote||'''';
              
                else
                  ColumnRecord := ColumnName;
              end if;
              
          when InputData.column(i).description.type = dbms_tf.type_number
            then
              ColumnRecord := 'to_char('||ColumnName||')';
              
          when InputData.column(i).description.type = dbms_tf.type_date 
            then
              ColumnRecord := 'to_char('||ColumnName||',''YYYY-MM-DD'')';
              
          when InputData.column(i).description.type = dbms_tf.type_timestamp
            then
              ColumnRecord := 'to_char('||ColumnName||',''YYYY-MM-DD"T"HH24:MI:SS.FF6'')';
              
            else
              null;
        end case;
   
        HeaderRecord := HeaderRecord||DelimiterRecord||''''||ColumnName||'''';
        DataRecord := DataRecord||DelimiterRecord||ColumnRecord;
   
      end loop;
    
    HeaderRecord := substr(HeaderRecord,length(DelimiterRecord) + 1);
    DataRecord := substr(DataRecord,length(DelimiterRecord) + 1);
    
    SQLStatement := '
                      select @COLUMN_HEADER@ as csv_row from dual
                     union all
                      select @DATA_RECORD@ as csv_row from ' || TblName(1);
   
    SQLStatement := replace(replace(SQLStatement, '@DATA_RECORD@', DataRecord), '@COLUMN_HEADER@', HeaderRecord);

    if CreateFile = 'Y'
     then
       CSVFile := utl_file.fopen(nvl(DirectoryName, 'TEMP'), nvl(FileName, 'csv_file_'||to_char(sysdate, 'yyyymmddhh24miss')||'.txt'), 'wb', max_linesize => 32767);
       open v_cur
         for SQLStatement;
       loop
         fetch v_cur
           into v_rec;
         exit when v_cur%notfound;
         for i in 1..ceil(length(v_rec)/ChunkSize)
         loop
           utl_file.put_raw(CSVFile, utl_raw.cast_to_raw(substr(v_rec, (i - 1) * ChunkSize + 1, ChunkSize)));
           utl_file.fflush(CSVFile);
         end loop;
         utl_file.put_raw(CSVFile, utl_raw.cast_to_raw(chr(10)));
         utl_file.fflush(CSVFile);

       end loop;
       utl_file.fflush(CSVFile);
       utl_file.fclose(CSVFile);
    end if;
    return SQLStatement;
  
  end;
/

Function created.

SQL> set serveroutput on
SQL> host dir c:\temp\csv_file*.txt
 Volume in drive C is OSDisk
 Volume Serial Number is 08EC-009C

 Directory of c:\temp

File Not Found

SQL> select * from CSVDataFile(TblName => sys.OdciVarchar2List('CITIES'),InputData => CITIES)
  2  /

CSV_ROW
--------------------------------------------------------------------------------
"ID","NAME"
1,"AMSTERDAM"
2,"UTRECHT"
3,"THE HAGUE"
4,"ROTTERDAM"

SQL> host dir c:\temp\csv_file*.txt
 Volume in drive C is OSDisk
 Volume Serial Number is 08EC-009C

 Directory of c:\temp

09/27/2024  05:32 PM                66 csv_file_20240927173233.txt
               1 File(s)             66 bytes
               0 Dir(s)  24,827,981,824 bytes free

SQL> host type c:\temp\csv_file*.txt

c:\temp\csv_file_20240927173233.txt


"ID","NAME"
1,"AMSTERDAM"
2,"UTRECHT"
3,"THE HAGUE"
4,"ROTTERDAM"

SQL>
But now comes the tricky part. Each time we issue a SQL statement, Oracle checks if it is already in the shared pool, where it has already been hard parsed. Therefore, if we issue the above select a second time, Oracle will not call the SQL macro, and therefore no file will be created:

SQL> host del c:\temp\csv_file*.txt

SQL> host dir c:\temp\csv_file*.txt
 Volume in drive C is OSDisk
 Volume Serial Number is 08EC-009C

 Directory of c:\temp

File Not Found

SQL> select * from CSVDataFile(TblName => sys.OdciVarchar2List('CITIES'),InputData => CITIES)
  2  /

CSV_ROW
--------------------------------------------------------------------------------
"ID","NAME"
1,"AMSTERDAM"
2,"UTRECHT"
3,"THE HAGUE"
4,"ROTTERDAM"

SQL> host dir c:\temp\csv_file*.txt
 Volume in drive C is OSDisk
 Volume Serial Number is 08EC-009C

 Directory of c:\temp

File Not Found

SQL>
That is why I said it is a very bad idea. We would have to issue a SQL statement that differs by at least one character each time it calls the SQL macro, to force Oracle to hard parse (and therefore execute the macro) each time. For example, I will add a space before 'CITIES':

SQL> select * from CSVDataFile(TblName => sys.OdciVarchar2List( 'CITIES'),InputData => CITIES)
  2  /

CSV_ROW
--------------------------------------------------------------------------------
"ID","NAME"
1,"AMSTERDAM"
2,"UTRECHT"
3,"THE HAGUE"
4,"ROTTERDAM"

SQL> host dir c:\temp\csv_file*.txt
 Volume in drive C is OSDisk
 Volume Serial Number is 08EC-009C

 Directory of c:\temp

09/27/2024  05:38 PM                66 csv_file_20240927173856.txt
               1 File(s)             66 bytes
               0 Dir(s)  24,835,670,016 bytes free

SQL>
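Following the same reasoning (only the literal statement text matters for shared-pool matching), another way to perturb the text is a per-run comment. This is just a sketch of the idea described above, not something tested in this thread; the comment value is arbitrary:

```sql
-- Sketch: embed a unique comment (here a timestamp) so the statement text
-- never matches a cursor already in the shared pool. The changed text forces
-- a hard parse, which in turn forces the SQL macro to run and write the file.
select /* run 20240927173900 */ *
from CSVDataFile(TblName   => sys.OdciVarchar2List('CITIES'),
                 InputData => CITIES);
```

Any one-character change has the same effect; the comment merely makes the change systematic.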
SY.

[Updated on: Fri, 27 September 2024 16:42]


Re: SQL Macro using UTL_FILE [message #690055 is a reply to message #690054] Fri, 27 September 2024 17:24
Duane
Solomon Yakobson wrote on Fri, 27 September 2024 21:40


And then Delimiter is replaced with the parameter Delimiter value, and CSVData.InputData with the table name. And here is a very nasty thing about SQL macros - string parameter values are not available outside the RETURN statement. We will get NULL, not a comma, if we reference the parameter Delimiter. Fortunately (and this can change in future releases) UDT values are not masked, so we can change the Delimiter type to sys.OdciVarchar2List and add a parameter TblName of the same sys.OdciVarchar2List type. Now we can construct the SQL without placeholders:

I did print out the SQLStatement text but didn't know how to transform the values.  You resolved that part.


Solomon Yakobson wrote on Fri, 27 September 2024 21:40


But now comes the tricky part. Each time we issue a SQL statement, Oracle checks if it is already in the shared pool, where it has already been hard parsed. Therefore, if we issue the above select a second time, Oracle will not call the SQL macro, and therefore no file will be created:

I saw that while testing.  I had noticed that a file would be created, but running the statement again didn't create another file.  I would have to change something in the function and recompile for a new file to be created.  That explains it now.

Anyway, I still think it's a cool idea, but it doesn't sound practical if you have to change something in the query statement each time.  That won't work.

Thanks again.
Re: SQL Macro using UTL_FILE [message #690056 is a reply to message #690055] Fri, 27 September 2024 19:30
Duane
Just out of curiosity, is there no way to make it work with "CSVData.InputData"?  Will it only work with a named table if the results are to be written to a file?

select @DATA_RECORD@ as csv_row from CSVData.InputData;