
Examples of delimiter management with INPUT and INFILE

The script presents 7 use cases for importing text data: standard CSV reading with the DSD option, using the tilde (~) modifier for advanced quote handling, explicit length definition, using multiple delimiters simultaneously, defining the delimiter via a variable, dynamically reading the delimiter from the data itself, and using a complete character string as a separator (DLMSTR).
Data Analysis

Type: CREATION_INTERNE


Data is included directly in the code via DATALINES and CARDS statements (CARDS is simply an alias of DATALINES).

Code Block 1 (DATA STEP)
Explanation:
Reading a standard CSV format. The `dlm=','` option specifies the comma as the separator, and `dsd` (delimiter-sensitive data) strips the quotation marks from character values and treats two consecutive delimiters as a missing value.
title '1.3.3a Delimited List Input Modifiers';
DATA base;
LENGTH lname $15;
INFILE DATALINES dlm=',' dsd;
INPUT fname $ lname $ dob :mmddyy10.;
DATALINES;
'Sam','Johnson',12/15/1945
'Susan','Mc Callister',10/10/1983
RUN;
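As a quick check, a minimal PROC PRINT sketch of the resulting `base` dataset (the FORMAT statement is only added here so that `dob` displays as a calendar date rather than a raw SAS date value):

proc print data=base;
   format dob mmddyy10.;
run;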
Code Block 2 (DATA STEP)
Explanation:
Using the `~` (tilde) modifier in the INPUT statement. Combined with the DSD option, it lets a quoted field containing the delimiter (the comma in 'Fresno, CA') be read as a single value, and the surrounding quotation marks are retained as part of the stored value.
title2 'Using the ~ Format Modifier';
DATA base;
LENGTH lname $15;
INFILE DATALINES dlm=',' dsd;
INPUT fname $ lname $ birthloc $~15. dob :mmddyy10. ;
DATALINES;
'Sam','Johnson', 'Fresno, CA','12/15/1945'
'Susan','Mc Callister','Seattle, WA',10/10/1983
RUN;
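To make the effect of the `~` modifier visible, a small inspection sketch; with `~` and DSD, the surrounding quotation marks are expected to remain part of the stored `birthloc` value:

proc print data=base;
   var fname lname birthloc dob;
   format dob mmddyy10.;
run;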
Code Block 3 (DATA STEP)
Explanation:
Reading where the lengths of the character variables are predefined by the `LENGTH` statement, giving precise control over storage without needing a modifier or informat in the INPUT statement.
title '1.3.3c Delimited List Input Modifiers';
/* ... */
DATA base;
LENGTH lname birthloc $15;
INFILE DATALINES dlm=',' dsd;
INPUT fname $ lname $ birthloc $ dob :mmddyy10. ;
DATALINES;
/* Data ... */
RUN;
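A quick way to confirm the storage lengths set by the LENGTH statement is a PROC CONTENTS sketch on the resulting dataset:

proc contents data=base;
run;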
Code Block 4 (DATA STEP)
Explanation:
Defining multiple possible delimiters for the same file. Here `dlm='/,'` indicates that either the slash OR the comma can serve as a separator.
title '1.3.3d Read delimited code with multiple delimiters';
DATA imports;
INFILE CARDS dlm='/,';
INPUT id importcode $ value;
CARDS;
14,1,13
25/Q9,15
6,D/20
RUN;
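The expected split of each record, given that either '/' or ',' acts as a delimiter, can be checked with a simple PROC PRINT (the row-by-row breakdown in the comments is illustrative):

/* 14,1,13   -> id=14  importcode=1   value=13 */
/* 25/Q9,15  -> id=25  importcode=Q9  value=15 */
/* 6,D/20    -> id=6   importcode=D   value=20 */
proc print data=imports;
run;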
Code Block 5 (DATA STEP)
Explanation:
Using a variable (`dlmvar`) to pass delimiters to the `dlm=` option of the INFILE statement, offering more flexibility (e.g., dynamic parameterization).
title '1.3.3e Read delimited code with multiple delimiters';
title2 'Using a variable to specify the delimiter';
DATA imports;
retain dlmvar '/,';
INFILE CARDS dlm=dlmvar;
INPUT id importcode $ value;
CARDS;
/* Data */
RUN;
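As a variation on the same idea (not part of the original example), the delimiter list could also be supplied through a macro variable; this sketch reuses the sample records from code block 4, and the macro variable name `dlmlist` is purely illustrative:

%let dlmlist=/,;                 /* hypothetical macro variable holding the delimiter list */
DATA imports;
INFILE CARDS dlm="&dlmlist";     /* the macro variable must resolve inside double quotation marks */
INPUT id importcode $ value;
CARDS;
14,1,13
25/Q9,15
6,D/20
RUN;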
Code Block 6 (DATA STEP)
Explanation:
Advanced technique: the first INPUT reads the first character of the line (`input dlmvar $1. @`) to capture the delimiter used on that line, and the trailing `@` holds the record; a second INFILE statement (`infile cards dlm=dlmvar`) then applies this delimiter so that the following INPUT reads the rest of the same line with it.
title '1.3.3f Read delimited code with multiple delimiters';
title2 'Reading the delimiter during execution';
DATA imports;
INFILE CARDS;
INPUT dlmvar $1. @;
INFILE CARDS dlm=dlmvar;
INPUT id importcode $ value;
CARDS;
,14,1,13
/25/Q9/15
~6~D~20
RUN;
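Given the three sample records, the per-line delimiters and the expected parsing are sketched below; the trailing `@` in the first INPUT is what keeps each line available for the second INPUT:

/* ,14,1,13   -> dlmvar=','  then id=14  importcode=1   value=13 */
/* /25/Q9/15  -> dlmvar='/'  then id=25  importcode=Q9  value=15 */
/* ~6~D~20    -> dlmvar='~'  then id=6   importcode=D   value=20 */
proc print data=imports;
run;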
Code Block 7 (DATA STEP)
Explanation:
Using the `DLMSTR=` (delimiter string) option, available since SAS 9.2, which defines a complete character string (here ',,/') as a single separator, whereas `DLM=` treats each character of its value as an individual delimiter.
title '1.3.3g Use a delimiter string';
DATA imports;
INFILE CARDS dlmstr=',,/';
INPUT id importcode $ value;
CARDS;
14,,/1/,,/13
/* ... */
RUN;
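For contrast, a minimal sketch of what `DLM=` would do with the same three-character string: each of ',' and '/' is then treated as an individual delimiter rather than requiring the full ',,/' sequence (the dataset name `imports_dlm` is only used here for the comparison):

DATA imports_dlm;
INFILE CARDS dlm=',,/';          /* ',' and '/' each act as separate delimiters here */
INPUT id importcode $ value;
CARDS;
14,,/1/,,/13
RUN;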
This material is provided "as is" by We Are Cas. There are no warranties, expressed or implied, as to merchantability or fitness for a particular purpose regarding the materials or code contained herein. We Are Cas is not responsible for errors in this material as it now exists or will exist, nor does We Are Cas provide technical support for it.