When loading data into CAS (SAS Cloud Analytic Services) through an ODBC interface, particularly from databases such as MySQL, you may encounter a blocking error related to the handling of variable-length data buffers.
This article analyzes the cause of this failure and the procedure to follow to correct it.
The typical scenario is as follows: you use PROC CASUTIL to load a table from an external source defined via an ODBC DSN (in odbc.ini).
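For reference, a minimal odbc.ini entry for such a MySQL DSN might look like the sketch below; the DSN name, driver path, and connection details are illustrative assumptions, not values from the affected environment:

```ini
; Hypothetical MySQL DSN -- adjust the driver path and connection
; details to match your unixODBC installation and database server.
[mydsn]
Driver   = /usr/lib64/libmyodbc8w.so
Server   = mysql.example.com
Port     = 3306
Database = sales
```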
Your code looks like this:
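A typical sketch of such a load, assuming a DSN named mydsn and a source table named orders (both hypothetical names for illustration):

```sas
/* Hypothetical CASLIB pointing at the ODBC DSN defined in odbc.ini */
caslib myodbc datasource=(srctype="odbc", odbc_dsn="mydsn");

/* Load the source table into CAS */
proc casutil;
   load casdata="orders" incaslib="myodbc"
        casout="orders" outcaslib="casuser";
quit;
```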
Instead of loading the table, execution stops and the log reports an error indicating that a data buffer has been exceeded.
You might suspect a configuration problem with the SAS/ACCESS Interface to ODBC or an error in the odbc.ini file, but the cause lies deeper.
This error is the result of an identified bug in the CAS engine's handling of certain mixed data types during ingestion.
The problem occurs specifically when the structure of your source table combines two precise characteristics:
1. The presence of large object columns of type BLOB (Binary Large Object) or CLOB (Character Large Object).
2. The presence, in the same row, of an empty or NULL value in a column of type VARCHAR or VARBINARY.
It is this specific combination (large object + NULL VARCHAR) that causes the allocated buffer to be exceeded.
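As an illustration, a MySQL table with the following structure can trigger the error as soon as a row combines a populated large object with a NULL VARCHAR; the table and column names are hypothetical:

```sql
CREATE TABLE orders (
    id        INT PRIMARY KEY,
    document  BLOB,           -- large object column
    comment   VARCHAR(200)    -- nullable varying-length column
);

-- A row like this one combines a large object with a NULL VARCHAR:
-- the exact pattern that overflows the buffer during CAS ingestion.
INSERT INTO orders VALUES (1, x'DEADBEEF', NULL);
```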
Unfortunately, there is no workaround at the SAS code or ODBC configuration level that avoids this error if your data meets the criteria above.
Resolution requires a software update (hot fix).
Resolution Procedure
Identify the fix: The fix is delivered in an updated version of the sas-tkcdfts.x86_64 system package.
Contact support: Open a ticket with SAS Technical Support to report the problem. This ensures you are officially notified when the fix becomes available for your specific version, and associates your environment with the identified software defect.
Apply the update: Once the fix is available, installing the updated version of the package mentioned above permanently resolves the buffer error.