Code Project

Monday, August 26, 2013

Variable Scope in T-SQL & PL/SQL

From a general programming point of view, a variable declared in the innermost block should not be available to outer blocks. Let's see how this is handled in T-SQL and PL/SQL.

Scope of T-SQL Variables
begin
    declare @a int -- variable declared in the outer begin/end
    set @a = 20
    select @a
    begin
        declare @b int -- @b declared in the inner begin/end
        set @b = 30
        select @b
    end
    select @a, @b -- @b is still accessible here
end


SQL Server scopes local variables per batch; BEGIN/END blocks have no effect on scope in this example.

That is, the variable is *not* scoped to the BEGIN/END block. It is declared once in the batch and is therefore accessible anywhere in that batch.

MSDN (http://msdn.microsoft.com/en-us/library/ms188927.aspx) says "The scope of a local variable is the batch in which it is declared."
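
A quick way to see the batch boundary is the GO batch separator (a minimal sketch for SQL Server Management Studio or sqlcmd, where GO ends the batch):

declare @a int
set @a = 20
select @a   -- works: @a is in scope for the rest of this batch
go          -- ends the batch, and with it @a's scope
select @a   -- fails: Msg 137, Must declare the scalar variable "@a"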

Scope of PL/SQL Variables

PL/SQL allows the nesting of blocks within blocks, i.e., the execution section of an outer block can contain inner blocks. A variable that is accessible to an outer block is therefore also accessible to all of its nested inner blocks, while variables declared in an inner block are not accessible to the outer blocks. Based on where they are declared, we can classify variables into two types.

Local variables - declared in an inner block and cannot be referenced by outer blocks.
Global variables - declared in an outer block and can be referenced both by that block itself and by its inner blocks.

Declare
      pa number; /* variable declared in the outer block */
      pc number; /* also declared in the outer block */
Begin
      pa := 20;

      declare
          pb number; /* variable declared in the inner block */
      begin
          pb := 30;
      end;

      pc := pa + pb; /* pa is accessible here, while pb is not */

end;
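
If you compile the block above, Oracle rejects the last assignment with PLS-00201: identifier 'PB' must be declared, confirming that pb's scope ends with the inner block while pa and pc remain visible throughout.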


Hope it helps.

Tuesday, August 20, 2013

provider: TCP Provider, error:

Error:
A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)

General Explanation:
This error shows up when a connection drawn from the connection pool has already lost its link to the server.
A pooled connection has no way of knowing that the underlying connection has been severed until it is used.

Reasons:
There are a few common causes:
1. The server has been restarted, which closes all existing connections.
2. Someone or something has killed the SPID the connection was using; a dropped connection or a deadlock in which the session was chosen as the victim are typical examples.

Probable Solutions:
#1 Turn off connection pooling (not recommended)
Because of reason 1, we tried turning off connection pooling at the application level, but even that did not help.
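
For reference, pooling for an ADO.NET SqlClient connection is disabled with the Pooling keyword in the connection string; a minimal sketch of the change (the server, database and connection name below are placeholders, not our actual values):

<connectionStrings>
  <!-- Pooling=false turns off the connection pool for this connection string -->
  <add name="AppDb"
       connectionString="Data Source=MyServer;Initial Catalog=MyDatabase;Integrated Security=True;Pooling=false"
       providerName="System.Data.SqlClient" />
</connectionStrings>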

#2 TCP Chimney Offload
The error may also be caused by other, non SQL Server related, reasons.
For example, I have seen cases where the TCP Chimney Offload feature being turned ON on the server machine caused it. Short info about this feature:
When TCP Chimney Offload is enabled and the NIC implements what is called the TCP Offload Engine, some of the TCP processing is handed over to the hardware, i.e. the NIC.
This offloads the CPU, and since TCP can require a lot of processing, the CPU is freed up to perform other tasks.

We tried disabling it on the server (see the command below), but it did not help.
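
On Windows Server 2008 and later the feature can be toggled with netsh from an elevated command prompt; the command we ran was essentially:

netsh int tcp set global chimney=disabled

(netsh int tcp show global confirms the current state of the setting.)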

#3 Deadlock
We checked for deadlock events using Profiler, but there were none in the trace. The query below is an alternative way to check without running a trace.
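
Assuming SQL Server 2008 or later, the built-in system_health Extended Events session also captures deadlock reports, so a rough sketch of a query to pull them from its ring buffer is:

-- pull xml_deadlock_report events from the system_health ring buffer
select xed.xevent.query('.') as deadlock_graph
from (
    select cast(st.target_data as xml) as target_data
    from sys.dm_xe_session_targets st
    join sys.dm_xe_sessions s
        on s.address = st.event_session_address
    where s.name = 'system_health'
      and st.target_name = 'ring_buffer'
) as rb
cross apply rb.target_data.nodes('RingBufferTarget/event[@name="xml_deadlock_report"]') as xed(xevent);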
 
#4 Set a static IP address on the NIC
This is what finally solved the issue for us.



Hope it helps.

Thursday, August 08, 2013

Why is the Data Loader importing dates incorrectly?


We were trying to upload records to an object. Most of the dates uploaded exactly as defined in the CSV file, but others were one day out, e.g. 2-Jul uploaded as 1-Jul.

It turns out that the Time Zone setting within Data Loader has a big impact on how dates are interpreted.

Solution #1:

We changed the Time Zone setting to "GMT", and all the dates loaded without issue.
We also checked the "Use European date format (dd/mm/yyyy)" option within Data Loader.

Another option we worked out is specifying these settings in the process-conf.xml file.

Solution #2 (for Automating Data Loader)
process-conf.xml
....


....

We specified the dates in the CSV file as dd/mm/yyyy.
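
The XML entries above did not survive the blog formatting, but the settings we mean are the Data Loader time-zone and European date format parameters; a sketch of the relevant entries inside the process bean's configOverrideMap (key names assumed from the Data Loader configuration parameters, all other entries omitted) would look like:

<!-- inside <property name="configOverrideMap"><map> ... </map></property> of the process bean -->
<!-- key names assumed; check the Data Loader documentation for your version -->
<entry key="sfdc.timezone" value="GMT"/>
<entry key="sfdc.useEuropeanDates" value="true"/>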

Hope it helps.