Monday, August 26, 2013

Temp table (# or ##) versus Table Variable (@)


A table variable follows the same scoping rules as any other programming variable: if you define it inside a stored procedure, it can't be accessed outside that stored procedure. Table variables become very powerful when combined with stored procedures, either to pass sets of rows in as read-only table-valued parameters (new functionality available starting with SQL Server 2008) or to store the result of a table-valued function.
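For illustration, a minimal sketch of passing a table variable's rows into a stored procedure via a table-valued parameter (all object names here are illustrative):

```sql
-- Sketch: a user-defined table type used as a read-only parameter.
CREATE TYPE dbo.OrderIdList AS TABLE (OrderId INT PRIMARY KEY);
GO
CREATE PROCEDURE dbo.GetOrders
    @Ids dbo.OrderIdList READONLY   -- table-valued parameters must be READONLY
AS
    SELECT o.*
    FROM dbo.Orders AS o
    JOIN @Ids AS i ON i.OrderId = o.OrderId;
GO
-- Caller fills a variable of the table type and passes it in.
DECLARE @Ids dbo.OrderIdList;
INSERT @Ids VALUES (1), (2);
EXEC dbo.GetOrders @Ids;
```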

·         The table variable is NOT necessarily memory resident. Under memory pressure, the pages belonging to a table variable can be pushed out to tempdb.

·         Creating a table variable is like a regular DDL operation: its metadata is stored in the system catalogs.

·         Table variables don’t participate in user transactions or locking.

·         Operations on a table variable are only minimally logged and are not rolled back with the user transaction.

·         No DDL is allowed on table variables. So if you have a large rowset that needs to be queried often, you may want to use a #table where possible so that you can create appropriate indexes. You can partially work around this by declaring PRIMARY KEY or UNIQUE constraints when you declare the table variable.

·         Finally, no statistics are maintained on table variables, which means that changes to the data in a table variable will not cause recompilation of the queries that access it.
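A small sketch of the transaction point above: a table variable keeps its rows across a ROLLBACK, while a #temp table does not (the names are illustrative):

```sql
DECLARE @t TABLE (Id INT);
CREATE TABLE #t (Id INT);

BEGIN TRAN;
    INSERT INTO @t VALUES (1);
    INSERT INTO #t VALUES (1);
ROLLBACK;

SELECT COUNT(*) FROM @t;   -- 1: the table variable kept its row
SELECT COUNT(*) FROM #t;   -- 0: the insert into the #temp table was rolled back
```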

 

Difference between table variable and temp table:

·         In functions, only table variables can be used. Conversely, if you need to write to the table in a child scope, only a #temp table will do (table-valued parameters allow read-only access).

·         Indexes: If you need an index that cannot be created implicitly through a UNIQUE or PRIMARY KEY constraint, then you need a #temporary table, as it is not possible to create these on table variables. (Examples of such indexes are non-unique ones, filtered indexes, or indexes with INCLUDEd columns.) NB: SQL Server 2014 will allow non-unique indexes to be declared inline for table variables.

·         If you will be repeatedly adding and deleting large numbers of rows from the table, then use a #temporary table. It supports TRUNCATE, which is more efficient than DELETE, and subsequent inserts following a TRUNCATE can also perform better than those following a DELETE.

·         If the optimal plan for queries using the table will vary depending on the data, then use a #temporary table. It supports the creation of statistics, which allows the plan to be dynamically recompiled according to the data (though for cached temporary tables in stored procedures the recompilation behaviour needs to be understood separately).

·         If the optimal plan for the query using the table is unlikely to ever change, then you may consider a table variable to skip the overhead of statistics creation and recompiles (this would possibly require hints to fix the plan you want).

·         If the source of the data inserted into the table is a potentially expensive SELECT statement, then consider that using a table variable blocks the possibility of a parallel plan for that insert.

·         If you need the data in the table to survive a rollback of an outer user transaction, then use a table variable. A possible use case is logging the progress of different steps in a long SQL batch.

·         When using a #temp table within a user transaction, locks are held longer than for table variables, and it can also prevent truncation of the tempdb transaction log until the user transaction ends. So this might favour the use of table variables.

·         Within stored routines, both table variables and temporary tables can be cached. The metadata maintenance for cached table variables is less than that for #temporary tables. Bob Ward points out in his tempdb presentation that #temporary tables can cause additional contention on system tables under conditions of high concurrency. Additionally, when dealing with small quantities of data this can make a measurable difference to performance.
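To make the index point above concrete, a sketch of the kinds of indexes each object supports (table and column names are illustrative):

```sql
-- #temp table: any index is allowed, including non-unique ones added after creation.
CREATE TABLE #Orders (OrderId INT, CustomerId INT, Amount MONEY);
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId ON #Orders (CustomerId);

-- Table variable (pre-2014): indexes only via inline PRIMARY KEY / UNIQUE
-- constraints; no CREATE INDEX afterwards.
DECLARE @Orders TABLE
(
    OrderId INT PRIMARY KEY,          -- implicit clustered unique index
    CustomerId INT,
    Amount MONEY,
    UNIQUE (CustomerId, OrderId)      -- implicit unique index
);
```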

 


Thursday, May 16, 2013

SSAS Error: Internal error: Invalid enumeration value.

I have seen a few scenarios where, when you perform an action on Analysis Services 2008 or 2008 R2, you get a weird error message like:
  1. Internal error: Invalid enumeration value. Please call customer support! is not a valid value for this element.
  2. An error occurred while parsing the 'StorageMode' element at line 1, column 6751 ('http://schemas.microsoft.com/analysisservices/2003/engine' namespace) under Load/ObjectDefinition/Dimension/StorageMode.
  3. Errors in the metadata manager. An error occurred when instantiating a metadata object from the file, '\\?\H:\SSAS\OLAP\Data\CubeName\DimensionName.12.dim.xml'
Reason: the database files/stores got corrupted.
Possible causes of corruption:
- Analysis Services was restarted while a cube was being processed.
- Analysis Services was restarted during a backup.
- Sometimes an antivirus program locks a file; if the services are restarted in that state, it can cause corruption.

Resolution:
Method 1: If you have permission to create OLAP databases in SSMS, then deploy to a new SSAS target database, or delete and recreate the target database.

Method 2: If you want to perform an incremental deployment, execute a DISCOVER_XML_METADATA request with the ExpandFull option in the XMLA script editor.
The full command syntax is:


 Then in the BIDS OLAP solution file:
  • Right-click the dimension reported in the problem and click "View Code".
  • Type a character and delete it so that the file appears modified (has a * by its name).
  • Save the file and process the dimension alone, without processing the cube.
  • After this, processing the cube works fine.
Method 3: (Easiest and fastest method)
  • Unprocess the dimension reported by the error message; in my case, 'Task Created on Date.dim'.
  • Then deploy the whole OLAP database from BIDS (Visual Studio).

Method 4:
  • Stop SSAS services.
  • Go into the Data directory, find the ..db.xml file and delete it
  • Delete the corresponding .db directory.
  • Then restart SSAS services.
  • The error should be gone, and you can hopefully redeploy and reprocess it.


How to avoid the file corruption in future:
While doing any maintenance task on the server, such as patching, server updates, or a restart, ensure that no user is connected to Analysis Services. You can verify this by checking Profiler traces.

 This problem has been fixed in SSAS 2012.

Tuesday, April 16, 2013

Search for a column in a database

Search for the column in tables or views:

SELECT *
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME LIKE '%Total Amount%'
ORDER BY 3, 4   -- TABLE_NAME, COLUMN_NAME

 Search for text in the definitions of all database objects:

SELECT OBJECT_NAME(object_id),
       definition
FROM sys.sql_modules
WHERE definition LIKE '%Total Amount%'

 
Search for the column only in procedures:

SELECT DISTINCT OBJECT_NAME(object_id),
       OBJECT_DEFINITION(object_id)
FROM sys.procedures
WHERE OBJECT_DEFINITION(object_id) LIKE '%Total Amount%'
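A pattern match on object definitions can also return false positives (for example, matches inside comments). As a variant, a sketch that looks a column up by name through the catalog views (the column name is illustrative):

```sql
-- Sketch: find every table or view with a column named exactly 'TotalAmount'.
SELECT SCHEMA_NAME(o.schema_id) AS SchemaName,
       o.name  AS ObjectName,
       o.type_desc,
       c.name  AS ColumnName
FROM sys.columns AS c
JOIN sys.objects AS o ON o.object_id = c.object_id
WHERE c.name = N'TotalAmount'
ORDER BY SchemaName, ObjectName;
```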

Tuesday, February 26, 2013

MDX: Last Available value

As explained in my previous post, last-non-empty-value-in-dax, we might have to report the last available value of, say, the balance amount in a bank account. To do that we need to travel back along the time axis to get the last non-empty value for a given measure at any point in time.

There are a couple of methods you can follow, depending on the scenario:
Method 1: A simple recursive calculation that gets the last available value from the cube.

WITH
 MEMBER [Measures].[ProdAmount] AS
  IIF(
    ISEMPTY([Measures].[Amount]),
      ([FiscalPeriod].[Calendar].PREVMEMBER,[Measures].[ProdAmount])
      ,[Measures].[Amount])

SELECT [Measures].[ProdAmount] on 0,
[FiscalPeriod].[Calendar].[Fiscal Month] on 1
FROM (SELECT [Product].[Product Name].&[Bike]     ON 0
      FROM [Project])

 One point to remember: the cube should have data for at least one member of the dimension combination we are querying on. If we need the last available product amount, then for every product in the product dimension we need an Amount for at least one day in the time dimension; otherwise Measures.ProdAmount recurses indefinitely.

Method 2: You can overcome the above problem using the NonEmpty and Tail functions. The following query gives the latest product amount for a particular product in the given year. The calculated measure ProdAmountLA first builds the set of non-empty (month, Amount) tuples, and then takes the latest tuple for the given product.
If there is no data for a given product on any date in the time dimension, then Measures.ProdAmountLA returns NULL.
MEMBER [Measures].[ProdAmountLA] AS
TAIL(
  NONEMPTY(
    [FiscalPeriod].[FiscalMonth].CHILDREN*[Measures].[Amount]),1).ITEM(0)


Method 3: There are scenarios where NULL values in the cube are automatically replaced with 0 by default. This happens when the NullProcessing property of the measure is set to 'Automatic' instead of 'Preserve'. http://msdn.microsoft.com/en-us/library/ms345138(SQL.90).aspx

In this case we cannot use the NonEmpty function, because the nulls were stored as 0 in the cube, so those cells are not empty. Instead, we keep only the values that are not 0:
MEMBER [Measures].[ProdAmountLA] AS
   TAIL(
     Filter(
         [FiscalPeriod].[FiscalMonth].CHILDREN *[Measures].[Amount]
         ,[Measures].[Amount] <> 0),1).ITEM(0)




More Details: http://cwebbbi.wordpress.com/2011/03/24/last-ever-non-empty-a-new-fast-mdx-approach/

Tuesday, January 8, 2013

Flagging duplicates without using any temp table or row_id



-- Flag duplicate rows (same slno, C1, C2, C3) so that one row per group keeps C4 unset.
WITH TBLCTE (slno, C1, C2, C3, C4, Ranking)
AS
(
    SELECT slno, C1, C2, C3, C4,
           -- NEWID() gives an arbitrary order, so DENSE_RANK numbers the rows
           -- of each duplicate group 1..n in no particular order.
           Ranking = DENSE_RANK() OVER (PARTITION BY slno, C1, C2, C3 ORDER BY slno, NEWID())
    FROM trans
    WHERE C4 <> 'Y'
)
UPDATE TBLCTE SET C4 = 'Y' WHERE Ranking > 1;

SELECT * FROM trans ORDER BY slno, C1, C2, C3;
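A minimal sketch of how this behaves, with an illustrative trans table (the table definition and data are assumptions for the demo):

```sql
-- Illustrative setup: three rows, two of which are duplicates on (slno, C1, C2, C3).
CREATE TABLE trans (slno INT, C1 VARCHAR(10), C2 VARCHAR(10), C3 VARCHAR(10), C4 CHAR(1));
INSERT INTO trans (slno, C1, C2, C3, C4) VALUES
    (1, 'a', 'b', 'c', 'N'),
    (1, 'a', 'b', 'c', 'N'),   -- duplicate of the row above
    (2, 'x', 'y', 'z', 'N');

-- After running the UPDATE above, exactly one of the two duplicate rows
-- has C4 = 'Y'; the unique row keeps C4 = 'N'.
```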

Monday, January 7, 2013

Excel validate filename


        // Part of an SSIS Script Task class; assumes the usual usings:
        // System, System.IO, System.Data, System.Data.OleDb.
        private DateTime dtFileExtract;   // file extract date parsed from the source file name

        public void Main()
        {
            string strFileName = string.Empty;
            bool bFileSuccess;
            string strUniqueFileName = string.Empty;

            strFileName = GetSourceFileName(out bFileSuccess);
            if (!strFileName.Equals(string.Empty))
            {
                if (bFileSuccess)
                {
                    LogFileName(strFileName, bFileSuccess);
                    Dts.Variables["pExtractDate"].Value = dtFileExtract;//Set File extract date
                }
                else
                {
                    Dts.TaskResult = (int)ScriptResults.Failure;
                    LogFileName(strFileName,bFileSuccess, "Wrong source file format");
                }
            }
            else
            {
                Dts.TaskResult = (int)ScriptResults.Success;
            }
            Dts.Variables["pVarFileName"].Value = strFileName;

            ////strUniqueFileName = System.DateTime.Now.ToString("dd-MM-yyyy") + System.DateTime.Now.TimeOfDay.Milliseconds.ToString() + ".xls";
     
            //////Create a copy of file
            ////FileInfo fiSourceFile = new FileInfo(strSourceFile);
            ////fiSourceFile.CopyTo(Dts.Variables["InputFileFolder"].Value.ToString() + strUniqueFileName, true);
         
            ////Dts.Variables["TabFileSource"].Value = Dts.Variables["InputFileFolder"].Value.ToString() + strUniqueFileName;
        }
        public String GetSourceFileName(out bool bFileSuccess)
        {
            String strRetSourceFile = string.Empty;
            Boolean blDisableTaskFlag = false;
            String strSourceFolder = Dts.Variables["InputFileFolder"].Value.ToString();
            String strSourceFileTemplate = Dts.Variables["FileNameTemplate"].Value.ToString().ToLower();
            DirectoryInfo diSourceFolder = new DirectoryInfo(strSourceFolder);
            FileInfo[] fiSourceFiles = diSourceFolder.GetFiles();
            bFileSuccess = true;
            foreach (FileInfo fiSourceFile in fiSourceFiles)
            {
                if (fiSourceFile.Name.ToLower().Contains(strSourceFileTemplate))
                {
                    strRetSourceFile = fiSourceFile.Name;
                    String strDatePart = strRetSourceFile.Substring(strSourceFileTemplate.Length);
                    strDatePart = strDatePart.Substring(0, strDatePart.Length - 4);
                    Dts.Variables["FileSource"].Value = fiSourceFile.FullName;
                 
                    //Form File extraction date

                    if (!DateTime.TryParse(strDatePart, out dtFileExtract))
                    {
                        bFileSuccess = false;
                    }
                    blDisableTaskFlag = false;
                    break;
                }
                else
                {
                    blDisableTaskFlag = true;
                }
            }
         
            if (strRetSourceFile.Equals(string.Empty))
            {
                blDisableTaskFlag = true;
                //blDisableTaskFlag = false;
            }
            Dts.Variables["DisableTask"].Value = blDisableTaskFlag;
            return strRetSourceFile;
        }
        public void LogFileName(string strFileName,bool bSuccess)
        {
            LogFileNameEx(strFileName,bSuccess, "");
         
        }
        public void LogFileName(string strFileName,bool bSuccess,string strMessage)
        {
            LogFileNameEx(strFileName,bSuccess, strMessage);
        }
        private void LogFileNameEx(string strFileName, bool bSuccess,string strMessage)
        {
            OleDbConnection sqlConnection = new OleDbConnection(Dts.Connections["ITGWarehouse"].ConnectionString);
            using (OleDbCommand cmdUspFileProcessingDetailsOnBegin = new OleDbCommand("MD.usp_GetFileProcessingList"))
            {
                cmdUspFileProcessingDetailsOnBegin.Connection = sqlConnection;
                cmdUspFileProcessingDetailsOnBegin.CommandType = CommandType.StoredProcedure;
                cmdUspFileProcessingDetailsOnBegin.Parameters.Add("@pExecutionLogId", SqlDbType.Int).Value = Dts.Variables["LogID"].Value;
                cmdUspFileProcessingDetailsOnBegin.Parameters.Add("@pFileType", SqlDbType.VarChar).Value = Dts.Variables["FileNameTemplate"].Value;
                cmdUspFileProcessingDetailsOnBegin.Parameters.Add("@pFileName", SqlDbType.VarChar).Value = strFileName;
                cmdUspFileProcessingDetailsOnBegin.Parameters.Add("@pMessage", SqlDbType.VarChar).Value = strMessage;
                if (bSuccess)
                    cmdUspFileProcessingDetailsOnBegin.Parameters.Add("@pExtractDate", SqlDbType.DateTime).Value = dtFileExtract;
                else
                    cmdUspFileProcessingDetailsOnBegin.Parameters.Add("@pExtractDate", SqlDbType.DateTime).Value = DBNull.Value;
                try
                {
                    sqlConnection.Open();
                    cmdUspFileProcessingDetailsOnBegin.ExecuteNonQuery();
                }
                finally
                {
                    sqlConnection.Close();
                    sqlConnection.Dispose();
                }
            }

        }

Excel retrieve sheet by sheet

Excel connection:

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Source\Product\20081202_Product.xls;Extended Properties="Excel 8.0;HDR=NO";

"Provider=Microsoft.Jet.OLEDB.4.0;Data Source="+@[User::filenameProduct]+";Extended Properties=\"Excel 8.0;HDR=NO\";"

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\assignment related\Source\Product\20081202_Product.xls;Extended Properties="Excel 8.0;HDR=YES";



  public void Main()
        {
            ////Microsoft.SqlServer.Dts.Runtime.ConnectionManager cm = Dts.Connections["HS_Incidents"];
            ////IDTSConnectionManagerDatabaseParameters100 cmParams = cm.InnerObject as IDTSConnectionManagerDatabaseParameters100;
            ////OleDbConnection conn = cmParams.GetConnectionForSchema() as OleDbConnection;
            ////conn.Close();
         
            string excelFile;
            string connectionString;
            OleDbConnection excelConnection;
            DataTable tablesInFile;
            int tableCount = 0;
            string currentTable;
            int tableIndex = 0;

            excelFile = Dts.Variables["FileSource"].Value.ToString();
            connectionString = "Provider=Microsoft.Jet.OLEDB.4.0;" +
                "Data Source=" + excelFile + ";Extended Properties=Excel 8.0";
            excelConnection = new OleDbConnection(connectionString);
            excelConnection.Open();
            tablesInFile = excelConnection.GetSchema("Tables");
            tableCount = tablesInFile.Rows.Count;

            string[] excelTables = new string[tableCount];
         
            foreach (DataRow tableInFile in tablesInFile.Rows)
            {
                currentTable = tableInFile["TABLE_NAME"].ToString();
                //GetRowsOfCurrentTable(currentTable, excelConnection.GetOleDbSchemaTable(
                excelTables[tableIndex] = "Select [" + currentTable + "].* From " + "["+ currentTable + "]";
                tableIndex += 1;
            }
            excelConnection.Close();
            //System.IO.FileInfo x = new System.IO.FileInfo(Dts.Variables["FileSource"].Value.ToString() );
            //x.Delete();
            //System.IO.File.Move(excelFile, Dts.Variables["ValidSourceFile"].Value.ToString());
            Dts.Variables["ExcelTables"].Value = excelTables;

            Dts.TaskResult = (int)ScriptResults.Success;

        }
    }