Dec 14 2014

IIS asynchronous processing when using Session

Category: Administrator @ 11:33

First, some background.  This all came about because we have a new developer who uses all the latest JavaScript front-end tools and practices, and he develops against Apache locally since he is not a Microsoft guy.  So until we could get him into Visual Studio, he had gotten along using Apache to develop and even demo the application.

Lo and behold, his development workstation running Apache was more than double the speed of IIS on a Windows 2008 R2 server, even though he has more network hops to reach the data he needs, while all the resources for IIS sit on the same subnet.

I couldn't figure out how and why Apache was working so much faster.

Here's how the application works.  The JavaScript front end makes seven asynchronous calls to the web service to bring back all the data for the home page.  Now that we're moving to a stable, IIS-based development platform, I ported his code to C# and expected IIS to run the seven asynchronous calls just like Apache.  Well, it doesn't.

I had run the gamut on this, from IIS worker threads to all sorts of machine configuration and web configuration changes.  None of it worked.

Finally I had the network guy put Wireshark on it, and it revealed that Apache was opening seven ports to complete the calls to the web service while IIS was opening only one.  Until he showed me that, I was completely stumped.

Basically, IIS was serializing the calls.  Because every request had read/write access to the Session, ASP.NET took an exclusive lock on the session state for each one, so the requests queued up behind each other instead of running concurrently; that is why everything came back over a single connection.  IIS assumes you intend to write to the Session and prevents a race condition.

All you have to do is add a page directive that makes the session state read only.  We do this on the subsequent (data getter) pages after the initial Login page, so only the Login page writes the session variable.

All other pages just read the variable, and that permits IIS to open as many connections as necessary to satisfy the asynchronous calls.  You'll probably never hit this, but it's good to know that IIS can still do its job.
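For reference, on a Web Forms page the declaration is just EnableSessionState="ReadOnly" in the @ Page directive.  If your data getters are generic handlers or services rather than pages, the equivalent is the IReadOnlySessionState marker interface.  Here's a minimal sketch (the handler name and session key are made up for illustration):

// A minimal sketch of a "data getter" endpoint that only reads the Session.
// Implementing IReadOnlySessionState (instead of IRequiresSessionState) tells
// ASP.NET not to take the exclusive session lock, so concurrent AJAX calls
// from the same session are no longer serialized.
using System.Web;
using System.Web.SessionState;

public class DataGetterHandler : IHttpHandler, IReadOnlySessionState
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Read-only access to the value the Login page stored (hypothetical key).
        object userId = context.Session["UserId"];

        context.Response.ContentType = "application/json";
        context.Response.Write("{\"userId\":\"" + userId + "\"}");
    }
}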

Here's where I found the answer (thank goodness since managers were ready to jettison IIS):
 

http://johnculviner.com/asp-net-concurrent-ajax-requests-and-session-state-blocking/

 

 


Jun 28 2014

Comparing String Values for Similarities

Category: C# | SQL Server. Administrator @ 12:41

Many moons ago I embarked on a proof-of-concept project to see if I could use SQL Server to perform a matching process.

It was successful, but I still had a lingering suspicion that the underlying algorithm (Levenshtein) used to determine the sameness of human full names was less than optimal.  So what ensued was a period where I would look around the web, once in a while, to see if I had missed something.

I can't tell you how many algorithms I researched, converted to C# and tested.  Don't ask me why, but anyone who starts with Levenshtein will most likely never find this algorithm since so many others like Levenshtein vie for attention.

I finally found what I was looking for purely by accident:

Strike-A-Match: http://www.catalysoft.com/articles/StrikeAMatch.html

and kudos to the pastebin poster for the C# version: http://pastebin.com/EfcmR3Xx#

So now if you ever need to compare human full names regardless of word order, for example:

  • last name, first name compared to first name, last name

Strike-A-Match will do the trick.  Take a look at this comparison of a human name: "Jimi Hendrix" to "Hendrix Jimi".

Strike-A-Match will score these two as exactly equivalent.
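If you want to see why word order doesn't matter, here is a rough C# sketch of the letter-pair idea (my own boiled-down version for illustration, not the pastebin code):

// Strike-A-Match style similarity: Dice coefficient over adjacent letter pairs,
// where pairs are built per word so they never span a space.
using System;
using System.Collections.Generic;

static class LetterPairSimilarity
{
    static List<string> WordLetterPairs(string s)
    {
        var pairs = new List<string>();
        foreach (string word in s.ToUpperInvariant().Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries))
            for (int i = 0; i < word.Length - 1; i++)
                pairs.Add(word.Substring(i, 2));
        return pairs;
    }

    // Returns 0.0 (nothing in common) through 1.0 (same letter pairs).
    public static double Compare(string s1, string s2)
    {
        List<string> pairs1 = WordLetterPairs(s1);
        List<string> pairs2 = WordLetterPairs(s2);
        int union = pairs1.Count + pairs2.Count;
        if (union == 0) return 0.0;

        int intersection = 0;
        foreach (string pair in pairs1)
        {
            int j = pairs2.IndexOf(pair);
            if (j >= 0)
            {
                intersection++;
                pairs2.RemoveAt(j);   // count duplicate pairs only as often as they occur
            }
        }
        return (2.0 * intersection) / union;
    }
}

// LetterPairSimilarity.Compare("Jimi Hendrix", "Hendrix Jimi") returns 1.0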

Enough said.

I embedded this into a SQLCLR function and it works like a charm.
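The SQLCLR wrapper is nothing more than a static method with the SqlFunction attribute.  A sketch of the shape (the class and function names here are mine, and it reuses the Compare method sketched above):

// Exposes the comparison to T-SQL as a scalar function once the assembly is
// deployed with CREATE ASSEMBLY / CREATE FUNCTION.
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public static class StringSimilarity
{
    [SqlFunction(IsDeterministic = true, IsPrecise = false)]
    public static SqlDouble LetterPairScore(SqlString name1, SqlString name2)
    {
        if (name1.IsNull || name2.IsNull) return SqlDouble.Null;
        return LetterPairSimilarity.Compare(name1.Value, name2.Value);
    }
}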

Levenshtein and all its brethren really don't get the job done when all you really want to do is compare for similarity.

The web has the brightest ideas but try to look in all the dark corners.


Feb 19 2013

SQL Server Async Stored Procedures

Category: SQL Server | Stored Procedure. Administrator @ 08:05

As the saying goes, better late than never, and that goes double for this technique for executing stored procedures.

What I'm referring to is a post from Remus Rusanu, who explains in detail how to run stored procedures asynchronously:

http://rusanu.com/2009/08/18/passing-parameters-to-a-background-procedure/

In the set of two articles, he explains how to take a stored procedure, even one with parameters, and run it without holding a client connection.

This is extremely useful if you need to let users run long-running procedures.

You may be asking yourself: why not just create a SQL Agent job and run that from the client, which has the same effect since SQL Agent jobs run asynchronously?  That's true, but this technique is more flexible because you can pass parameters from the client, whereas SQL Agent jobs have no facility for passing parameters directly to the job.

He includes a detailed script for installing his helper stored procedures and the procedure that handles the Service Broker side.  Yes, I said Service Broker.

Don't get overly concerned: Service Broker is quite easy to set up and use.

I won't go into his solution, since you've probably already read it; what I offer here are just some small modifications that make it a little nicer to use.

To get started, all you need is to have Service Broker enabled by the DBA:

ALTER DATABASE xxxxxx SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
ALTER DATABASE xxxxxx SET ENABLE_BROKER;
ALTER DATABASE xxxxxx SET MULTI_USER;

 

Once this is done you're good to install all the Stored Procedures from Remus.

To confirm Service Broker is enabled:

SELECT is_broker_enabled FROM sys.databases WHERE name = 'xxxxxx';

 

One of the modifications I made to the set of procedures was to be able to see the Start_Time of the running stored procedure.

In Remus's routines, all execution happens within a transaction, which proves invaluable since you'll need the rollback capability in case of failure.

But I don't think the Start_Time needs to participate in that rollback, and I want to display the Start_Time to the users who requested the run of the procedure.

So all I did was add a call to a CLR procedure that updates the Start_Time.  Because a CLR procedure can open its own connection back to the server (rather than using the context connection) and skip enlisting in the ambient transaction, the Start_Time gets committed and is visible immediately.
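Something along these lines is all it takes (a sketch of the approach, not the actual ExecuteSQLNoQuery code; it assumes the token is a uniqueidentifier and the assembly has EXTERNAL_ACCESS permission):

// Updates Start_Time on its own loopback connection with Enlist=false, so the
// UPDATE commits immediately instead of waiting on the caller's transaction.
using System;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public static class AsyncExecHelpers
{
    [SqlProcedure]
    public static void SetTaskStartTime(Guid token)
    {
        // Illustrative connection string; Enlist=false is the important part.
        using (var conn = new SqlConnection(
            "Data Source=localhost;Initial Catalog=xxxxxx;Integrated Security=SSPI;Enlist=false"))
        using (var cmd = new SqlCommand(
            "UPDATE dbo.tblTaskResults SET Start_Time = GETDATE() WHERE [token] = @token", conn))
        {
            cmd.Parameters.AddWithValue("@token", token);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}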

I made the change here in the AsyncExecActivated procedure:

 

begin try;
    receive top (1).......

if (@messageTypeName = N'DEFAULT'....

--CLR procedure
EXECUTE ExecuteSQLNoQuery

 

I also included a way to cancel a waiting procedure which is next in line to be handled by Service Broker execution.

It's crude, but it allows you to stack up requests and act on behalf of the user to cancel inadvertent requests.

Just update the error_message with anything (before it runs) for that task and it will effectively be cancelled.

select @error_message = error_message from dbo.tblTaskResults
      where [token] = @token;

--see if the request was canceled
if (@messageTypeName = N'Default' and @error_message is null)

 

I've put together an ASP.NET page that lets users initiate and cancel queued requests.
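The cancel button on that page is just a parameterized UPDATE that stamps the error_message for the chosen token before the activated procedure gets to it.  A sketch of the code-behind call (the method and the Start_Time guard are illustrative; adjust to your own table):

// Cancels a queued task by writing something into error_message before it runs.
using System;
using System.Data.SqlClient;

public static class TaskQueueActions
{
    public static void CancelTask(string connectionString, Guid token, string user)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            @"UPDATE dbo.tblTaskResults
                 SET error_message = @reason
               WHERE [token] = @token
                 AND Start_Time IS NULL;", conn))   // only if it hasn't started yet
        {
            cmd.Parameters.AddWithValue("@reason", "Cancelled by " + user);
            cmd.Parameters.AddWithValue("@token", token);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}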

 

I've included my script for those who would like to build onto my recipe for Asynchronous Procedure Execution:

http://sdrv.ms/12r1d7C

I drive the whole web page, and the tasks available for execution, solely from a SQL configuration table.

This technique frees the developer to add new tasks at any time, and you no longer need to keep adding SQL Agent jobs.

Developers need time to dive into subject matter.  Heads-down developing is a trap.


Sep 30 2012

SSIS Package Execution with C# -- SQL Server

Category: C# | SQL Server. Administrator @ 08:16

There are times when it seems like IT management decisions are arbitrary and capricious.  This is one of them!

As with most shops, we have SQL Agent running the SSIS production packages, but as we migrate to newer servers we are now instructed to no longer use SQL Agent for execution on the SQL Server.

OK -- so what's the substitute? 

I'm told to use .bat files executed through some other agent tool, or to use xp_cmdshell.

I feel like I'm going backwards in time and not moving forward with how an operational environment should be architected.  But thankfully some clever individuals have already paved the path to a more beautiful world.

And that world is C# execution of the packages.  You may be asking what the heck I'm talking about, but SSIS packages can run anywhere.  I know this sounds unconventional, but you can execute packages wherever you have the dependent DLLs and the appropriate .NET Framework installed.  Take a read through the following blog entry and you'll quickly see how to get SSIS packages running from any Windows server, not just a Windows server running SQL Server:

Running Packages from C#

The beauty of doing it this way is that you can now have any Windows server running a service which executes the packages, and base that execution on a database configuration.

Benefits:

  • Configuration files:
    • Running the SSIS package through C# allows you to pass in "User Variables" to the packages.  So just read what you want from a SQL configuration table and pass it to the package.
    • That's right: you no longer have to maintain config files in folders; that maintenance moves to a configuration table in SQL Server.
            // pkgName comes from the SQL configuration table; strip any stray quotes
            pkgLocation = Path.Combine(pkgLocation, pkgName.Replace("\"", ""));
            DtsLogging mylogger = new DtsLogging();
            mylogger.Initialize(pkgName);
            Application app = new Application();

            //Package pkg = app.LoadPackage(pkgLocation, eventListener);
            Package pkg = app.LoadPackage(pkgLocation, null);

            // push values read from the configuration table into the package's User variables
            pkg.Variables["User::DmatchDataSource"].Value = pkgDmatchDataSource;
            pkg.Variables["User::DmatchUserId"].Value = pkgDmatchUserId;
            pkg.Variables["User::DmatchPassword"].Value = pkgDmatchPassword;
  • Error handling:
    • Make a consistent approach to your applications for error logging.  An errors collection is exposed from the C# package execution so that you can keep all your application logging in one place (see the sketch right after this list).
    • No more looking at an application log for one thing and SQL Agent history for another event.
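For example, once Execute() returns you can walk the package's Errors collection and push everything into the same log as the rest of the application (a sketch; pkg and mylogger are the objects shown above, and eventLogSimple is the same simple event-log wrapper used further down):

DTSExecResult pkgResults = pkg.Execute(null, null, null, mylogger, null);

if (pkgResults == DTSExecResult.Failure)
{
    // Each DtsError carries the source component and a description.
    foreach (DtsError err in pkg.Errors)
    {
        eventLogSimple.WriteEntry("Pkg: " + pkg.Name +
            ", Source: " + err.Source + ", Error: " + err.Description);
    }
}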

In my case I wanted to capture the rows being sent across the wire to SQL Server (OnPipelineRowsSent), and that can be captured with logging enabled:

            pkg.LoggingOptions.EventFilterKind = DTSEventFilterKind.Inclusion;
            pkg.LoggingOptions.EventFilter = new string[] { "OnPipelineRowsSent" };

            DTSEventColumnFilter ecf = new DTSEventColumnFilter();
            ecf.MessageText = true;
            pkg.LoggingOptions.SetColumnFilter("OnPipelineRowsSent", ecf);
            pkg.LoggingMode = DTSLoggingMode.Enabled;
   
            DTSExecResult pkgResults = pkg.Execute(null,null,null,mylogger,null);

Here is the DtsLogging class:

internal class DtsLogging : IDTSLogging
{
    ulong rowsprocessed = 0;
    Stopwatch stpWatch = new Stopwatch();
    string pkgName = "";

    // IDTSLogging: returning true tells the runtime this logger is active.
    public bool Enabled
    {
        get { return true; }
    }

    public void Initialize(string Pkgname)
    {
        pkgName = Pkgname;
        stpWatch.Reset();
        stpWatch.Start();
    }

    public void Log(string eventName, string computerName, string operatorName, string sourceName, string sourceGuid, string executionGuid, string messageText, DateTime startTime, DateTime endTime, int dataCode, ref byte[] dataBytes)
    {
        switch (eventName)
        {
            case "OnPipelineRowsSent":
            {
                if (messageText == null)
                {
                    break;
                }

                if (messageText.StartsWith("Rows were provided to a data flow component as input."))
                {
                    // The row count is the last token of the message text.
                    string rowsText = messageText.Substring(messageText.LastIndexOf(' '));
                    ulong rowsSent = ulong.Parse(rowsText);

                    // Only count rows coming out of the source output.
                    if (messageText.Contains(" OLE DB Source Output "))
                    {
                        LogRowProcessedInfo(rowsSent);
                    }
                }
            }
            break;
        }
    }

    public bool[] GetFilterStatus(ref string[] eventNames)
    {
        // No additional per-event filtering beyond what LoggingOptions applies.
        return new bool[] { };
    }

    void LogRowProcessedInfo(ulong rowsSent)
    {
        rowsprocessed += rowsSent;
        // Include further implementation for logging to db and text file.
        // Write a running total to the event log at most once per configured interval.
        if (stpWatch.Elapsed.Minutes >= Convert.ToInt32(Config.Instance().EventMessageTimeInterval))
        {
            stpWatch.Reset();
            eventLogSimple.WriteEntry("Pkg: " + pkgName + ", PipelineRowsSent: " + rowsprocessed.ToString());
            stpWatch.Start();
        }
    }
}

In my case I made a Windows service to run the packages.  This way the C# code looks at a database to schedule when to execute a particular SSIS package.  The dtsx files are kept locally on the application server and the C# code loads them and runs them locally.  This ends up using the resources of the application server and there's no resource impact (my DBA loves this fact) felt on the SQL server.
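The service itself is little more than a timer that polls a schedule table and hands any due packages to the same loading code shown above.  A rough sketch (the schedule table and its columns are invented for illustration):

// Polling loop inside the Windows service: run whatever the configuration
// table says is due. dbo.tblPackageSchedule, PkgPath and NextRunTime are
// illustrative names only.
using System;
using System.Data.SqlClient;
using Microsoft.SqlServer.Dts.Runtime;

public class PackageRunner
{
    public void RunDuePackages(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT PkgPath FROM dbo.tblPackageSchedule WHERE NextRunTime <= GETDATE()", conn))
        {
            conn.Open();
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                Application app = new Application();
                while (rdr.Read())
                {
                    // The .dtsx lives on this application server, so loading and
                    // executing it here keeps the work off the SQL Server box.
                    Package pkg = app.LoadPackage(rdr.GetString(0), null);
                    DTSExecResult result = pkg.Execute();
                    // TODO: log result / pkg.Errors and advance NextRunTime.
                }
            }
        }
    }
}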

 

Sometimes from miserable circumstances comes inspiration.


May 20 2012

SQL Server 2012 -- Full-Text Search (Matching Engine)

Category: Administrator @ 06:48

Having been immersed in a Full-Text Search (FTS) proof-of-concept over the past two months, I thought others would benefit from this experience.  First, let me start by saying that the way this came about was rather unusual.  A co-worker of mine had started to play with FTS and turned around one afternoon and asked me for some T-SQL help.  When I saw the language constructs of FTS that he was using, I had what I call a h$!y s^!t moment.  The last time I looked at anything close was SoundEx, which was a mess for my purposes.  I quickly saw how I could make great use of this technology.  And I'm only scraping the surface with what I'm doing, but check out the video link below for more on what FTS is really meant to be used for (documents):

http://channel9.msdn.com/Events/TechDays/Techdays-2012-the-Netherlands/2297

 

In my day job we do lots and lots of matching.  By that I mean we get files containing a person's name and the name of a piece of copyrighted material.  Internally, we have a database of people's names and their related pieces of copyrighted material.  What we need to do is take the incoming data and find the match in our database.  Sounds easy, right?  Well, not with our current technology.

My thought was to take FTS and use it on the database of data we currently have and then take the incoming data and try to find the closest match.

If you deploy this on SQL 2008 R2 you'll not benefit from the changes in 2012, which increased performance dramatically.  And performance is the key to this whole matching process.

In my matching tables, I have over 44 million rows to search, bring back results and score the results.  I was able to attain just over four attempted matches per second.  Not bad considering that I'm running the Developer edition on my local workstation with just two crappy SATA drives.  And don't worry about space consumption since for my test case of 44 million rows, I used just under one gig for the FT index after the population of the FT index completed.

Let's start:

First you'll need to install FTS, which is no big deal, and I'll leave that to you.

 

Now the important steps:

  • Create a set of FileGroups (one for each table you'll be searching/indexing).  The best-case scenario is to have many disks and to keep the searched tables' clustered indexes on separate disks from the FT index.
-- Add FILEGROUP(s)
ALTER DATABASE Dmatch ADD FILEGROUP fgDmatchX;
GO
ALTER DATABASE Dmatch ADD FILEGROUP fgDmatchY;
GO

ALTER DATABASE Dmatch 
ADD FILE 
( NAME = DmatchdatX,
  FILENAME = 'X:\MSSQL\data\DmatchdatX.ndf',
  SIZE = 20000MB,
  MAXSIZE = 28000MB,
  FILEGROWTH = 1000MB)
TO FILEGROUP fgDmatchX

go

ALTER DATABASE Dmatch 
ADD FILE 
( NAME = DmatchdatY,
  FILENAME = 'Y:\MSSQL\data\DmatchdatY.ndf',
  SIZE = 20000MB,
  MAXSIZE = 28000MB,
  FILEGROWTH = 1000MB)
TO FILEGROUP fgDmatchY

go

--ALTER DATABASE Dmatch 
--REMOVE FILE DmatchdatX

--ALTER DATABASE Dmatch 
--REMOVE FILE DmatchdatY

 

  • Create a FTS Catalog:
USE Dmatch;
GO

--DROP FULLTEXT CATALOG DmatchWrkPtyWrtCatalog
CREATE FULLTEXT CATALOG DmatchWrkPtyWrtCatalog WITH ACCENT_SENSITIVITY = OFF AS DEFAULT ;
GO
USE Dmatch;
GO

--DROP FULLTEXT CATALOG DmatchWrkPtyPubCatalog
CREATE FULLTEXT CATALOG DmatchWrkPtyPubCatalog WITH ACCENT_SENSITIVITY = OFF AS DEFAULT ;
GO

 

  • Create a StopList -- for this example I'm not going to detail exactly what I added to and removed from the StopList, but I'll include the SQL to show how:

 

--drop fulltext stoplist Dmatch1StopList;
GO
Create FullText StopList Dmatch1StopList from System Stoplist;
GO

  • Modify the StopList to suit your needs:
alter fulltext stoplist Dmatch1StopList  drop 'about' language 1033;
alter fulltext stoplist Dmatch1StopList  add 'company' language 1033;
 
  • Create a Full-Text Index

 

--DROP FULLTEXT INDEX ON tblWrkPtyWriterSearch

--Create FTS index on work and Pty name

CREATE FULLTEXT INDEX ON tblWrkPtyWriterSearch(WrkNa,PtyNa) 
   KEY INDEX ix1WrkPtySearch on ([DmatchWrkPtyWrtCatalog], FILEGROUP [fgDmatchY])
   WITH STOPLIST = Dmatch1StopList;


--DROP FULLTEXT INDEX ON tblWrkPtyPublisherSearch

--Create FTS index on work and Pty name

CREATE FULLTEXT INDEX ON tblWrkPtyPublisherSearch(WrkNa,PtyNa) 
   KEY INDEX ix1WrkPtySearch on ([DmatchWrkPtyPubCatalog], FILEGROUP [fgDmatchX])
   WITH STOPLIST = Dmatch1StopList;
   
GO

 

 
  • Check the status of the population of the FT index:
-- Number of full-text indexed items currently in the full-text catalog 
-- plus the population status and size:

SELECT 'DmatchWrkPtyWrtCatalog'
      ,FULLTEXTCATALOGPROPERTY('DmatchWrkPtyWrtCatalog', 'PopulateStatus') AS [Populate Status]
      ,FULLTEXTCATALOGPROPERTY('DmatchWrkPtyWrtCatalog', 'ItemCount') AS [Item Count]
      ,FULLTEXTCATALOGPROPERTY('DmatchWrkPtyWrtCatalog', 'IndexSize') AS [Size in MB];

SELECT 'DmatchWrkPtyPubCatalog'
      ,FULLTEXTCATALOGPROPERTY('DmatchWrkPtyPubCatalog', 'PopulateStatus') AS [Populate Status]
      ,FULLTEXTCATALOGPROPERTY('DmatchWrkPtyPubCatalog', 'ItemCount') AS [Item Count]
      ,FULLTEXTCATALOGPROPERTY('DmatchWrkPtyPubCatalog', 'IndexSize') AS [Size in MB];

 

  • Create procs and functions to return possible matches:

What I did here was use CONTAINSTABLE in conjunction with other functions to return the top (x) matches.  I did the matching/scoring with the help of the Levenshtein algorithm.
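The procedures themselves will come in the next post, but the shape of the lookup is roughly this: CONTAINSTABLE returns the top (x) candidates by RANK, and the scoring step decides the final match.  A simplified sketch of calling it from C# (the join key WrkPtyId stands in for whatever column the KEY INDEX ix1WrkPtySearch is built on):

// Pull candidate matches with CONTAINSTABLE, then hand them to the scoring step.
using System.Data.SqlClient;

public static class MatchLookup
{
    public static void FindCandidates(string connectionString, string searchCondition)
    {
        const string sql = @"
            SELECT TOP (20) s.WrkNa, s.PtyNa, ft.[RANK]
            FROM CONTAINSTABLE(dbo.tblWrkPtyWriterSearch, (WrkNa, PtyNa), @search, 20) AS ft
            JOIN dbo.tblWrkPtyWriterSearch AS s ON s.WrkPtyId = ft.[KEY]
            ORDER BY ft.[RANK] DESC;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            // e.g. the condition might be:  "HENDRIX" AND "JIMI"  (built from the incoming name)
            cmd.Parameters.AddWithValue("@search", searchCondition);
            conn.Open();
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                while (rdr.Read())
                {
                    // Score each candidate (Levenshtein or similar) and keep the best.
                }
            }
        }
    }
}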

My overall match rate was just over 73% and of excellent quality, meaning that I didn't match to something that was not supposed to match.

I'll continue the next blog entry with the actual stored procedures and functions that really do the work.

Full-Text Search on SQL 2012 is a gift.

