Monday, November 20, 2017

Integration Computing Blog

An old Blog Post Site of Mine.

http://integrationcomputing.blogspot.com/

Enjoy!!

David Byrd

Wednesday, November 8, 2017

Powerhouse Integrations – Building Actian Integrations that work, so you don't have to!


By David Byrd

Recently I wrote an article entitled “Why Actian DataConnect Goes Head to Head with the Big Boys”.  I thought it might be beneficial to show a use case for why this is important. But first, my background. I was actually a customer of the original product when it was called Data Junction, first in 1993 at Certified Vacations and then in 1998 at Putnam Investments. The product did the job it was purchased for.

Then in 2003, I actually went to work for Data Junction when I moved back to Texas. Nine months later the company was purchased by Pervasive, where I worked for another three years. Since then I have worked at many of their clients, including QuickArrow/NetSuite, Toyota, ADP TotalSource, Leprechaun, Deltek, Firstcare, Keta Group, Adaptive Planning (now Adaptive Insights), and finally SFCG. In the meantime, the company has been acquired yet again, by a company called Actian.

As a developer and data integrator, there are many concepts that are important when designing a product. Some of these are repeatability, scalability, stability, reliability, and usability. I have two great examples where DataConnect meets this mark.

The first was a project I started at Deltek for a client company called Keta Group. I initially built about seven integrations that either fed data into Deltek Costpoint or pulled data out of Costpoint and sent it outbound to a company using Maximo. Ultimately the requirements were reined in, and it was decided to use five of the integrations I built. These were implemented in October 2008 and tweaked over the next three months, and only very minor changes have been made since then. The great thing about these integrations is that they have been running consistently and reliably for over six years without Keta needing me (or someone trained like me) to support them. This speaks highly of the product's stability.

Ralph Huybrechts, CFO of the Keta Group, LLC, had this to say, "We contracted with Deltek, the software supplier, who assigned David Byrd to write several Pervasive integrations between the prime and subcontractor’s accounting and timekeeping software and Maximo. David wrote, tested and finalized these integrations in a 45 day phase-in period prior to the start of our large base operations support contract with the Army. This contract requires 350 employees and handles 5,000 service orders per month. The integrations have performed flawlessly since the start of the contract in 2010. "

The second example I recently spoke about in another article, “Web-Services Best Practice: Using parallel queueing to streamline web-service data loads”. When I was working at Firstcare I designed an integration that would run Accumulator web-service messages, as well as others, that were stored in a database table. These messages were created by multiple integrations and fed into the table.  Here is the cool part: a single integration picks up these different types of messages and sets the connection parameters on the fly from data stored in the table with the message, such as the URL endpoint and the user and password credentials, then seamlessly processes the web-service call and stores the response in the table for later processing.  That is, this integration connects to multiple web-service endpoints without hardcoding the required parameters in the integration.  This speaks highly of the product's scalability. In fact, Sandeep Kangala, former EDI Consultant at Firstcare, confirmed this process is still in effect for Accumulators and running without issue. A simplified sketch of the idea follows.
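
To make this concrete, here is a simplified sketch of the message-table concept (hypothetical table and column names; the real production version appears in the "Parallel Messages Queues in SQL Server" post further down this page). Every queued message carries its own endpoint and credentials, so one integration can service many web services:

----------------------------------
-- Simplified sketch (hypothetical names): each queued message carries its
-- own connection parameters, so nothing is hardcoded in the integration.
CREATE TABLE dbo.WS_Message_Sketch (
       id       int IDENTITY(1,1) NOT NULL,
       MsgType  varchar(50)   NOT NULL,  -- e.g. 'ACCUMULATOR'
       Url      varchar(1024) NOT NULL,  -- endpoint stored per row at load time
       UserName varchar(100)  NULL,      -- credentials stored with the message
       Password varchar(100)  NULL,
       MsgReq   varchar(max)  NOT NULL,  -- the outbound request message
       MsgRsp   varchar(max)  NULL,      -- response saved for later processing
       Status   int           NOT NULL   -- e.g. 2 = ready, 8 = processed
)

-- The integration picks up the next ready message together with its
-- connection settings and sets those parameters on the fly:
SELECT TOP 1 id, MsgType, Url, UserName, Password, MsgReq
FROM   dbo.WS_Message_Sketch
WHERE  Status = 2
ORDER BY id
----------------------------------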

But it goes further: the same design concept was taken a step further at SFCG using Oracle CX endpoints. We initially built a similar integration to load Oracle Sales Cloud from CRM On Demand export data.  However, even with this, we ran into some issues with CRM On Demand attachments, especially large ones. The bulk export out of CRM On Demand provides all the small file attachments, but not the large ones. It did, though, provide their Attachment IDs, so we fed these IDs into the database as CRM Attachment Export requests in bulk, and each attachment was stored in the database response. So in this case, data was fed into a web service not only for loading but also to fetch data out of a web service, all through this single integration. The best part is that the integration does not care where the XML message came from or is going to; it just takes the message, connects, sends it, and then stores the response. This is the height of repeatability.

Chris Fuller-Wigg, Director of Sales Automated Services, stated, "The efficiency gains we experienced when loading data in parallel is kind of unreal, almost 10x faster than serial. We found ourselves losing a whole day for Accounts to load, only to push the button to load Contacts the following day. Cutting out the wait time and letting the system process multiple loads at once allows us to load data 1.5 weeks earlier on average."

Lawrence Chan, Sr Sales Automated Services Consultant, added, "The value of this solution is not only limited to the incredible improvements in data migration speed. With one click, we can have your system's data up to date the day before go-live. With proper planning in place, those late nights getting your data up to date will be a thing of the past."

The fact that Actian DataConnect fulfills the meaning of these terms delivers the ultimate customer experience. The customer here is two-fold: the developer, who can define and build a trusted, flexible integration, and the end customer, who gets the data to work the way they want. This is a win for Actian, a win for the developer, and a win for the end customer.


Web-Services Best Practice: Using Parallel queueing to streamline Web-service data loads

Upgrading to DataConnect from Version 9 to Version 11 --- a better journey.

This was published on the Actian website: https://www.actian.com/company/blog/upgrading-actian-dataconnect-version-9-version-11-better-journey/

Upgrading to Actian DataConnect from Version 9 to Version 11 --- a better journey.


I posted an article about a year ago about upgrading from Data Integrator Version 9 to Version 10:  Actian DataConnect – The Conversion from v9 to v10 does not have to be scary!!
That article used the v9 or v10 Process object with a script step to do the magic.  I actually provided code that could be used.
Well, now there is the exciting new Actian DataConnect v11.
The first thing I did was look around within the tool.   It looks good, and somewhat intuitive, especially for previous v9 users.
The next thing I wanted to check out was the import tool.
So the first thing I noticed was that the File menu had an Import option:

Next it opens a wizard, and I choose to import a Version 9 Workspace.

Press Finish and it runs the migration.

Open the V11 workspace and choose what you want.

And I choose:

And it opened this:

And this was the original:

So that is the migration process from V9 to V11.  A much better customer experience than before.

Enjoy.

Coming soon: details on migrating from V10 to V11.

Blog articles are published at: http://sfcg.com/author/david-byrd/
Other articles: Byrd's Integration Blogs


Check out my other articles:
Actian DataConnect - The Conversion from v9 to v10 does not have to be scary!!
Actian DataConnect Workaround - Working with GMAIL thru the Email Invoker in EZScript
Actian DataConnect Best Practices: Clean up obsolete artifacts before you bring your server down!
Actian DataConnect - Three Reasons Using Actian EZScript Code for sending Emails Should Be On Your Radar

Boomi Integrations: Smart Start for Boomi Extensions for Integration Connections


Boomi Integrations: Extensions for Connections
by David Byrd

As a data integrator, you have spent time putting your Boomi data integration together, and now it is time to move it from your Development environment to a Test environment, and then eventually into Production.  Extensions are how you connect to an on-premise database or a cloud app like Oracle Sales Cloud/Oracle CX.
Step 1: The first thing to do is set up the Extensions for the connections.  In the Build tab, open the process, then click on the extensions pop-up in the process.

You should now see a pop-up for the Extensions, like below:




Notice that we default to the “Connection Settings” tab.  This article is focused on just the Connection Settings.

Now if you click the pull-down for Connections, you see all the connections available for this process.  Select one and it shows you all the settings you can modify at a later time.  If you want to be able to change a setting, make sure to check the checkbox in front of each setting that should be modifiable.
Now the Process is ready to deploy.  That will be covered in another article.

Step 2: Set the extensions for the connections for the Environment.  Click on the Manage tab.



Next, choose your environment to set.

Then click on Environment Extensions as shown below:

This opens a pop-up:

Click on the pull-down and choose the Connection for which you want to set up extensions.

You can then set each of the settings that you chose above, or check “Use Default” to use the original setting defined in development.

As you can see, the steps are easy and straightforward, giving the data integrator a nice customer experience.

And that does it.  Watch for new articles on the other types of Extensions used in a Boomi Process, and on how to deploy a process.

Blog articles are published at: http://sfcg.com/author/david-byrd/

Other articles: Byrd's Integration Blogs


Friday, March 10, 2017

Defining a Web-services Parallel processing Controller

The following presentation demonstrates a preferred solution I architected at a former healthcare company.

The problem: too many web services processing and connecting in an unorganized manner without any controls.   Multiple approaches to loading caused the web-services API layer to become over-burdened.

The solution: build a web-services parallel load controller.

Please review the video.

The last two screens define costs using a real-time queue through DataConnect.

We did not do that.  Instead, we used a SQL Server database to house a message controller, and then built processes in DataConnect executed by a stand-alone engine (not Integration Manager).   Then Windows Task Scheduler was used to run "X" number of loader jobs or queues. A sketch of the idea follows; the full SQL appears in the posts below.
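
Here is a minimal sketch of the two moving parts (a hypothetical simplification; the full implementation is in the SQL posts further down the page): messages get a queue number by modding their ID, and each scheduled loader job claims only its own queue.

----------------------------------
-- Sketch only; see the 'Parallel Messages Queues in SQL Server' post
-- below for the full production version.

-- Assign each message to one of 10 queues by modding its id:
UPDATE dbo.WS_Messages_Queued
SET    Queue = id % 10

-- Each loader job (one per queue, launched by the scheduler) claims
-- only the messages in its own queue:
SELECT TOP 250 *
FROM   dbo.WS_Messages_Queued WITH (NOLOCK)
WHERE  Queue = 0           -- this worker's queue number
  AND  Status = 2          -- 2 = ready to send
ORDER BY Priority, id
----------------------------------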










XML Parsing in SQL Server

COOL SQLServer code of the week
======================================


Hey Integration Fans....

Cool code of the week... Using Pervasive, I was trying to parse an XML file like the sample below. It had 125,000 Request segments and was taking 45 minutes to process. I changed to a delimited text source, like I had done at FirstCare, and did my own parsing, which brought the time down by 10%.

Then I tried the SQL code below to parse the XML inside of SQL Server... and that brought the parsing down to 125 seconds in two passes (one for each request type). Note the savings: from 2,700 seconds to 125, a reduction to 4.6% of the original time.

Enjoy! This could be really helpful.

David


Sample XML showing schema....

<ETLWebServiceRequests>
<Request RequestType="CONSUMERTAG">
<assign-consumer-tag>
<tag-short-name>RpLexNex</tag-short-name>
<consumer-agency-identifier>1379376</consumer-agency-identifier>
</assign-consumer-tag>
</Request>
<Request RequestType="AREVENT">
<save-arevent-with-shortnames>
<consumer-agency-identifier>1601904</consumer-agency-identifier>
<action-code-shortname>CNSMSCR</action-code-shortname>
<result-code-shortname>SKPINFO</result-code-shortname>
<message-text>Received New score information from Skiptrace Lexis Nexis</message-text>
</save-arevent-with-shortnames>
</Request>
</ETLWebServiceRequests>


And here is the SQL:
----------------------------------

DECLARE @XML XML;
SELECT @XML = CAST((SELECT * FROM OPENROWSET (BULK '\\wasvpdb005\FileImport\LN_PostProcess\LN_Tag.xml' , SINGLE_BLOB) AS x) AS XML);
SELECT RequestNodes,
CAST( RequestNodes AS NVARCHAR(4000)) RequestNodeTxt,
(
SELECT T.c.value('.','varchar(8)')
FROM RequestNodes.nodes('/assign-consumer-tag/tag-short-name[1]') T(c)
) tag_short_name,
(
SELECT T.c.value('.','int')
FROM RequestNodes.nodes('/assign-consumer-tag/consumer-agency-identifier[1]') T(c)
) consumer_agency_identifier
FROM
(
SELECT T.c.query('.') AS RequestNodes
--, T.c.value('../@RequestType','varchar(50)') AS result
FROM @XML.nodes('/ETLWebServiceRequests/Request/*') T(c)
) RequestNodes

SELECT RequestNodes,
CAST( RequestNodes AS NVARCHAR(4000)) RequestNodeTxt,
(
SELECT T.c.value('.','varchar(8)')
FROM RequestNodes.nodes('/save-arevent-with-shortnames/action-code-shortname[1]') T(c)
) action_code_shortname,
(
SELECT T.c.value('.','varchar(4000)')
FROM RequestNodes.nodes('/save-arevent-with-shortnames/message-text[1]') T(c)
) message_text,
(
SELECT T.c.value('.','varchar(8)')
FROM RequestNodes.nodes('/save-arevent-with-shortnames/result-code-shortname[1]') T(c)
) result_code_shortname,
(
SELECT T.c.value('.','int')
FROM RequestNodes.nodes('/save-arevent-with-shortnames/consumer-agency-identifier[1]') T(c)
) consumer_agency_identifier
FROM
(
SELECT T.c.query('.') AS RequestNodes
--, T.c.value('../@RequestType','varchar(50)') AS result
FROM @XML.nodes('/ETLWebServiceRequests/Request/*') T(c)
) RequestNodes

Submitting a Stored Proc Asynchronously


COOL CODE of the week :
    Careful how you use this.   It is powerful: it allows you to start a stored proc (2) from within a stored proc (1) and continue on in the calling proc (1) even while the called proc (2) is still running.   I am thinking you could build one master stored proc to run the Atlas Queues.     Note: the very bottom of the calling proc (1) should have a monitoring piece that watches for the status of the queued procs to be completed (and yes, the called proc (2) would have to write to a table to say it was done).    When all this occurs, the calling proc (1) can finish. A sketch follows the link below.

http://www.databasejournal.com/features/mssql/article.php/3427581/Submitting-AStored-Procedure-Asynchronously.htm
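
In case that link goes stale, here is a minimal sketch of the technique (hypothetical proc and table names), using the same msdb job procs that appear in the queue-loader post further down the page:

----------------------------------
-- Minimal sketch (hypothetical names): proc(1) queues proc(2) as a
-- one-off SQL Agent job and continues immediately.
DECLARE @jobName sysname = N'Async_' + CONVERT(varchar(36), NEWID())

EXEC msdb.dbo.sp_add_job       @job_name = @jobName, @delete_level = 1  -- auto-delete on success
EXEC msdb.dbo.sp_add_jobstep   @job_name = @jobName, @step_name = N'Run',
                               @subsystem = N'TSQL',
                               @database_name = N'Work',
                               @command = N'EXEC dbo.usp_CalledProc'    -- proc(2): writes a "done" row to a status table when it finishes
EXEC msdb.dbo.sp_add_jobserver @job_name = @jobName, @server_name = N'(local)'
EXEC msdb.dbo.sp_start_job     @job_name = @jobName                     -- returns at once; proc(1) keeps going

-- The monitoring piece at the bottom of proc(1) polls the status table:
-- WHILE EXISTS (SELECT 1 FROM dbo.AsyncStatus WHERE Done = 0)
--     WAITFOR DELAY '00:00:05'
----------------------------------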


Using Webservices in Other Databases

And the Oracle Cloud equivalent: https://cloud.oracle.com/database

And MySQL: http://open-bi.blogspot.com/2012/11/call-restful-web-services-from-mysql.html

USING SQL to work with WEBSERVICES in SQL Server

COOL SQLServer code of the week - Part 2
USING SQL to work with WEBSERVICES in SQL Server
======================================
Note: the web service in this example does not give a response back.  Not sure what is wrong with the SQL.

Hey Integration Fans....


Below is some pretty powerful code that lets you execute a call to a web service from SQL Server directly.

[dbo].[usp_HTTPRequest] - This stored proc makes the connection to the web service. It should probably be enhanced to handle retries, timeouts, XL messages, and such. 

The parameters on the execution of this proc let you pass in the URL where the XML message is being sent, the actual XML message, and other potentially required information such as methodName, SoapAction, UserName, and Password. A sample call appears after the code below.


David 

CODE:
======================================

USE [Work]
GO
/****** Object: StoredProcedure [dbo].[usp_HTTPRequest] Script Date: 8/14/2014 9:48:17 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

ALTER proc [dbo].[usp_HTTPRequest] (
@URI varchar(2000) = '', 
@methodName varchar(50) = '',
@requestBody varchar(8000) = '',
@SoapAction varchar(255),
@UserName nvarchar(100), -- Domain\UserName or UserName
@Password nvarchar(100),
@responseText varchar(8000) output )
as
SET NOCOUNT ON
IF @methodName = ''
BEGIN
select FailPoint = 'Method Name must be set'
return
END
set @responseText = 'FAILED'
DECLARE @objectID int
DECLARE @hResult int
DECLARE @source varchar(255), @desc varchar(255)
EXEC @hResult = sp_OACreate 'MSXML2.ServerXMLHTTP', @objectID OUT
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'Create failed',
MethodName = @methodName
goto destroy
return
END
-- open the destination URI with Specified method
EXEC @hResult = sp_OAMethod @objectID, 'open', null, @methodName, @URI, 'false', @UserName, @Password
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'Open failed',
MethodName = @methodName
goto destroy
return
END
-- set request headers
EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, 'Content-Type', 'text/xml;charset=UTF-8'
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'SetRequestHeader failed',
MethodName = @methodName
goto destroy
return
END
-- set soap action
EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, 'SOAPAction', @SoapAction
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'SetRequestHeader failed',
MethodName = @methodName
goto destroy
return
END
declare @len int
set @len = len(@requestBody)
EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, 'Content-Length', @len
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'SetRequestHeader failed',
MethodName = @methodName
goto destroy
return
END
/*
-- if you have headers in a table called RequestHeader you can go through them with this
DECLARE @HeaderKey varchar(500), @HeaderValue varchar(500)
DECLARE RequestHeader CURSOR
LOCAL FAST_FORWARD
FOR
SELECT HeaderKey, HeaderValue
FROM RequestHeaders
WHERE Method = @methodName
OPEN RequestHeader
FETCH NEXT FROM RequestHeader
INTO @HeaderKey, @HeaderValue
WHILE @@FETCH_STATUS = 0
BEGIN
--select @HeaderKey, @HeaderValue, @methodName
EXEC @hResult = sp_OAMethod @objectID, 'setRequestHeader', null, @HeaderKey, @HeaderValue
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'SetRequestHeader failed',
MethodName = @methodName
goto destroy
return
END
FETCH NEXT FROM RequestHeader
INTO @HeaderKey, @HeaderValue
END
CLOSE RequestHeader
DEALLOCATE RequestHeader
*/
-- send the request
EXEC @hResult = sp_OAMethod @objectID, 'send', null, @requestBody
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'Send failed',
MethodName = @methodName
goto destroy
return
END
declare @statusText varchar(1000), @status varchar(1000)
-- Get status text
exec @hResult = sp_OAGetProperty @objectID, 'StatusText', @statusText out
exec @hResult = sp_OAGetProperty @objectID, 'Status', @status out
select @status, @statusText, @methodName
-- Get response text (capture the HRESULT so the error check below tests this call, not the earlier 'send')
exec @hResult = sp_OAGetProperty @objectID, 'responseText', @responseText out
IF @hResult <> 0
BEGIN
EXEC sp_OAGetErrorInfo @objectID, @source OUT, @desc OUT
SELECT hResult = convert(varbinary(4), @hResult),
source = @source,
description = @desc,
FailPoint = 'ResponseText failed',
MethodName = @methodName
goto destroy
return
END
destroy:
exec sp_OADestroy @objectID
SET NOCOUNT OFF
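
For reference, here is a sample call to the proc above (the URL, message body, and credentials are placeholders):

----------------------------------
-- Sample call (placeholder endpoint and message); the reply lands in @responseText
DECLARE @responseText varchar(8000)
EXEC [dbo].[usp_HTTPRequest]
     @URI          = 'http://example.com/titanium/webservice',   -- where the XML is sent
     @methodName   = 'POST',
     @requestBody  = '<soapenv:Envelope>...</soapenv:Envelope>', -- the actual XML message
     @SoapAction   = 'SaveAREventWithShortNames',
     @UserName     = '',
     @Password     = '',
     @responseText = @responseText OUTPUT
SELECT @responseText AS Response
----------------------------------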

Parallel Messages Queues in SQL Server


The following code allows you to build XML messages, drop them into a set of QUEUES, and have SQL Server push them through as quickly as they can go in parallel queues.

The following Insert statement loads a table with XML messages to be processed by a webservice.
It contains the URL for the message to be sent to.
It contains the MsgReq that is the message to be sent.
It contains the Status, priority, and some Application IDs to enable keeping track of the processed messages.
It contains a QUEUE.  This is assigned in the example below by modding the ID by the value of 10, which assigns an integer QUEUE value between 0 and 9.  This value of 10 can be increased to a max of maybe 200 queues; it really depends on the server.

insert into [dbo].[WS_Messages_Queued]
       ( Queue , Priority , Get_Put_Ind , Url , MsgRsp , EnvGrp , Status , External_FICO_ID , External_FICO_ID_Description, MsgReq)
       select    cast(acct_id as int) % 10      as     Queue                       
              , 1                               as     Priority                    
              , 'P'                             as     Get_Put_Ind                 
              , 'http://' + @SRVR +':8080/CRSTitaniumWebServices/wsservices/tagAssociationService'       as     Url 
              , 'Not Processed Yet'             as     MsgRsp                      
              , @SRVR                           as     EnvGrp                      
              , 2                               as     Status                      
              , acct_id                         as     External_FICO_ID            
                           , 'CONSUMERACCOUNTTAG-agency-consumer-account-id---' + @in_fname                   as     External_FICO_ID_Description
                           ,  cast ('<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:v1="http://www.crsoftwareinc.com/xml/ns/titanium/tag/tagAssociationService/v1_0">'
              + '    <soapenv:Header/>'
              + '    <soapenv:Body>'
              + '     <assign-consumer-account-tag>'
              + '       <tag-short-name>'
              +           case when rectype = 'CPEN' then 'PENDRCL'
                               when rectype = 'CFIN' then 'REQCLTRQ'
                               Else 'Error' end 
              + '</tag-short-name>'
              + '       <agency-consumer-account-id>' + cast([acct_id] as varchar) + '</agency-consumer-account-id>'
              + '     </assign-consumer-account-tag>'
              + '     </soapenv:Body>'
              + ' </soapenv:Envelope>' as text )       as     MsgReq
                           from #temp4
                           where rectype in ( 'CPEN' , 'CFIN'  )


This is the description of the Queued Message Table.
CREATE TABLE [dbo].[WS_Messages_Queued](
       [id] [int] IDENTITY(1,1) NOT NULL,
       [Queue] [int] NULL,
       [Priority] [int] NULL,
       [Get_Put_Ind] [char](1) NULL,
       [Url] [varchar](1024) NOT NULL,
       [MsgReq] [text] NOT NULL,
       [MsgRsp] [text] NULL,
       [EnvGrp] [varchar](32) NOT NULL,
       [SubInd] [int] NULL,
       [Status] [int] NOT NULL,
       [LoadedOn] [datetime] NOT NULL,
       [StartedOn] [datetime] NULL,
       [CompletedOn] [datetime] NULL,
       [External_ID] [varchar](500) NULL,
       [External_FICO_ID] [varchar](500) NULL,
       [External_FICO_ID_Description] [varchar](500) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

SET ANSI_PADDING OFF
GO

ALTER TABLE [dbo].[WS_Messages_Queued] ADD  CONSTRAINT [DF_WS_Messages_Queued_LoadedOn]  DEFAULT (getdate()) FOR [LoadedOn]
GO





This is the description of the Processed Message Table.


CREATE TABLE [dbo].[WS_Messages_Processed](
       [id] [int] NOT NULL,
       [Queue] [int] NULL,
       [Priority] [int] NULL,
       [Get_Put_Ind] [char](1) NULL,
       [Url] [varchar](1024) NOT NULL,
       [MsgReq] [text] NOT NULL,
       [MsgRsp] [text] NULL,
       [EnvGrp] [varchar](32) NOT NULL,
       [SubInd] [int] NOT NULL,
       [Status] [int] NOT NULL,
       [LoadedOn] [datetime] NOT NULL,
       [StartedOn] [datetime] NOT NULL,
       [CompletedOn] [datetime] NULL,
       [External_ID] [varchar](500) NULL,
       [External_FICO_ID] [varchar](500) NULL,
       [External_FICO_ID_Description] [varchar](500) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

This is the stored proc that sends the messages to the web service.  It calls Work.[dbo].[usp_HTTPRequest] from "COOL SQLServer code of the week - Part 2 -- USING SQL to work with WEBSERVICES in SQL Server" above.

ALTER PROC [dbo].[sp_WS_Queued_MultiMessageSubmitter]
 (
      @in_Queue int,     
      @in_EnvGrp varchar(32)
 )
AS
  BEGIN
  SET nocount  ON
    DECLARE  @imax INT,
             @i    INT
    DECLARE  @id                            VARCHAR(100),
                      @Queue                         VARCHAR(400),
                      @Priority                      VARCHAR(400),
                      @Get_Put_Ind                   VARCHAR(400),
                      @Url                           VARCHAR(400),
                      @MsgReq                        VARCHAR(max),
                      @MsgRsp                        VARCHAR(max),
                      @EnvGrp                        VARCHAR(400),
                      @SubInd                        VARCHAR(400),
                      @Status                        VARCHAR(400),
                      @LoadedOn                      datetime,
                      @StartedOn                     datetime,
                      @CompletedOn                   datetime,
                      @External_ID                   VARCHAR(400),
                      @External_FICO_ID              VARCHAR(400),
                      @External_FICO_ID_Description  VARCHAR(400),
                      @RowID              int
       declare @xmlOut varchar(MAX)
       Declare @RequestText as varchar(MAX)

--update [dbo].[WS_Messages_Queued] set queue = 1


IF OBJECT_ID('tempdb..#QMsgList') IS NOT NULL drop table #QMsgList
SELECT TOP 250 IDENTITY(INT,1,1) AS RowID, cast(id as int) as ID, Queue, Priority, Get_Put_Ind, Url, MsgReq, MsgRsp, EnvGrp, SubInd, Status, LoadedOn, StartedOn, CompletedOn, External_ID, External_FICO_ID, External_FICO_ID_Description
INTO #QMsgList
FROM [dbo].[WS_Messages_Queued] WITH (NOLOCK)  -- WITH is required; a bare NOLOCK is parsed as a table alias, not a hint
WHERE [Queue] = @in_Queue and EnvGrp = @in_EnvGrp and Status = 2
ORDER BY priority, id
    
    SET @imax = @@ROWCOUNT
    SET @i = 1
    
    WHILE (@i <= @imax)
      BEGIN

        SELECT @id                           = id                           ,
                     @Queue                        = Queue                        ,
               @Priority                     = Priority                     ,
               @Get_Put_Ind                  = Get_Put_Ind                  ,
               @Url                          = Url                          ,
               @MsgReq                       = MsgReq                       ,
               @MsgRsp                       = MsgRsp                       ,
               @EnvGrp                       = EnvGrp                       ,
               @SubInd                       = ''                           ,
               @Status                       = 4                            ,
               @LoadedOn                     = LoadedOn                     ,
               @StartedOn                    = getdate()                    ,
               @CompletedOn                  = CompletedOn                  ,
               @External_ID                  = External_ID                  ,
               @External_FICO_ID             = External_FICO_ID             ,
               @External_FICO_ID_Description = External_FICO_ID_Description
        FROM   #QMsgList
        WHERE  RowID = @i

        set     @StartedOn                    = getdate()        
        ------------------------------------------------------

        -- INSERT PROCESSING HERE

        ------------------------------------------------------

                                  set @RequestText= @MsgReq

                                  exec Work.[dbo].[usp_HTTPRequest]
                                   @Url,
                                  'POST',
                                   @RequestText,
                                  'SaveAREventWithShortNames',  -- note: the SOAPAction is hardcoded here; it could be stored per message like the URL
                                  '', '', @xmlOut out

                                  --select @xmlOut 
          
        insert into [dbo].[WS_Messages_Processed]
        (
                        [id]
                       ,[Queue]
                       ,[Priority]
                       ,[Get_Put_Ind]
                       ,[Url]
                       ,[MsgReq]
                       ,[MsgRsp]
                       ,[EnvGrp]
                       ,[SubInd]
                       ,[Status]
                       ,[LoadedOn]
                       ,[StartedOn]
                       ,[CompletedOn]
                       ,[External_ID]
                       ,[External_FICO_ID]
                       ,[External_FICO_ID_Description]
              )
              values
        (
                        @id
                       ,@Queue
                       ,@Priority
                       ,@Get_Put_Ind
                       ,@Url
                       ,@MsgReq
                       ,@xmlOut
                       ,@EnvGrp
                       ,@SubInd
                       ,8
                       ,@LoadedOn
                       ,@StartedOn
                       ,getdate()
                       ,@External_ID
                       ,@External_FICO_ID
                       ,@External_FICO_ID_Description
              )


        Delete from  [dbo].[WS_Messages_Queued]
              where  [id] = @id


 --       PRINT CONVERT(varchar,@i)+' Message Processed : ' + @MsgReq + ' RESPONSE:  ' + @xmlOut
        
        SET @i = @i + 1
      END -- WHILE
  END -- SPROC




Next you define a job for each QUEUE.

So start a new Job like this:
Fill in like this for each tab:

Tab 1 - General: the last digit of the job name is the Queue number you are defining
Tab 2 - Steps
Tab 3 - Alerts
Tab 4 - Notifications
Tab 5 - Schedules



Here is some sample SQL Code to build your own JOB.

BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object:  JobCategory [[Uncategorized (Local)]]]    Script Date: 10/30/2014 3:25:09 PM ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'WS_DM9_LOADER_Q0',
              @enabled=1,
              @notify_level_eventlog=0,
              @notify_level_email=2,
              @notify_level_netsend=0,
              @notify_level_page=0,
              @delete_level=0,
              @description=N'Webservice Queue 0.',
              @category_name=N'[Uncategorized (Local)]',
              @owner_login_name=N'WPI\dav_byr',
              @notify_email_operator_name=N'WPI_SQL_ALERT', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Submit Qx]    Script Date: 10/30/2014 3:25:09 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Submit Qx',
              @step_id=1,
              @cmdexec_success_code=0,
              @on_success_action=1,
              @on_success_step_id=0,
              @on_fail_action=2,
              @on_fail_step_id=0,
              @retry_attempts=0,
              @retry_interval=0,
              @os_run_priority=0, @subsystem=N'TSQL',
              @command=N'exec [dbo].[sp_WS_Queued_MultiMessageSubmitter]  0, wasvpap003',
              @database_name=N'WPI_WebServices',
              @flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'WS_DM_Loader_Schedule',
              @enabled=1,
              @freq_type=4,
              @freq_interval=1,
              @freq_subday_type=4,
              @freq_subday_interval=1,
              @freq_relative_interval=0,
              @freq_recurrence_factor=0,
              @active_start_date=20140826,
              @active_end_date=99991231,
              @active_start_time=0,
              @active_end_time=235959,
              @schedule_uid=N'ffde7f7d-b038-4ec2-87fe-9b68ae0223d0'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:

GO