
Scribe stopped working with CRM Online and won’t connect

This seems to be happening more and more to clients on CRM Online as Microsoft moves them from the LiveID platform to Office 365. You may see it as a billing change or just an authentication/logon change, but symptomatically what you may notice is that Scribe jobs either start failing or stop running. When you try to open a DTS file, it either hangs forever and never connects, or it attempts to connect and shows the failed screen with an option to fix it.

If the DTS just hangs and never connects, you will need to use the dtsedit.exe program to open each and every DTS file and edit the connection information. Open dtsedit.exe, which is found in your <root>\Scribe\ folder, then open a DTS file. Once it has loaded, click the “Data Providers” tab; there you will find the settings for your Dynamics CRM adapter. The columns are called “Connection Property” and “Connection Value”, and Connection Value is what you’ll need to update.

If you can open the DTS and get to the connection information while in the Workbench, then you can simply change the URL information there.

In either scenario you will need the following Dynamics CRM 2013 Connection Information:

Deployment: Online
Server URL:
Organization: yourorgname
Password: userpassword

I would also suggest upgrading to the Scribe CRM Adapter v5.5, as it has native support for CRM 2013. However, do not upgrade if you are running Dynamics CRM 2013 On-Premise.

If you need support for Microsoft Dynamics CRM 2013, McGladrey offers a full range of services from implementation and optimization to development and support. Contact our professionals for more information on our services at 855.437.7202 or

By: Josh Thompson – Minnesota Microsoft Dynamics CRM partner

eConnect Procedure Fails – String or Binary Data Would Be Truncated

Recently I was working on some customized GP integrations, specifically around contracts (the SVC tables). When I created the procedure in T-SQL it worked great, but once I created a job to do the same work on a daily basis, it started failing with the error “String or binary data would be truncated”. I knew what that means; the question was why. It turns out I had run into this once before when executing from within the query window in SQL Server. There is a column called CRUSERID in Great Plains that is defined as char(15), and my login name was longer than that. GP was using SUSER_NAME to get the logged-in user and pass that value to a table I wasn’t even directly accessing; in this case it was SVC00604.

SQL Profiler was no help, as the proc was called from a trigger and is encrypted within Great Plains. I ended up stepping through every line of the create-contract code and finally realized it was the user id that was causing problems. Mine happened to be domainname\jthompson, which exceeded the 15-character limit.

In the case of the SQL job, the SQL Server Agent account was being used. At this client it was NT AUTHORITY\SYSTEM, which was also too long. We changed the SQL Agent to log in with a domain account whose full name fit within 15 characters (best practice anyway), and that solved the problem! To be clear, the entire login, domain and backslash included, has to fit in 15 characters.
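A quick way to check this up front is a one-line T-SQL query; SUSER_NAME() is what GP uses per the above, and the 15-character limit comes from CRUSERID being char(15):

```sql
-- Check whether the current login fits GP's char(15) CRUSERID column
SELECT SUSER_NAME() AS login_name,
       LEN(SUSER_NAME()) AS login_length,
       CASE WHEN LEN(SUSER_NAME()) > 15
            THEN 'Too long: GP inserts will truncate'
            ELSE 'OK' END AS fits_cruserid;
```

Run it under the same account the job uses (for example in a test job step) to see what the SQL Server Agent will actually pass in.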

So if your code is working fine from your dev environment and starts failing with this error once it’s scheduled as a job in SQL server, you’ll know where to look! I hope this helps and saves you all the time I spent trying to debug this curious error.

Concatenated Parameters In A SQL Statement Are Empty

Here is one of those things that happened and once I saw it, I knew it was wrong but I spent way more time debugging than I should have.

Here’s the scenario:
A stored procedure was supposed to load some data from a staging table and create some contracts in GP. For some reason one of the values was being assigned improperly: the value for record 1 was being put into record 2, but all of the other values were correct. Upon further investigation, I noticed that the very first record was empty, its value was subsequently put (incorrectly) into record two, and that continued for all of the records.

Code looked something like this:

Declare @parm1 varchar(10),
        @parm2 varchar(10),
        @parm3 varchar(30)

SELECT @parm1 = column1
     , @parm2 = column2
     , @parm3 = @parm1 + char(13) + @parm2
FROM <StagingTable>

So there I was debugging the code, only to realize that the values assigned to the parameters were not really “available” until after the SELECT completed; within a single SELECT, one variable assignment is not guaranteed to see the value assigned to another variable in the same statement, and that was causing our problem. A simple change to @parm3 = column1 + char(13) + column2 fixed the problem and life was good again.
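A minimal repro of the behavior, assuming a one-row staging table (the table variable and column names here are made up for illustration):

```sql
DECLARE @parm1 varchar(10), @parm2 varchar(10), @parm3 varchar(30);
DECLARE @staging TABLE (column1 varchar(10), column2 varchar(10));
INSERT INTO @staging VALUES ('one', 'two');

-- Buggy version: @parm3 is built from @parm1/@parm2, whose assignments
-- happen in the same SELECT. The evaluation order is not guaranteed, so
-- @parm3 may be built from NULLs (or stale values on later rows).
SELECT @parm1 = column1,
       @parm2 = column2,
       @parm3 = @parm1 + CHAR(13) + @parm2
FROM @staging;
SELECT @parm3 AS buggy_result;   -- unreliable

-- Fixed version: reference the columns directly
SELECT @parm3 = column1 + CHAR(13) + column2
FROM @staging;
SELECT @parm3 AS fixed_result;
```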

Don’t you love inherited code?

So now the question is, how do I tag this so others can find it? My variables have values but concatenating failed? Leave me a comment about how you found this post and, if it helped, what might have made it easier to find. Then again, maybe nobody else has ever done this (or inherited code that did) and I’m all alone on this one. It wouldn’t be the first time.

How To Configure Database Mail with GMail sp_send_dbmail

I know there are a ton of posts out there on this, but for some reason it took me a few separate blog articles to get this working. My point here is to consolidate what I had to do to get my instance of SQL 2008 R2 working with Gmail SMTP so I could use the sp_send_dbmail proc, and hopefully accelerate your ability to use the feature. I needed this so I could work with a local install of SQL 2008 R2 and test some mail functionality for a client. The email server at my work wouldn’t allow me to relay, so Gmail seemed to be my only option.

Here are two of the best blog posts I used to get me pointed in the right direction:

This one is from 2005; it’s useful, but the settings I had to use were slightly different, and that article isn’t specific to configuring Database Mail. I’ll cover those changes later in this post.

This post from Pinal Dave was very helpful as well. I just needed to get the two pieces working together.

One of the most important things from Pinal Dave’s blog is running the sp_configure statements that enable Database Mail. Note that each change needs a RECONFIGURE to take effect:
sp_CONFIGURE 'show advanced', 1
RECONFIGURE
sp_CONFIGURE 'Database Mail XPs', 1
RECONFIGURE

So after I merged those two posts together, I was close…so very close. A couple of things I still had to do:
1. When configuring the Gmail account, make sure the following settings are configured properly:
    Reply e-mail: This will be set by Gmail anyway, so I don’t think it makes a difference what you put here.
    Server Name:
    Port Number: 587 (This is a change, as port 25 would not work for me)
    This server requires a secure connection (SSL): I had to have this checked or it would not work. You cannot configure TLS, but having this checked worked for me.
    Basic Authentication:
        User Name: your Gmail user name. Mine did not work when I included the as mentioned in the previous post.
        Password: Yep, your Gmail password. Nothing tricky here…
2. I had to make the profile I created the default profile. 
    To make the profile the default: Right Click “Database Mail” in Management Studio -> “Configure Database Mail” -> Click Next -> “Manage profile security” -> Click Next.
Now you should see the profiles that have been created. I changed the only profile I had to “Yes” under “Default Profile” and checked the “Public” box. Click Next and then Finish. Oddly enough, if you supply @profile_name you shouldn’t need a default profile specified; I was providing that parameter, spelled correctly, yet it gave me errors. According to BOL that shouldn’t happen, so perhaps it was a total coincidence.

To execute it and see if it works, you simply exec sp_send_dbmail, which is found in the msdb database.

The parameters are well documented in BOL but here are a couple of quick tips to get you started:
To get the results of a query to show up in the body of your email, you need to specify the @execute_query_database and @query parameters along with @attach_query_result_as_file = 0. To send the results as a .txt attachment instead, just change to @attach_query_result_as_file = 1.
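Putting those parameters together, a minimal call might look like this sketch (the profile name, recipient, and query are placeholders):

```sql
-- Send query results in the body of the email
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'GmailProfile',      -- the Database Mail profile created above
    @recipients = 'someone@example.com',
    @subject = 'Database list',
    @body = 'Query results below:',
    @execute_query_database = 'master',  -- database context for @query
    @query = 'SELECT name FROM sys.databases',
    @attach_query_result_as_file = 0;    -- 1 = send results as a .txt attachment
```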

You can also look up the status of your emails by querying one of the sysmail_ tables/views in the msdb database. If you capture the output parameter @mailitem_id from sp_send_dbmail, that id can be passed to any of those objects to get the specifics on a particular mail message. Note, however, that sent_status doesn’t get updated in real time. It may show “failed” or “retrying” if it has problems initially, but “sent” isn’t a given right away even when the message sends fine. This is good to know if you try to raise a message indicating failure or success. In my scenario, I could only see “unsent” as a value right away, but a few minutes later it would send. Adjust your approach based on your needs, but bear in mind the successful send status isn’t available immediately after calling the stored procedure. Also, I’ve only ever seen four values in sent_status (“sent”, “unsent”, “failed”, “retrying”). There may be others…
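Capturing the id and checking the status can be sketched like this, using the sysmail_allitems view (profile name and recipient are placeholders):

```sql
DECLARE @id int;

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'GmailProfile',
    @recipients = 'someone@example.com',
    @subject = 'Status check test',
    @body = 'Hello',
    @mailitem_id = @id OUTPUT;           -- capture the mail item id

-- sent_status lags behind reality; re-run this after a few minutes
SELECT mailitem_id, sent_status, send_request_date, sent_date
FROM msdb.dbo.sysmail_allitems
WHERE mailitem_id = @id;
```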




I hope this helps you accelerate getting your database mail configured with gmail!

-Josh Thompson

Our McGladrey team blog is live!

We’ve got our Minneapolis-based McGladrey Dynamics CRM team blog up and running! You’re probably asking yourself, “What am I going to find there, and how is it different from this blog?”. Well, that’s a great question! Luckily, the answer is pretty simple. On this blog, all posts are made only by me and don’t necessarily reflect the opinions of my employer. It will stay technical in nature, but I will also post some things here that may blur (or obliterate) the line of “Microsoft Supported”; I will not post those on our McGladrey team blog. Our McGladrey team blog will have posts from all of our Dynamics CRM consultants! There is a lot more expertise and experience there in areas I may not have had exposure to, so you’ll want to see what the rest of my co-workers have to offer!

I hope you take a moment to check out our team blog and leave comments or ask questions! Here is the link to the team blog:

I hope to see you over there too!

Find and Update the Default Organization For a User in CRM 2011

Note: This is not a Microsoft recommended approach as it directly updates the database and is therefore not supported.
The script below has been floating around for a few months in various incarnations. I cleaned it up and made it a little more readable and reusable (at least that was my goal); I don’t take credit for finding this, only for modifying the script to be a little more user friendly. (WordPress modifies the text, so when you paste this the ' may show up as ` and need to be replaced. Once I figure out how to fix that, I will. Also, the double hyphen for comments comes out as a single hyphen, hence the /* */ around all single-line comments.)

/*Find the default organization for a user*/
DECLARE @DomainName nvarchar(255)
    ,@OrgCRMDBName varchar(255)
    ,@SystemUserId uniqueidentifier
    ,@SQLString nvarchar(500)
    ,@ParmDefinition nvarchar(500)

/*Configure the @DomainName and @OrgCRMDBName values*/
SET @DomainName   = '<domain>\<user>' /*Set this to the domain\user you want to find*/
SET @OrgCRMDBName = '<ORG>_MSCRM' /*Set this to the ORG_MSCRM DB name*/

/*Do not change these*/
SET @SQLString = N'SELECT @SystemUserIdOUT = SystemUserId FROM ' + @OrgCRMDBName + '..SystemUserBase WHERE DomainName = ''' + @DomainName + '''';
SET @ParmDefinition = N'@SystemUserIdOUT uniqueidentifier OUTPUT';

EXEC sp_executesql @SQLString, @ParmDefinition, @SystemUserIdOUT=@SystemUserId OUTPUT;

/*Identify the DefaultOrganizationId and settings for a given user*/
SELECT * FROM MSCRM_CONFIG..SystemUser
WHERE Id IN ( SELECT UserId
              FROM MSCRM_CONFIG..SystemUserOrganizations
              WHERE CrmUserId = @SystemUserId )

/*Find the organizations that have been created*/
SELECT Id, UniqueName, DatabaseName FROM MSCRM_CONFIG..Organization

/*Update the default organization for a given user*/
UPDATE MSCRM_CONFIG..SystemUser SET DefaultOrganizationId = '<NewOrgId>' WHERE Id = '<UserId>'

If you are not comfortable doing this directly to the database, Mohammad Atif has another method that will also work and is described here

MSCRMAsyncService Errors in the Event Log

Continuing from my earlier post, I’m going to keep adding various performance tips and tricks as they relate to CRM 2011. I’ve spent the last 8 or 9 days working with a great engineer from Microsoft Support, and I’ve taken away some good information that I’ll pass along.

One of my clients has been experiencing the MSCRMPlatform warning in their event viewer for quite a while. (Click the link to see my earlier post).  In addition to that warning, they had an error that would crop up somewhat randomly. This error was logged to the event viewer as:
“A Microsoft Dynamics CRM Asynchronous Processing Service operation of type 12 has been suspended”.
We thought it might be a workflow or job running, but disabling the jobs didn’t resolve the issue and disabling all of the workflows wasn’t an option. The error itself was hard to trap and debug, so we were left with the notion that the warnings and error might somehow be related.

Inside the event log message for the warnings were a few queries (Update, Select and Insert), all of which went against the AsyncOperationBase table. There is at least one hotfix/KB article surrounding this table, KB968520. That helped, but it didn’t fix the problem.

After trying a number of different fixes, I decided to run the queries through the SQL Server Database Engine Tuning Advisor (DTA) and through the Estimated Execution Plan. What I got back amazed me: the execution plan recommended two different indexes, one expected to provide upwards of a 98% improvement in performance and the second a 28% improvement, while the DTA recommendation was expected to provide a 14% improvement. We chose to use the index recommendation from the DTA but ignored its CREATE STATISTICS recommendations (on the advice of the SQL engineer at Microsoft), and I implemented both indexes from the execution plan recommendation.

As it turns out, we went from having anywhere from 10-30 warnings over a 2-5 second window to having NONE! Prior to the index changes the queries were exceeding the threshold by 1-70 seconds; now they all run in under 10 seconds. Even better, the MSCRMAsyncService errors went away at the same time.
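For illustration only, a missing-index recommendation from the execution plan translates into a statement shaped like the one below. The index name and column choices here are hypothetical; use the exact key and INCLUDE columns that your own plan or DTA output suggests for the AsyncOperationBase queries:

```sql
-- Hypothetical sketch of applying a missing-index recommendation;
-- substitute the columns from your own execution plan / DTA output.
CREATE NONCLUSTERED INDEX IX_AsyncOperationBase_StateStatus
ON dbo.AsyncOperationBase (StateCode, StatusCode)
INCLUDE (AsyncOperationId, OperationType);
```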

So, if you are seeing a bunch of warnings from MSCRMPlatform along with the MSCRMAsyncService errors described above, try looking at the indexes, or run your queries through the DTA and the execution plan tool Microsoft provides, and you might just have the same success we did.