Wednesday, October 17, 2012


This error was causing a lot of pain in a Salesforce integration that we have. At first glance it looks like a permissions issue, but that is misleading: even a System Administrator can get this error.

So after researching a lot, this was what I found:

The error is thrown when you try to insert/update something that logically cannot be inserted/updated.

Some examples:
  • You try to update a record that does not exist. Maybe the record was never there, or it was deleted.
  • You try to update a field that cannot be set explicitly. These fields can only be updated implicitly by the system, e.g. object owner, CreatedById, CreatedDate, LastActivityDate, LastModifiedById, LastModifiedDate.
  • You are trying to grant someone a permission that you yourself do not have.
  • You are trying to share "Record X" with "User Y" but you yourself do not have access to "Record X", or "User Y" already has access to it.
These are just a few reasons you can get the Salesforce error INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY; I am sure there are others.
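The sharing case is easy to reproduce against the Salesforce REST API. This is only a sketch: the instance name, session token, and record IDs are placeholders, and AccountShare is used as the example share object.

```shell
:: Sketch: share an Account with a user via the REST API (placeholders throughout).
:: If the calling user lacks access to the account, or the target user already
:: has access to it, the call fails with INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY.
curl -X POST "https://INSTANCE.salesforce.com/services/data/v25.0/sobjects/AccountShare" ^
  -H "Authorization: Bearer SESSION_ID" ^
  -H "Content-Type: application/json" ^
  -d "{\"AccountId\": \"001xxxxxxxxxxxx\", \"UserOrGroupId\": \"005xxxxxxxxxxxx\", \"AccountAccessLevel\": \"Read\", \"OpportunityAccessLevel\": \"Read\"}"
```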

Friday, May 25, 2012

Microsoft Text Driver on Windows 2008 R2

I had to set up the Microsoft Text Driver on a Windows 2008 R2 64-bit machine and was constantly getting errors. Here's what I had to go through to get it to work:

There are two different versions of the ODBC drivers on Windows 2008 R2: a 64-bit version and a 32-bit version. When you open the ODBC Data Source Administrator from "Administrative Tools > Data Sources (ODBC)", it's the 64-bit version that opens. I tried to create a DSN using a text driver but did not find the text driver in the list of drivers, so initially I thought the text drivers were not installed. But when I searched for the msjet40.dll file, I did find it in the C:\Windows\SysWOW64 directory.
This indicated to me that a 32-bit version of the driver was available, and a Microsoft KB article confirmed it.

So I found the 32-bit version of the ODBC Administrator. This is located at: C:\Windows\SysWOW64\Odbcad32.exe

Once I ran this executable, it opened the 32-bit ODBC Administrator interface, and it did have the Microsoft Text Driver listed. So I added my data source and changed its file extensions to use only *.csv.

Now that the datasource was created, I tried to use it. But it threw the following error:
"[ODBC Text Driver] Invalid setting in Excel key of the Engines section of the Windows Registry"
I was stumped. After much searching on the Internet, nothing seemed to work.
So as a last resort I deleted the data source and added a brand new one. This time I left the defaults, and it just worked. So I tried one more time and created the data source with the file extensions set to *.csv only. It again did not work and threw the same error. When I went back to edit the file extensions, I saw that *.csv had been mangled into garbage characters (*.%^767 or something like that). I tried different file extensions but none worked.
So finally, to get this to work, I had to leave the data source using *.* as the list of file extensions.
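For what it's worth, the extension list the 32-bit Text Driver honors lives in the Jet engine's registry settings, which is presumably where the mangling happened. A quick way to get at both the 32-bit ODBC Administrator and those settings on a 64-bit OS (Wow6432Node is the 32-bit registry view; DisabledExtensions is the value the Text ISAM reads):

```shell
:: Launch the 32-bit ODBC Administrator explicitly on a 64-bit OS
C:\Windows\SysWOW64\odbcad32.exe

:: Inspect the 32-bit Jet Text driver settings, including the extension list
reg query "HKLM\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Text" /v DisabledExtensions
```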

Installing ColdFusion MX7 on Windows 2008 R2 64bit

Trying to install ColdFusion MX7 on Windows 2008 R2 was giving me all sorts of errors. The server installed without any issues, but the IIS sites could not be configured to serve up .cfm pages.

There were a couple of things I had to do to get this to work.

Enable IIS 6 features on the Windows 2008 machine as follows:

  • Control Panel > Programs and Features > Turn Windows Features On or Off >
  • This will open the "Server Manager".
  • Expand Roles>Web Server>
  • On right panel, under "Summary>Role Services>" click on "Add Role Services".
  • In the "Add Role Services" window, select all options under "Management Tools>IIS 6 Management Compatibility"
  • Click on Install.
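The same IIS 6 compatibility role services can be enabled from an elevated command prompt instead of Server Manager. A sketch using DISM, which ships with Windows 2008 R2 (feature names as reported by `dism /online /get-features`):

```shell
:: Enable the IIS 6 Management Compatibility role services
dism /online /enable-feature /featurename:IIS-IIS6ManagementCompatibility
dism /online /enable-feature /featurename:IIS-Metabase
dism /online /enable-feature /featurename:IIS-WMICompatibility
dism /online /enable-feature /featurename:IIS-LegacyScripts
dism /online /enable-feature /featurename:IIS-LegacySnapIn
```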

Change AppPool for the Website to 32-bit

  • Open IIS Manager
  • Expand MachineName> Application Pools>
  • Right click on AppPool name (eg: DefaultAppPool) and select "Advanced Settings"
  • Change "Enable 32-bit Applications" to "true". Click OK
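The same app pool change can be scripted with appcmd (adjust the pool name if your site uses something other than DefaultAppPool):

```shell
:: Run the application pool as a 32-bit worker process
%windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /enable32BitAppOnWin64:true
```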

Configure Websites to Run ColdFusion pages

I ran the Web Server Configuration Tool, added a site, and got the following error message: "Version 7.5 is installed. Supported versions are 4.x, 5.x, 6.x". Basically, Windows 2008 R2 ships with IIS 7.5, and the Web Server Configuration Tool that comes with ColdFusion MX7 was created before IIS 7 existed, so it does not support it. The alternative was to configure the sites manually to run with ColdFusion. This is where I got some help from Sameer; his post on manual ColdFusion configuration was very helpful.

  • Open cfroot\runtime\servers\coldfusion\SERVER-INF\jrun.xml. Find "ProxyService" and change the value of deactivated to "false"
  • Restart ColdFusion
  • Create a file under cfroot\runtime\lib\wsconfig\
  • Edit it as follows:
  • Save the file

  • Extract files from wsconfig.jar into a temp directory
  • Copy jrunwin32.dll and jrun_iis6.dll from extracted files to cfroot\runtime\lib\wsconfig\.
  • Copy jrun_iis6_wildcard.dll from extracted files to cfroot\runtime\lib\wsconfig\1.

  • Create the file cfroot\runtime\lib\wsconfig\1\jrun_iis6_wildcard.ini. Edit it as follows:  

Note: the bootstrap port number for ColdFusion 9 is 51800; for ColdFusion 7 and 8 it is 51011; for ColdFusion 6 it is 51010.
  • Create the file in cf_root\runtime\lib\wsconfig\1. Edit as follows: 

Note: for MX 6.1; for MX 7/8; for MX9

  • Under the IIS Management Console, click on your computer name, click on "ISAPI and CGI Restrictions", and add two entries: one for cfroot\runtime\lib\wsconfig\1\jrun_iis6_wildcard.dll and the other for cfroot\runtime\lib\wsconfig\jrun_iis6.dll. Set the permission for both to "Allowed". You can give them whatever description you want; something like "Macromedia Server Extensions"

  • Click on your computer name, click on "Handler Mappings", and add one "Wildcard Script Map" with the following parameters:
    Request path:  *
    Executable:  C:\CFusionMX7\runtime\lib\wsconfig\1\jrun_iis6_wildcard.dll
  • After adding the wildcard script map, add a script handler for *.cfm files with the following parameters:
    Request path: *.cfm
  • Similarly, add a handler for *.cfc files
  • Just to be sure, restart ColdFusion as well as IIS. And you should be all set.
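The ISAPI restriction and wildcard mapping steps above can also be scripted with appcmd rather than clicked through in the console. A sketch, assuming ColdFusion is installed at C:\CFusionMX7:

```shell
:: Allow the two JRun ISAPI extensions
%windir%\system32\inetsrv\appcmd set config /section:isapiCgiRestriction ^
  /+"[path='C:\CFusionMX7\runtime\lib\wsconfig\1\jrun_iis6_wildcard.dll',allowed='True',description='Macromedia Server Extensions']"
%windir%\system32\inetsrv\appcmd set config /section:isapiCgiRestriction ^
  /+"[path='C:\CFusionMX7\runtime\lib\wsconfig\jrun_iis6.dll',allowed='True',description='Macromedia Server Extensions']"

:: Add the wildcard script map that hands every request to JRun
%windir%\system32\inetsrv\appcmd set config /section:handlers ^
  /+"[name='JRunWildcard',path='*',verb='*',modules='IsapiModule',scriptProcessor='C:\CFusionMX7\runtime\lib\wsconfig\1\jrun_iis6_wildcard.dll',requireAccess='None']"
```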

Tuesday, February 07, 2012

ColdFusion dotNet integration issues

After installing ColdFusion 8.1 on a new server, I was getting errors related to ColdFusion dotNet integration on pages that called .Net dlls.

It turned out that the server only had the .NET 4.0 framework installed, whereas the dll was built against .NET Framework 2.0. So I installed the 2.0 framework and restarted the ColdFusion .NET Integration Service. But this still did not resolve the issue.

I had to uninstall the "ColdFusion .NET Integration Service" and then re-install it. This resolved the issue for one of the dlls. So it looks like when the "ColdFusion .NET Integration Service" is installed, it only knows about the .NET frameworks that exist at that time; any frameworks installed later are not picked up automatically. So apparently you have to install the .NET frameworks before installing the ColdFusion .NET Integration Service.

Now after doing this I was still getting an error with a different dll. The error was:
"DotNetExtension not installed. - DotNetExtension not installed. If a dll is specified in the assembly list, DotNetExtension must also be installed."
It was surprising that one dll would work and another would not. When I compared the two dlls, I found that the one that was not working called an external web service over port 80.

When I re-installed the ColdFusion .NET Integration Service, I had installed it in a directory outside of the main CF install directory: ColdFusion was installed in c:/coldfusion/ while the .NET Integration Service was installed in c:/CFdotNetService/. Also, ColdFusion was running under a domain account while the .NET Integration Service was running under a local account.
So to me it looked like it could be some sort of permissions issue. And sure enough, when I gave the domain account (under which CF was running) permissions on the c:/CFdotNetService/ directory, the dll started working.
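Granting those permissions can be done from a command prompt with icacls. A sketch, with DOMAIN\cfaccount standing in for whatever domain account ColdFusion runs under:

```shell
:: Give the ColdFusion service account read+execute on the .NET service directory,
:: inherited by subfolders (CI) and files (OI), applied recursively (/T)
icacls C:\CFdotNetService /grant DOMAIN\cfaccount:(OI)(CI)RX /T
```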

Tuesday, January 03, 2012

ALTER TABLE Drop Column does not reduce space used - solution for partitioned tables

In SQL Server, when you drop a column you would expect the space used by that column to be freed. But that is not the case: ALTER TABLE DROP COLUMN is essentially a metadata operation. SQL Server marks the column as dropped, but at the physical level no data is removed.

In fact, even after you drop a column, SQL Server continues to use space for it.

Now the solution to this is to do an index rebuild. If the column being dropped is a variable-length column, you can run DBCC CLEANTABLE. But for fixed-length columns, the only solution is a clustered index rebuild. Doing a clustered index rebuild on a huge table can take a very long time, and it needs additional space. Also, if you are not on the Enterprise edition of SQL Server, you cannot perform an ONLINE rebuild, so your table will be blocked for the duration of the rebuild.
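The two reclaim options can be sketched as follows via sqlcmd, against a hypothetical table dbo.BigTable with clustered index PK_BigTable:

```shell
:: Reclaim space left by dropped variable-length columns (0 = process all rows in one go)
sqlcmd -S . -d MyDb -Q "DBCC CLEANTABLE ('MyDb', 'dbo.BigTable', 0);"

:: Fixed-length columns: rebuild the clustered index (ONLINE = ON requires Enterprise edition)
sqlcmd -S . -d MyDb -Q "ALTER INDEX PK_BigTable ON dbo.BigTable REBUILD WITH (ONLINE = ON);"
```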

Solution if your table is partitioned and you don't want to rebuild the entire clustered index:
1. If your table is already partitioned, you can simply do a clustered index rebuild on the current/future partitions. This ensures that rows going forward will not take up space for the dropped column.

Solution if your table is not partitioned and you don't want to rebuild the entire clustered index:
1. You can convert your table into a partitioned table, partitioning on the column that holds your clustered index. This only works if your clustered index is on an ever-increasing int/bigint/datetime column.
That is: create a staging table with the new partitioned structure; switch the existing table's data into the staging table; rename the existing table to something else; rename the staging table to the existing table's name.
2. Now you can rebuild the clustered index on just the latest partition (rather than the whole table).
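The partition-level rebuild can be sketched like this, again against a hypothetical dbo.BigTable with a hypothetical partition function pf_BigTable; the $PARTITION function resolves which partition the newest rows land in:

```shell
:: Find the partition number that currently receives new rows (assuming an increasing Id key)
sqlcmd -S . -d MyDb -Q "SELECT $PARTITION.pf_BigTable(MAX(Id)) FROM dbo.BigTable;"

:: Rebuild only that partition of the clustered index (substitute the number returned above)
sqlcmd -S . -d MyDb -Q "ALTER INDEX PK_BigTable ON dbo.BigTable REBUILD PARTITION = 5;"
```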

The above solutions are only worthwhile if you are working with a really huge table (billions of rows). For anything smaller, they are probably not worth the effort.