QuantumView API with AbleCommerce

Recently a client brought me a challenging project.  They wanted to fully automate the processing of tracking numbers pulled from the UPS QuantumView API data feeds.

The majority of the client’s shipments are drop-shipped directly from the manufacturer or distributor, who charges the shipment to my client’s UPS account number.  This is a technique known as 3rd-party billing.  It works well because my client can leverage all of their shipping volume with UPS instead of just the in-stock shipments.  And they know what to expect for shipping charges, since it’s their account being billed.

However, there is a downside to this shipping process.  Normally an integration like ShipStation or WorldShip would post tracking numbers back to AbleCommerce.  But with drop-ship orders, these tracking numbers must all be hand-entered on every shipment, which is a major hassle when your order volume is significant.

The client’s workaround was to create two feeds in their QuantumView account: one feed for in-stock shipments and the other for all 3rd-party (drop-ship) shipments.  Each day the client would download the feed, load it into Excel for readability, and then copy/paste the tracking number data.  A huge pain.

The design for this integration needed to meet the following requirements:

  • Download the feed files on a regular basis from the UPS QuantumView API
  • Locate the Origin record types and identify the shipment Id for each one
  • Post the associated tracking number to the shipment Id
  • Set aside any Origin records that could not be automatically processed
  • Provide a UI for manually processing Origin records which could not be automatically processed
  • Log a summary of each interaction with the QuantumView API for debugging purposes

The first step of the project was to dig into the UPS QuantumView API documentation.  The docs clearly show the API supports both JSON and XML payloads.  However, I was never able to get the JSON payload to work properly.  The problem came down to arrays/collections.  When a specific element in the API specification supported multiple items, I would naturally define it as an array, i.e. element[].  But when the element had only one entry, UPS would not send the element as an array with a single entry.  UPS would simply send a standard element definition.  This broke my class structure in C#: I couldn’t get JSON.Net to handle an element that could arrive as a single piece or as an array of pieces.
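In hindsight, the usual workaround for this is a custom JsonConverter that accepts either shape.  Here’s a rough sketch of the pattern (illustrative and untested against the actual UPS payloads):

using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Deserializes a property that may arrive as a single object or as an
// array of objects, always producing a List<T>.
public class SingleOrArrayConverter<T> : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(List<T>);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        JToken token = JToken.Load(reader);
        if (token.Type == JTokenType.Array)
            return token.ToObject<List<T>>(serializer);

        // A bare object was sent, so wrap it in a single-entry list.
        return new List<T> { token.ToObject<T>(serializer) };
    }

    public override bool CanWrite
    {
        get { return false; }
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }
}

Decorating the collection property with [JsonConverter(typeof(SingleOrArrayConverter<SomeElement>))] then handles both payload shapes.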

After several frustrating hours I was forced to give up and switch to the XML format of the API.  The documentation from UPS for the XML API was vastly more detailed.  It seems they added JSON as an afterthought.

I wasn’t looking forward to the prospect of hand-building a class that matched the XML structure of the API response.  Fortunately, I found an awesome trick involving a command-line tool that ships with Visual Studio (xsd.exe: run it once against the XML file to infer a schema, then again with the /classes switch to emit the matching C# classes).  Just hand the XML file to the tool and *poof*, it generates the necessary class and child classes to consume that XML layout.  Later on I found that feature actually exists right in Visual Studio: just copy the XML to your clipboard and then use Edit / Paste Special / Paste XML As Classes.  SWEET!

Once I had the XML importing into strongly typed classes, progress moved much faster.  I was able to leverage Hangfire to automatically poll the UPS API on a schedule controlled through an admin settings page.  With the automation handled, I moved on to processing the downloads.
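The Hangfire wiring itself is tiny.  A minimal sketch, assuming a hypothetical QuantumViewPoller class, with a hard-coded cron string standing in for the schedule read from the settings page:

using Hangfire;

// Somewhere in application startup: register (or update) the recurring job.
RecurringJob.AddOrUpdate(
    "quantumview-poll",               // stable job id
    () => QuantumViewPoller.Poll(),   // hypothetical polling method
    "0 */2 * * *");                   // placeholder: every two hours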

Testing the API proved a bit confusing because of one REALLY important detail: by default, the API endpoints only deliver the transactions UPS has received since the last time you asked for them.  Helpful in a production scenario, since you never have to worry about whether you’ve already received a specific Origin record.  But it makes testing your code a major source of stress.  Your first request comes back with data, and then suddenly you get nothing.  Over and over again, what you thought was working now appears to be broken.  The way around this is to add date-range criteria (up to 7 days in the past).  When a date range is included in the request, QuantumView responds with all transactions in that range, regardless of whether they’ve already been delivered.

Once I was pulling down data, the processing became simple.  The client was already having each manufacturer put the AbleCommerce Shipment Id in one of the UPS reference fields (UPS manifests support up to five ‘reference’ fields usable by the shipper for any purpose).  Having the Able Shipment Id in each Origin record makes it a snap to determine which Able shipment gets the tracking number.
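The matching pass then boils down to a few lines.  A sketch, with hypothetical names standing in for the real UPS feed and AbleCommerce lookup code:

// OriginRecord, TryPostTracking and SetAsideForManualReview are stand-ins,
// not the actual UPS or AbleCommerce types.
private void ProcessOrigins(IEnumerable<OriginRecord> origins)
{
    foreach (OriginRecord origin in origins)
    {
        int shipmentId;

        // The drop-shippers put the Able Shipment Id in a UPS reference field.
        if (int.TryParse(origin.ShipmentReference, out shipmentId)
            && TryPostTracking(shipmentId, origin.TrackingNumber))
            continue;

        // Anything that can't be matched is set aside for the manual-processing UI.
        SetAsideForManualReview(origin);
    }
}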

 

Leveraging the Gift Wrap feature for optional upcharges to a product

Just goes to show, you can still teach an old dog new tricks.

A client needed a way to add an Engraving choice to the product page.  Normally this wouldn’t be an issue: just use the product variant feature.

But in this case, the engraving choice must be optional.  And it has to carry an upcharge.  And it needs to accept some text for the actual engraved words.

So after a little digging, I settled on a little-used feature in AbleCommerce known as Gift Wrap.  Gift Wrap is a way to associate a secondary charge with any product, and the basket and checkout pages automatically know how to handle it.

By default, AbleCommerce handles Gift Wrap selection during checkout.  So the first modification was to get gift wrap to apply on the product page, which means the BuyProductDialog control is the place to start.  I used a checkbox to reveal a separate panel containing a text box control.  The contents of the text box are stored in the GiftMessage property available on each line item object.

Then it’s just a matter of setting the correct Id for the gift wrap on the line item before it’s added to the basket.
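Something like this (a sketch only: the property names are from memory, so verify them against the actual BasketItem class):

// Applies the optional engraving before the line item is added to the basket.
private void ApplyEngraving(BasketItem item, bool engrave, string engravingText, int engravingWrapId)
{
    if (!engrave)
        return;

    item.WrapStyleId = engravingWrapId;  // the gift wrap configured as "Engraving"
    item.GiftMessage = engravingText;    // the actual words to engrave
}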

The basket page needed some text changes to replace “Gift Wrap” with “Engraving”…super easy to do.

Checkout needed to be modified to skip the gift wrap page normally encountered in default AbleCommerce behavior.

At that point, I was done: the client can now easily associate an optional upcharge for Engraving with specific products in the store catalog.

Building an Audit Log with nHibernate Listeners in AbleCommerce

I have a client who can’t seem to figure out who’s making edits to certain products. Somebody changed the price, somebody marked it hidden. That sort of thing. Too many people in the back-end and no audit trail for who changed what.

So I decided to dig into nHibernate listeners. After a brutal all-nighter reading StackOverflow posts and copying snippets from a few blog posts, I actually got it working.

I added some new handlers for the Post-Commit-Create, Post-Commit-Update and Post-Commit-Delete listener events and pointed them to my AuditEventListener class. These are easily wired up in the DatabaseConfiguration.cs file.
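The wiring looks roughly like this (a sketch; cfg is the NHibernate Configuration instance that DatabaseConfiguration.cs builds):

using NHibernate.Event;

// Point the three post-commit events at the same listener instance.
var listener = new AuditEventListener();
cfg.EventListeners.PostCommitInsertEventListeners = new IPostInsertEventListener[] { listener };
cfg.EventListeners.PostCommitUpdateEventListeners = new IPostUpdateEventListener[] { listener };
cfg.EventListeners.PostCommitDeleteEventListeners = new IPostDeleteEventListener[] { listener };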

I then created a simple marker interface with no members called IEntityToAudit. This is used to flag which specific data classes I want to audit.
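The whole thing is one empty declaration:

// Marker interface: no members; it exists only so the listener can
// recognize entities that should be audited.
public interface IEntityToAudit
{
}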

In my AuditEventListener, I can tell if the class that fired the event is marked for audit by simply trying to cast it to the IEntityToAudit interface. If the cast comes back non-null, I know that class is flagged to be audited.

var entityToAudit = @event.Entity as IEntityToAudit;

if (entityToAudit != null)
{
    // the entity is flagged for auditing; log its changes here
}

Now it’s a simple matter of identifying the class being audited so I can record its name. And nHibernate makes it easy to tell which properties of the class are dirty.
 
This way the logging accomplishes two goals:

1. Only the properties that were actually changed get logged

2. Both the old value and the new value of each changed property are recorded
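Here’s a sketch of what that looks like in the update handler (assuming NHibernate has the previous snapshot available in OldState; the actual audit-table write is left as a comment):

public void OnPostUpdate(PostUpdateEvent @event)
{
    if (!(@event.Entity is IEntityToAudit))
        return;

    // Indexes of the properties whose values actually changed.
    int[] dirtyIndexes = @event.Persister.FindDirty(
        @event.State, @event.OldState, @event.Entity, @event.Session);

    foreach (int i in dirtyIndexes)
    {
        string propertyName = @event.Persister.PropertyNames[i];
        object oldValue = @event.OldState[i];
        object newValue = @event.State[i];
        // write the entity name plus propertyName/oldValue/newValue to the audit table
    }
}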

 
I built a new data class to store the log entries in a new table in SQL. That made it easy to create a custom report in admin to view the audit log. I’m also going to build out a new button on the edit-category and edit-product pages. This will permit the admin to quickly view the audit log for a given entity.
 
Entities that are deleted record both the entityId and the value of the Name property (if that property exists).
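A sketch of the delete handler, using reflection to pick up the optional Name property:

public void OnPostDelete(PostDeleteEvent @event)
{
    if (!(@event.Entity is IEntityToAudit))
        return;

    object entityId = @event.Id;

    // Grab the Name value via reflection when the entity has such a property.
    var nameProperty = @event.Entity.GetType().GetProperty("Name");
    string name = nameProperty != null
        ? nameProperty.GetValue(@event.Entity, null) as string
        : null;

    // write entityId and name to the audit table here
}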
 
To add logging to any entity in Able, I simply modify that <entity>.cs file and change one line of code to make it implement the IEntityToAudit interface. Everything else is handled automatically and outside of Able code, to minimize customization of existing files.
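For example, with a made-up entity (illustrative only; the real Able entity files are more involved):

// The only change is the added interface on the declaration line.
public class Product : IEntityToAudit
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}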
 
Overall, it was a fun learning experience into the depths of nHibernate and the AbleCommerce domain model implementation.

How to speed up site rebuild after compiling a DLL

Came across this little gem this morning.   Made a significant difference on my PC.

Add this to your web.config for MUCH faster compilation:

<!-- add the optimizeCompilations attribute to your existing compilation element -->
<compilation optimizeCompilations="true">

Quick summary (quoting the ASP.NET team’s announcement): we are introducing a new optimizeCompilations switch in ASP.NET that can greatly improve the compilation speed in some scenarios.  There are some catches, so read on for more details.  This switch is currently available as a QFE for 3.5SP1, and will be part of VS 2010.

The ASP.NET compilation system takes a very conservative approach which causes it to wipe out any previous work that it has done any time a ‘top level’ file changes. ‘Top level’ files include anything in bin and App_Code, as well as global.asax. While this works fine for small apps, it becomes nearly unusable for very large apps. E.g. a customer was running into a case where it was taking 10 minutes to refresh a page after making any change to a ‘bin’ assembly.

To ease the pain, we added an ‘optimized’ compilation mode which takes a much less conservative approach to recompilation.

Hosting Classic ASP on Server 2008 R2

Ran into a problem today trying to light up a Classic ASP site that was a copy of an existing site on the same server.  All we could get were HTTP 500 errors.  Once we enabled debug logging in ASP, we saw that the ADODB connection was failing to open the MS Access database stored within the site folders.

 

Gotta remember to enable 32-bit apps in the application pool advanced settings.

 

Note: Microsoft Access databases have been popular for many years with developers who use Active Server Pages (ASP) for small-scale applications.  However, Access databases are not designed for scalability, so they should only be used where performance is not a factor, and it is best not to host large-scale data-driven applications with them.

In IIS 7.0, IIS 7.5, and above, several security changes were made that may affect how classic ASP applications will function. For example, if you were to copy a classic ASP application that uses an Access database that is within the Web site’s content area to a server that uses IIS 7.0 or above, you may receive the following error message:

Microsoft JET Database Engine error ‘80004005’
Unspecified error.
/example.asp, line 100

This is a generic error triggered by the Access driver that may occur for a variety of reasons, but incorrect permissions is a common cause. More specifically, the ability to work with Microsoft Access databases is implemented through the Microsoft JET Database Engine, which creates various temporary and lock files when it connects to an Access database. The following sections will discuss some of the reasons why this may occur and how to resolve those situations.

Working with 64-bit Systems

Unfortunately there is no 64-bit version of the Microsoft JET Database Engine, so on 64-bit systems you will have to run your applications in 32-bit mode. To do so, use the following steps:

  1. On the taskbar, click Start, point to Administrative Tools, and then click Internet Information Services (IIS) Manager.
  2. In the Connections pane, click Application Pools.
  3. Highlight the application pool for your application, then click Advanced Settings… in the Actions pane.
  4. In the Advanced Settings dialog, specify True for Enable 32-Bit Applications.
  5. Click OK to close the Advanced Settings dialog.

TortoiseSVN missing icons in Windows 7

Been scratching my head for a while now as to why my Windows Explorer icons for TortoiseSVN suddenly went missing.

After some digging, it appears that Dropbox updated itself and added some more shell overlay extensions.  Since Windows only supports 15 icon overlay identifiers (several of which are claimed by Windows itself), this pushed the TortoiseSVN entries too far down the alphabetical list to be loaded.

The solution was simple: edit the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\ShellIconOverlayIdentifiers

and remove the unwanted entries so that the TortoiseSVN entries are back within the supported limit.

 

SQL 2012 Management Studio hangs clicking Files menu during database restore

I’ve had this nagging issue for months now.  Every time I go to restore a database from a backup file, I cannot click the Files page in the left sidebar of the restore dialog.  This didn’t happen in SQL Server Management Studio 2008, but it happens every time in SQL Server Management Studio 2012.

I enjoy using SSMS; it works well for my needs.  So it got very frustrating that it would lock up when I needed it most.

I finally took the time to research it and thought maybe a newer version had been released since I downloaded it.  So I installed the SQL Server 2012 Service Pack 2 update and, sure enough, it fixed the problem.

The download link for the Service Pack is http://www.microsoft.com/en-us/download/confirmation.aspx?id=43340

AbleCommerce Gold: How To Clean Up Anonymous Users

Even though Able Gold has a manual cleanup option in the Maintenance page, it doesn’t always work well.  The problem arises from how Able must delete each user individually.  Not such a big deal when you have 500 users to clear out.  

It’s a very different story when you have 2,000,000 unwanted users.   A SQL query can delete all of the unwanted records in a single command.

Below is the updated query, adjusted to work with the Able Gold schema.  Obviously, change the dates to something more recent.

If the queries do not remove as many records as you expected, you can also remove the “AND (AffiliateId IS NULL)” criteria, but only do so if you don’t care about affiliate reporting, since that will delete anonymous users tied to an affiliate as well.

USE <yourdbname>

DELETE FROM ac_Baskets
WHERE UserId IN (SELECT UserId FROM ac_Users
                 WHERE StoreId = 1 AND IsAnonymous = 1
                 AND (AffiliateId IS NULL)
                 AND (LastActivityDate IS NULL OR LastActivityDate < 'June 30, 2009'))

DELETE FROM ac_Wishlists
WHERE UserId IN (SELECT UserId FROM ac_Users
                 WHERE StoreId = 1 AND IsAnonymous = 1
                 AND (AffiliateId IS NULL)
                 AND (LastActivityDate IS NULL OR LastActivityDate < 'June 30, 2009'))

DELETE FROM ac_Users
WHERE StoreId = 1 AND IsAnonymous = 1
AND (AffiliateId IS NULL)
AND (LastActivityDate IS NULL OR LastActivityDate < 'June 30, 2009')

SQL Server Management Studio Slow to Expand Databases

So on my development PC, I’ve noticed that it takes 2-3 minutes for the Databases node to expand in SQL Server Management Studio 11.0.2100.60.

I don’t remember exactly when it started.   But it’s been several months I think.  It was just too easy to ignore since I could alt-tab and do something else while it worked through the delay.

Today I decided to research it and determined the cause: some databases have auto_close turned on.  This forces SQL Server to start each such database up before it can be rendered in the Databases node, which creates a significant delay when several databases are configured for auto_close.

For a quick way to fix all of them at once AND set the recovery model to simple (anything more is useless in my dev environment), use this query:

USE master

DECLARE @isql varchar(2000),
        @dbname varchar(64)

DECLARE c1 CURSOR FOR
    SELECT name FROM master..sysdatabases
    WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb')

OPEN c1
FETCH NEXT FROM c1 INTO @dbname

WHILE @@fetch_status <> -1
BEGIN
    -- turn off auto_close so SSMS no longer has to spin the database up
    SELECT @isql = 'ALTER DATABASE [@dbname] SET AUTO_CLOSE OFF'
    SELECT @isql = REPLACE(@isql, '@dbname', @dbname)
    PRINT @isql
    EXEC (@isql)

    -- simple recovery is plenty for a dev box
    SELECT @isql = 'ALTER DATABASE [@dbname] SET RECOVERY SIMPLE'
    SELECT @isql = REPLACE(@isql, '@dbname', @dbname)
    PRINT @isql
    EXEC (@isql)

    -- checkpoint so the log can clear under the new recovery model
    SELECT @isql = 'USE [@dbname] CHECKPOINT'
    SELECT @isql = REPLACE(@isql, '@dbname', @dbname)
    PRINT @isql
    EXEC (@isql)

    FETCH NEXT FROM c1 INTO @dbname
END

CLOSE c1
DEALLOCATE c1