QuantumView API with AbleCommerce

Recently a client requested a challenging project.  They wanted to fully automate the processing of tracking numbers pulled from the QuantumView API data feeds.

The majority of the client’s shipments are drop-shipped directly from the manufacturer or distributor, who bills the freight to my client’s UPS account number.   This is a technique known as third-party billing.  It works well because my client can leverage all of their shipping volume with UPS instead of just the in-stock shipments.   And they know what to expect for shipping charges, since it’s their account being billed.

However, there is a downside to this shipping process.  Normally an integration like ShipStation or WorldShip would post tracking numbers back to AbleCommerce.  With drop-ship orders, though, the tracking numbers must all be hand-entered on every shipment, which is a major hassle when your order volume is significant.

The client’s workaround was to create two feeds in their QuantumView account.  One feed for in-stock shipments and the other for all 3rd-party (dropship) shipments.  Each day the client would download the feed, load it into Excel for readability purposes and then copy/paste tracking number data.   A huge pain.

The design for this integration needed to meet the following requirements:

  • Download the feed files on a regular basis from the UPS QuantumView API
  • Locate the Origin record types and identify the shipment Id for each one
  • Post the associated tracking number to the shipment Id
  • Set aside any Origin records that could not be automatically processed
  • Provide a UI for manually processing Origin records which could not be automatically processed
  • Log a summary of each interaction with the QuantumView API for debugging purposes

The first step of the project was to dig into the UPS QuantumView API documentation.  The docs clearly show the API supports both JSON and XML payloads.  However, I was never able to get the JSON payload to work properly.  The problem came down to arrays/collections.  When a specific element in the API specification supported multiple items, I would naturally define it as an array, i.e. element[].  But when the element had only one entry, UPS would not send the element as an array with a single entry.  UPS would simply send a standard element definition.   This broke my class structure in C#: I couldn’t get JSON.Net to handle an element that could arrive either as a single item or as an array of items.
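For what it’s worth, JSON.Net can be taught to cope with this pattern via a custom JsonConverter.  Here’s a sketch; PackageInfo and Shipment are hypothetical stand-ins for the generated QuantumView types:

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Deserializes a JSON property that may arrive as either a single
// object or an array of objects into a List<T> either way.
public class SingleOrArrayConverter<T> : JsonConverter
{
    public override bool CanConvert(Type objectType)
        => objectType == typeof(List<T>);

    public override object ReadJson(JsonReader reader, Type objectType,
        object existingValue, JsonSerializer serializer)
    {
        JToken token = JToken.Load(reader);
        if (token.Type == JTokenType.Array)
            return token.ToObject<List<T>>(serializer);
        // Single element: wrap it in a one-item list.
        return new List<T> { token.ToObject<T>(serializer) };
    }

    public override bool CanWrite => false;

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        => throw new NotImplementedException();
}

// Hypothetical element from the QuantumView payload:
public class PackageInfo { public string TrackingNumber { get; set; } }

public class Shipment
{
    [JsonConverter(typeof(SingleOrArrayConverter<PackageInfo>))]
    public List<PackageInfo> Package { get; set; }
}
```

Decorating each collection property with the converter lets the same class accept both payload shapes.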

After several frustrating hours I was forced to give up and switch to the XML format of the API.  The documentation from UPS for the XML API was vastly more detailed.  Clearly, JSON support was added as an afterthought.

I wasn’t looking forward to the prospect of hand-building a class that matched the XML structure of the API response.  Fortunately, I found an awesome trick involving a command-line tool (xsd.exe) that ships with Visual Studio.  Just hand the XML file to the tool and *poof* it generates the necessary class and child classes to consume that XML layout.  Later on I found the feature exists right in Visual Studio: just copy the XML to your clipboard and then use Edit / Paste Special / Paste XML as Classes.  SWEET!

Once I had the XML importing into strongly typed classes, progress moved much faster.  I was able to leverage Hangfire to automatically poll the UPS API on a schedule controlled through an admin settings page.   With that automation handled, I moved on to processing the downloads.
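The polling hookup is nearly a one-liner with Hangfire’s recurring jobs.  A sketch, assuming a QuantumViewService class and a settings-backed cron expression (both names are mine, not Able’s):

```csharp
using Hangfire;

public class QuantumViewService
{
    // Downloads the QuantumView feeds and processes any Origin records.
    public void Poll() { /* download + process */ }
}

public static class QuantumViewScheduler
{
    // Re-registering with the same job id simply updates the schedule,
    // so this can be called whenever the admin settings page saves
    // a new cron value.
    public static void Register(string cronExpression)
    {
        RecurringJob.AddOrUpdate<QuantumViewService>(
            "quantumview-poll",
            svc => svc.Poll(),
            cronExpression);
    }
}
```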

Testing the API proved a bit confusing because of one REALLY important detail:  the default API endpoints only deliver the transactions UPS has received since the last time you asked for them.  Helpful in a production scenario, since you never have to worry whether you’ve already received a specific Origin record.  But it makes testing your code a major source of stress.  Your first request comes back with data, and then suddenly you get nothing.  Over and over again, what you thought was working now appears to be broken.  The way around this is to add a date-range criteria (up to 7 days in the past).  When a date range is included in the request, QuantumView responds with all transactions regardless of whether they’ve already been delivered.
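Building that range is trivial.  A sketch, assuming the yyyyMMddHHmmss timestamp format the XML request uses (verify the format against the QuantumView docs for your API version):

```csharp
using System;

public static class QuantumViewDates
{
    // Returns a (begin, end) pair covering the last N days, formatted
    // as yyyyMMddHHmmss.  UPS allows the range to reach at most
    // 7 days into the past.
    public static (string Begin, string End) LastNDays(int days)
    {
        if (days < 1 || days > 7)
            throw new ArgumentOutOfRangeException(nameof(days), "UPS allows at most 7 days");
        DateTime now = DateTime.Now;
        return (now.AddDays(-days).ToString("yyyyMMddHHmmss"),
                now.ToString("yyyyMMddHHmmss"));
    }
}
```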

Once data was coming down, the processing became simple.  The client was already having each manufacturer put the AbleCommerce Shipment Id in one of the UPS reference fields.  UPS manifests support up to five ‘reference’ fields usable by the shipper for any purpose.  Having the Able Shipment Id in each Origin record makes it a snap to determine which Able shipment gets the tracking number.
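The matching step can be sketched like this; OriginMatcher is my name for it, and shipmentExists stands in for the real lookup against the Able database:

```csharp
using System;
using System.Collections.Generic;

public static class OriginMatcher
{
    // Given the (up to five) UPS reference values from an Origin record,
    // return the first one that parses as an existing AbleCommerce
    // shipment id, or null so the record can be set aside for manual
    // processing in the UI.
    public static int? FindShipmentId(IEnumerable<string> referenceValues,
                                      Func<int, bool> shipmentExists)
    {
        foreach (string value in referenceValues)
        {
            if (int.TryParse(value?.Trim(), out int shipmentId)
                && shipmentExists(shipmentId))
            {
                return shipmentId;
            }
        }
        return null; // no usable reference: queue for manual processing
    }
}
```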

 

Leveraging the Gift Wrap feature for optional upcharges to a product

Just goes to show, you still can teach an old dog new tricks.

A client needed a way to add an Engraving choice to the product page.   Normally this isn’t an issue, just use the product variant feature.

But in this case, the engraving choice must be optional.   And it has to charge an amount.  And it needs to accept some text for the actual engraved words.

So after a little digging, I settled on a little-used feature in AbleCommerce known as Gift Wrap.    Gift Wrap is a way to associate a secondary charge to any product.   The basket page and the checkout pages automatically know how to handle Gift Wrap. 

By default, AbleCommerce handles Gift Wrap selection during checkout.   So the first modification was to get gift wrap to apply on the product page, which means the BuyProductDialog control is the place to start.    I used a checkbox to reveal a separate panel containing a text box control.    The contents of the text box are stored in the GiftMessage property available on each line item object.

Then it’s just a matter of setting the correct Id for the gift wrap on the line item before it’s added to the basket.
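The change amounts to a few lines in BuyProductDialog.  A sketch only; the control names are mine, and WrapStyleId is the property name as I recall it, so verify against your AbleCommerce version:

```csharp
// Inside the add-to-basket handler, after the BasketItem has been built:
if (EngravingCheckBox.Checked)
{
    // Id of the "Engraving" gift wrap record configured in admin
    basketItem.WrapStyleId = engravingWrapId;

    // The text to engrave rides along in the line item's gift message.
    basketItem.GiftMessage = EngravingText.Text;
}
```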

The basket page needed some text changes to replace “Gift Wrap” with “Engraving”…super easy to do.

Checkout needed to be modified to skip the gift wrap page normally encountered in default AbleCommerce behavior.

At that point, I’m done – the client can now easily associate an optional upcharge for Engraving on specific products within the store catalog. 

Building an Audit Log with nHibernate Listeners in AbleCommerce

I have a client who can’t seem to figure out who’s making edits to certain products. Somebody changed the price, somebody marked it hidden. That sort of thing. Too many people in the back-end and no audit trail for who changed what.

So I decided to dig into nHibernate listeners. After a brutal all-nighter reading StackOverflow posts and copying snippets from a few blog posts, I actually got it working.

I added some new handlers for the Post-Commit-Create, Post-Commit-Update and Post-Commit-Delete listener events and pointed them to my AuditEventListener class. These are easily wired up in the DatabaseConfiguration.cs file.
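A sketch of that wiring, assuming cfg is the NHibernate Configuration object Able builds at startup and that AuditEventListener implements the three post-commit listener interfaces:

```csharp
using NHibernate.Cfg;
using NHibernate.Event;

public static class AuditWiring
{
    // Registers one AuditEventListener instance for the three
    // post-commit events NHibernate raises after a transaction commits.
    public static void RegisterAuditListeners(Configuration cfg)
    {
        var listener = new AuditEventListener();
        cfg.EventListeners.PostCommitInsertEventListeners =
            new IPostInsertEventListener[] { listener };
        cfg.EventListeners.PostCommitUpdateEventListeners =
            new IPostUpdateEventListener[] { listener };
        cfg.EventListeners.PostCommitDeleteEventListeners =
            new IPostDeleteEventListener[] { listener };
    }
}
```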

I then created a simple marker interface with no members, called IEntityToAudit. This is used to mark which specific data classes I want to audit.
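The interface itself is as small as it gets; the Product class below is a hypothetical stand-in showing the one-line change made to an entity class:

```csharp
// Marker interface; no members needed. Implementing it flags an
// entity class for auditing.
public interface IEntityToAudit { }

// Example (hypothetical stand-in for an AbleCommerce entity):
public class Product : IEntityToAudit
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```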

In my AuditEventListener, I can tell if the class that fired the event is marked for audit by simply trying to cast it to the IEntityToAudit interface. If the cast comes back non-null, I know that class is flagged to be audited.

var entityToAudit = @event.Entity as IEntityToAudit;

if (entityToAudit != null)
{
    // this entity is flagged for auditing
}
Now it’s a simple matter of identifying the class being audited so I can record the name. And nHibernate makes it easy to tell what properties of the class are dirty.
 
This way the logging accomplishes two goals:

1. Only logging what properties were changed

2. Logging both the old value and new value of each changed property
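The update case can be sketched along these lines; PostUpdateEvent exposes the old and new property states side by side, and the actual audit-row write is elided:

```csharp
using NHibernate.Event;

public partial class AuditEventListener : IPostUpdateEventListener
{
    public void OnPostUpdate(PostUpdateEvent @event)
    {
        var entityToAudit = @event.Entity as IEntityToAudit;
        if (entityToAudit == null) return;

        // OldState is unavailable in some cases (e.g. detached updates).
        if (@event.OldState == null) return;

        string entityName = @event.Entity.GetType().Name;
        string[] propertyNames = @event.Persister.PropertyNames;

        for (int i = 0; i < propertyNames.Length; i++)
        {
            object oldValue = @event.OldState[i];
            object newValue = @event.State[i];
            if (!Equals(oldValue, newValue))
            {
                // Write an audit row: entityName, propertyNames[i],
                // oldValue, newValue, current user, timestamp.
            }
        }
    }
}
```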

 
I built a new data class to store the log entries in a new table in SQL. That made it easy to create a custom report in admin to view the audit log. I’m also going to build out a new button on the edit-category and edit-product pages. This will permit the admin to quickly view the audit log for a given entity.
 
Entities that are deleted record both the entityId and the value of the Name property (if that property exists).
 
To add logging to any entity in Able, I simply modify that <entity>.cs file and change one line of code to make it implement the IEntityToAudit interface. Everything else is handled automatically and outside of Able code, to minimize customization of existing files.
 
Overall, it was a fun learning experience into the depths of nHibernate and the AbleCommerce domain model implementation.

How to speed up site rebuild after compiling a DLL

Came across this little gem this morning.   Made a significant difference on my PC.

Add this to your web.config for MUCH faster compilation:

<compilation optimizeCompilations="true">

Quick summary (from the ASP.NET team’s description of the feature): we are introducing a new optimizeCompilations switch in ASP.NET that can greatly improve the compilation speed in some scenarios. There are some catches, so read on for more details. This switch is currently available as a QFE for 3.5 SP1, and will be part of VS 2010.

The ASP.NET compilation system takes a very conservative approach which causes it to wipe out any previous work that it has done any time a ‘top level’ file changes. ‘Top level’ files include anything in bin and App_Code, as well as global.asax. While this works fine for small apps, it becomes nearly unusable for very large apps. For example, a customer was running into a case where it was taking 10 minutes to refresh a page after making any change to a ‘bin’ assembly.

To ease the pain, we added an ‘optimized’ compilation mode which takes a much less conservative approach to recompilation.

AbleCommerce Gold: How to Clean Up Anonymous Users

Even though Able Gold has a manual cleanup option in the Maintenance page, it doesn’t always work well.  The problem arises from how Able must delete each user individually.  Not such a big deal when you have 500 users to clear out.  

It’s a very different story when you have 2,000,000 unwanted users.   A SQL query can delete all of the unwanted records in a single command.

Below is the updated query to work with Able Gold schema.  Obviously change the dates to something more recent. 

If the queries do not remove as many records as you expected, you can remove the “AND (AffiliateId IS NULL)” criteria, but only if you don’t care about affiliate reporting: that clause is what preserves the anonymous users tied to an affiliate referral.

USE <yourdbname>

DELETE FROM ac_Baskets
WHERE UserId IN (SELECT UserId FROM ac_Users
    WHERE StoreId = 1 AND IsAnonymous = 1
    AND (AffiliateId IS NULL)
    AND (LastActivityDate IS NULL OR LastActivityDate < 'June 30, 2009'))

DELETE FROM ac_Wishlists
WHERE UserId IN (SELECT UserId FROM ac_Users
    WHERE StoreId = 1 AND IsAnonymous = 1
    AND (AffiliateId IS NULL)
    AND (LastActivityDate IS NULL OR LastActivityDate < 'June 30, 2009'))

DELETE FROM ac_Users
WHERE StoreId = 1 AND IsAnonymous = 1
AND (AffiliateId IS NULL)
AND (LastActivityDate IS NULL OR LastActivityDate < 'June 30, 2009')

NHibernate subquery on Orders filtering by user group

A client had a need to filter the Monthly Sales Summary report by group membership.  In other words, restrict the totals to orders placed by users only in a specific user group.

The initial NHibernate query looks like this:

ICriteria criteria = NHibernateHelper.CreateCriteria<CommerceBuilder.Orders.Order>("O")
    .CreateCriteria("Items", "OI", NHibernate.SqlCommand.JoinType.InnerJoin)
    .Add(Restrictions.Eq("O.Store", AbleContext.Current.Store));

 

So I added a dropdown that is populated by GroupDataSource.LoadAll().  Then it seemed easy enough to just add this code which I found in the UserSearchCriteria.cs file:

if (groupId >= 0)
{
    criteria.CreateCriteria("O.User.UserGroups", "UG", NHibernate.SqlCommand.JoinType.InnerJoin)
        .Add(Restrictions.Eq("UG.Id.Group.Id", groupId));
}

The problem is, this code throws a big ol’ NHibernate error: “multi-part identifier <someobject> could not be bound”.

For some reason, NHibernate couldn’t resolve the relationship of Order -> User -> UserGroups -> Group.   The code I swiped from UserSearchCriteria had no problem with it.  But here it just wouldn’t work.

I finally figured out the solution: create a new reference to the User table and base the criteria from there.   So instead of starting at the Order object level, the join and restriction start at the User object level, which sort of makes sense now that I’m typing this…

if (groupId >= 0)
{
    criteria.CreateCriteria("O.User", "U", NHibernate.SqlCommand.JoinType.InnerJoin)
        .CreateCriteria("U.UserGroups", "UG")
        .Add(Restrictions.Eq("UG.Id.Group.Id", groupId));
}

The end result: the NHibernate query now properly limits the search to orders placed by users who are members of the selected group.

AbleCommerce Gold Upgrade errors from 7.0.7

Ran into an odd timeout issue today.  It was disguising itself as an nHibernate problem during the upgrade of an AbleCommerce 7.0.7 store to AbleCommerce Gold R6.

 

The initial error was complaining about nHibernate null Id issues with Catalog.Webpage.  Which made no sense since the upgrade is converting the data, not adding to it.

So I dug into the /install/upgrade.aspx page and saw that indeed new web pages were being added.  The upgrade was crashing when store.Settings.Save() was called.

I eventually noticed an “errorList” string array variable and exposed it via debug.   It showed me that the SQL upgrade scripts were causing a timeout.   This was silently crashing the upgrade.

After a quick Google search, I found the solution.  In the RunScript() routine within Upgrade.aspx.cs, you have to increase the command timeout value.   Apparently the database I was upgrading was enormous, and my local SQL server isn’t exactly known for speed.  Combine the two factors and you can easily end up with timeouts.

Note the addition of setting the CommandTimeout parameter below:

try
{
    SqlCommand command = new SqlCommand(sql, conn);
    command.CommandTimeout = 300;  // seconds; the default is only 30
    command.ExecuteNonQuery();
}

ABF shipping gateway base written and working

Spent some time this morning and put together the complete shell for an ABF shipping gateway in AbleCommerce Gold.    Its only purpose is to provide tracking URL support for ABF shipments when marking orders as shipped.

It was interesting how Able has implemented a default provider interface, leveraged embedded resources and connected it all to specific configure and register pages in the admin.  I’ve worked on various shipping gateway pieces several times, but never a brand new AbleCommerce Gold gateway implementation.   Kinda slick how they did it.

I have to see if ABF will approve my request for an account.  It’s needed to access their API.   If I can get API access, I can build a fully functional gateway.

Now that the learning curve is over, it will be far easier for me to build new shipping gateways in AbleCommerce Gold.

Problem with Jr. Admin security permissions in upgraded AbleCommerce 7 sites

As of Gold R6…

After upgrading an AbleCommerce 7.x website to AbleCommerce Gold, you’ll find the Junior Admin permissions do not work as expected.

In the old Able 7.x, the role name was “Jr. Admin”.   However, in Able Gold, the role name was changed to “Junior Admin”.  This value is hard-coded in various web.config files as well as the /app_data/adminmenu.xml file.

If you upgrade Able 7.x to AbleCommerce Gold, the value does not get updated.  As a result, the new Able Gold install cannot identify a user as an admin if their only admin membership is the group assigned to the Junior Admin role.

The fix is simple.   Open the ac_Roles table in the database and replace the “Jr. Admin” value with “Junior Admin”.   Do the same for the lower case value as well.
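A minimal sketch of that fix as SQL; the LoweredName column name is an assumption on my part, so check your ac_Roles schema first:

```sql
UPDATE ac_Roles SET Name = 'Junior Admin' WHERE Name = 'Jr. Admin';

-- the lower-cased copy of the role name (column name may differ):
UPDATE ac_Roles SET LoweredName = 'junior admin' WHERE LoweredName = 'jr. admin';
```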

How to remove full text catalog index from SQL 2005 database

When moving an older AbleCommerce 7 database from SQL 2005 to a newer SQL server, you might run into errors with the ac_searchcatalog full text index catalog file.   SQL 2008 doesn’t store FTS the same way it was done in SQL 2005.

The issue usually appears when you try to restore the database backup from 2005 to 2008 or greater.

To fix it, you have to remove the FTS file associated with the database.   But it’s not part of the SQL backup, so you have a problem if you didn’t copy the index file separately.   Here’s how to get rid of the index file reference.

Go into the old SQL server and run as follows:

USE <dbname>

SELECT name, ftcatid FROM sysobjects WHERE ftcatid > 0

For each returned table name (these are the tables with a Full-Text Index), run this command:
EXEC sp_fulltext_table 'tblName', 'drop'
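If there are many tables, you can have SQL generate the drop commands for you from the same sysobjects query; a sketch:

```sql
-- Emits one EXEC sp_fulltext_table '<table>', 'drop' line per indexed
-- table; copy the output and run it.
SELECT 'EXEC sp_fulltext_table ''' + name + ''', ''drop'';'
FROM sysobjects
WHERE ftcatid > 0;
```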

After all table index references are dropped, get rid of the Full-Text Catalog with:
DROP FULLTEXT CATALOG catalogName

After this is done, don’t forget to remove the file reference by right-clicking the database, choosing Properties, and then clicking Files.