Case Studies

Interesting projects we have worked on.



Asbestos Inspection Application


C#, MVC, .NET, IIS, Microsoft SQL Server, Bootstrap, jQuery


We were approached by a client to create a line-of-business application for recording onsite asbestos inspection data. The application would be delivered as a responsive web page usable on desktop browsers, tablets, and mobile devices. It records information on customers, locations, buildings, rooms, building materials, samples, and other data.

Project Structure:

Logins are associated with Organizations, which allows multiple users to work on the same project.

Organizations are the top-level object for data; Customers, Projects, etc. are associated with an Organization. The structure of the objects is shown in parent-child format below.

  • Organizations
    • Customers
      • Projects
        • Reports
        • Uploads
        • Rooms
        • Suspect Materials
          • Suspect Material Associations
            • Samples
    • Lookups
  • Default Lookup Data

There is a default set of lookup data for suspect material attributes, years, and various other data fields.


Step 1: Create a Customer
Description: The user creates an entry for a customer that contains the customer name and other details

Step 2: Create a Project
Description: The user creates an entry for a project that contains the project name and other details

Step 3: Add Rooms
Description: The user will walk through the building being inspected. As they survey the property they create rooms for each area inspected.

Step 4: Add Materials
Description: As rooms are added the user will review the room for building materials used. These may be materials on the floor, walls, and roof. Once a material is added it can be associated with a room.

Step 5: Associate Rooms and Materials
Description: Select which rooms contain which materials, e.g. Room 1 contains White Floor Tiles.

Step 6: Add Samples to Room and Material
Description: Each Room and Material combination can have a sample added to it for testing

Step 7: Add Sample Data
Description: The samples are analyzed and a result is entered for each sample. This result determines whether the sample and its material are listed as testing positive or negative.

Step 8: Generate Report
Description: A report is generated based upon the data which shows summary and detail pages for the building, rooms, materials, and samples.


At the outset of the project we knew we would be importing a variety of text data: materials, categories, material properties, room properties, and more. To accommodate this variety and allow users to add their own entries as needed, we implemented a flexible lookup table in our database. This table allows us to store both categories and the child entries for each category in the same table, using parent and child IDs. Each entry can also be associated with a specific user, giving each user their own set of selections in drop-down lists.
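The single-table lookup design can be sketched as below. This is only an illustration (shown here in SQLite for brevity); the table and column names are assumptions, not the production schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Lookup (
        Id       INTEGER PRIMARY KEY,
        ParentId INTEGER REFERENCES Lookup(Id),  -- NULL for top-level categories
        UserId   INTEGER,                        -- NULL for shared default data
        Value    TEXT NOT NULL
    )
""")

# A default category with default children, plus one user-specific entry.
conn.execute("INSERT INTO Lookup (Id, ParentId, UserId, Value) "
             "VALUES (1, NULL, NULL, 'Material Category')")
conn.executemany(
    "INSERT INTO Lookup (ParentId, UserId, Value) VALUES (?, ?, ?)",
    [(1, None, "Floor Tile"), (1, None, "Ceiling Texture"), (1, 42, "Custom Mastic")],
)

def options_for(category_id, user_id):
    """Defaults plus the current user's own additions, for a drop-down list."""
    rows = conn.execute(
        "SELECT Value FROM Lookup "
        "WHERE ParentId = ? AND (UserId IS NULL OR UserId = ?) ORDER BY Id",
        (category_id, user_id),
    )
    return [value for (value,) in rows]

print(options_for(1, 42))  # defaults plus user 42's own 'Custom Mastic'
print(options_for(1, 7))   # defaults only
```

Because categories and entries live in one table, adding a new drop-down list is a data change, not a schema change.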

To generate professional, consistent formatting on our reports, we used a custom implementation of a C# PDF library: MVC view models are rendered into views, the views are converted to PDF documents, and the documents are combined into a single report with page numbering.

Final Completion:

The project was delivered on time, allowing our client to perform inspections and deliver reports to customers in a quarter of the time previously required. The application is currently marketed to other inspection organizations; reports may be purchased on a per-project basis for a small report generation fee.

Vehicle Servicing iPad Application


AngularJS, Apache Cordova, npm, jQuery, Bootstrap, SQLite, ASP.NET, C#, MVC, SQL Server, OS X, iOS, Xcode


We received a request to build an offline-capable iPad application to assist with reporting service performed on vehicles carrying oil-field equipment. The application needed to use data from an existing ASP.NET Web Forms application backed by a SQL Server database, with data synced to each iPad based on the technician who would be servicing vehicles in the field.

This data would contain work orders, vehicles, parts, makes, and models for the sites the technician would visit, and would also allow the technician to add data in the field if necessary. Once the technician completed services and had a network connection, they could sync the data back to SQL Server, where it could be approved and/or changed using the client's existing application and an additional approval module written specifically to simplify the approval process.


The data required to allow offline adding and editing of vehicle service information was spread across 12-15 tables in the SQL database, with a variety of complex relationships that the iPad application had to maintain even while offline. The data set was also significant in size, so we needed a sizable, expandable database to hold it between syncs. To accommodate this we used a local SQLite database on the iPad, which gave us greater control over the size and location of the database.
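A minimal sketch of holding related data offline in a local SQLite database follows. The table names here are illustrative, not the actual 12-15 table schema.

```python
import sqlite3

db = sqlite3.connect(":memory:")        # on the device this would be a file
db.execute("PRAGMA foreign_keys = ON")  # enforce relationships even offline
db.executescript("""
    CREATE TABLE Vehicle (Id INTEGER PRIMARY KEY, Unit TEXT NOT NULL);
    CREATE TABLE WorkOrder (
        Id INTEGER PRIMARY KEY,
        VehicleId INTEGER NOT NULL REFERENCES Vehicle(Id),
        Synced INTEGER NOT NULL DEFAULT 0  -- 0 until pushed back to SQL Server
    );
""")
db.execute("INSERT INTO Vehicle (Id, Unit) VALUES (1, 'Pump Truck 7')")
db.execute("INSERT INTO WorkOrder (VehicleId) VALUES (1)")

# A work order for an unknown vehicle is rejected, preserving integrity:
try:
    db.execute("INSERT INTO WorkOrder (VehicleId) VALUES (99)")
except sqlite3.IntegrityError:
    print("rejected orphan work order")

unsynced = db.execute("SELECT COUNT(*) FROM WorkOrder WHERE Synced = 0").fetchone()[0]
print(unsynced, "work order(s) waiting for the next sync")
```

A flag column like `Synced` is one simple way to track which rows still need to be pushed to the server on the next connection.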

The technician servicing screen on the iPad required a set of four cascading drop-down lists that were interdependent: a selection in one list updated the others. To accomplish this we implemented a custom Angular directive with multiple watches that reacted to selection changes on the fly and saved data to a temporary table, maintaining the application's state as the user changed entries for the services performed on a vehicle.
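The core of the cascading behavior can be sketched language-agnostically (the production version was an AngularJS directive with watches; the field names and sample data here are purely illustrative):

```python
PARTS = [
    {"make": "Acme", "model": "M100", "part": "Filter", "size": "Small"},
    {"make": "Acme", "model": "M100", "part": "Filter", "size": "Large"},
    {"make": "Acme", "model": "M200", "part": "Belt",   "size": "Large"},
    {"make": "Bolt", "model": "B10",  "part": "Filter", "size": "Small"},
]
FIELDS = ["make", "model", "part", "size"]

def options(selection):
    """Given the selections made so far, compute the valid choices per list."""
    rows = [r for r in PARTS if all(r[f] == v for f, v in selection.items())]
    return {f: sorted({r[f] for r in rows}) for f in FIELDS}

print(options({"make": "Acme"})["model"])                  # ['M100', 'M200']
print(options({"make": "Acme", "model": "M200"})["part"])  # ['Belt']
```

Each change re-filters the underlying rows, so every downstream list only ever offers combinations that actually exist.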

When the iPad needed to be synced to the server, we had to implement a complex procedure to compare the iPad's data against the existing data in SQL Server, preventing duplication and adding new items where necessary. New items then went through an approval process that gave the client's office staff the opportunity to correct new item data before it was put into production use.
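A simplified sketch of that comparison step is below. The real procedure spanned many tables and used the production keys; the (make, model, part) tuple here is just an illustrative natural key.

```python
server_parts = {("Acme", "M100", "Filter")}  # keys already on the server

def sync(uploaded):
    """Split uploaded records into duplicates and new items needing approval."""
    duplicates, pending_approval = [], []
    for rec in uploaded:
        key = (rec["make"], rec["model"], rec["part"])
        if key in server_parts:
            duplicates.append(rec)        # already known: do not insert again
        else:
            pending_approval.append(rec)  # new item: office staff review first
            server_parts.add(key)         # also de-duplicates within one upload
    return duplicates, pending_approval

dups, pending = sync([
    {"make": "Acme", "model": "M100", "part": "Filter"},
    {"make": "Acme", "model": "M100", "part": "Hose"},
    {"make": "Acme", "model": "M100", "part": "Hose"},
])
print(len(dups), len(pending))  # 2 1
```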

Final Completion:

We delivered a functional iPad application that allows multiple technicians to travel into disconnected areas, perform a variety of service procedures on existing and new equipment, then synchronize that data to the existing application, allowing office staff to bill and invoice much more quickly and efficiently.

Insurance and Benefits Processor


ASP.NET Web Forms, VB.NET, IIS, Microsoft SQL Server, Microsoft Excel, QuickBooks


Create an application to track employee enrollment in benefit plans, match corresponding monthly remittance amounts to those enrollments for multiple contracted employers, and generate reports in Excel format.


We were approached by “Client A” to create an application to track employee enrollment in benefit plans and match corresponding monthly remittance amounts to the employee enrollments for multiple contracted employers. Additionally, the application would allow users to document the difference between enrollments and remittances per employee, and to generate Excel files documenting the amount remitted by each employee to be sent to the insurance vendors along with the remittance check. A further requirement was the generation of QuickBooks .iif files documenting contributions to healthcare savings accounts.

This would be an ASP.NET Web Forms application with a Microsoft SQL Server back end. Employee enrollments are created by importing Excel files that list each employee, their dependents, and the benefits for each. Enrollments are automatically advanced every period so that they stay current. The periodic remittances are imported as Excel files as well, once per contracted employer per month.


One of the main challenges we faced was designing a database structure that enabled the application to automatically advance the employee enrollments from one period to the next while remaining flexible enough to handle multiple payment period scenarios (monthly, weekly, bi-weekly, semi-monthly, etc.). We met this challenge by creating a system that tracks start and end dates for each period, and by scripting the creation of these periods to meet the requirements of each payment scenario.
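The period-generation idea can be sketched as follows. This is a hedged illustration, not the production script: it assumes each period starts on the 1st (or the 16th, for the back half of a semi-monthly month), and the real system stored the generated dates in SQL Server.

```python
from datetime import date, timedelta

def build_periods(start, count, scenario):
    """Return (start, end) date pairs; each end is the day before the next start."""
    periods, cur = [], start
    for _ in range(count):
        if scenario == "weekly":
            nxt = cur + timedelta(days=7)
        elif scenario == "bi-weekly":
            nxt = cur + timedelta(days=14)
        elif scenario == "semi-monthly" and cur.day == 1:
            nxt = cur.replace(day=16)
        else:  # "monthly", or the back half of a semi-monthly month
            nxt = (cur.replace(day=28) + timedelta(days=4)).replace(day=1)
        periods.append((cur, nxt - timedelta(days=1)))
        cur = nxt
    return periods

for p in build_periods(date(2024, 1, 1), 3, "semi-monthly"):
    print(p)
```

The `replace(day=28) + 4 days` trick always lands in the next month regardless of month length, which handles February and leap years without special cases.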

Equally challenging was addressing every remittance resolution scenario (refund, short, over, perfect) and creating Excel reports with line items that matched to the penny when report totals ran into six figures.
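A hedged sketch of classifying one remittance against the enrolled amount is below. The category rules shown are illustrative, not the client's exact business logic; the important point is that exact decimal arithmetic (never floating point) is what keeps large totals matching to the penny.

```python
from decimal import Decimal

def resolve(enrolled, remitted):
    """Classify a remittance relative to the enrolled amount."""
    enrolled, remitted = Decimal(enrolled), Decimal(remitted)
    if remitted < 0:
        return "refund", remitted          # money flowing back out
    diff = remitted - enrolled
    if diff == 0:
        return "perfect", diff
    return ("over", diff) if diff > 0 else ("short", diff)

print(resolve("125.00", "125.00"))  # ('perfect', Decimal('0.00'))
print(resolve("125.00", "120.00"))  # ('short', Decimal('-5.00'))
```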

Final Completion:

Upon final delivery of our solution we had an intranet-based web application that enables our client to document the enrollments and remittances of tens of thousands of employees across dozens of companies, and to generate reports that match to the penny across millions of dollars.

Oil Industry Electronic Invoicing


ASP.NET MVC, Web API, Windows Services, C#, IIS, Microsoft SQL Server, RosettaNet Implementation Framework (RNIF), Petroleum Industry Data Exchange (PIDX), XML


Create an application to consume invoices exported from a customer's system as CSV files, process them into PIDX XML files, and POST them, along with their supporting documentation files, via the RosettaNet framework to a remote endpoint.


We were approached by our client, a supplier to oilfield companies, to create an application that would allow them to send their invoices, along with the supporting documents as attachments, electronically to a 3rd party administrator. The application would have to monitor directories on a server file system and, when a new invoice was created, automatically submit it to the 3rd party administrator. The invoices are exported as flat data in CSV format. The application processes these invoices into PIDX-compliant XML files, then includes those XML files and their attachments as parts of a multipart/related POST to the remote endpoint.

Additionally, an intranet-based application was needed to maintain the customer data. A further requirement was a publicly available, SSL protected endpoint to accept the Receipt Acknowledgment from the 3rd party administrator's system.


The main challenge in creating this system was the sheer number of different technologies we needed to integrate into one coherent system. A Windows service monitors the file system for new invoices and does the heavy lifting of processing the files and POSTing them to the remote endpoint. To successfully process the CSV files into XML files we had to obtain the proper schema definitions, convert them to C# classes, populate those classes with invoice data at runtime, and serialize them into XML files. There is also an MVC C#/SQL Server application that maintains the customer data and accepts the Receipt Acknowledgments; the Windows service communicates with the same SQL Server database for customer-specific information that must be included in the XML files. We also had to create a Web API project for testing our POSTs and the Receipt Acknowledgments. Couple that with the need to precisely implement the RNIF and PIDX specifications, and this was indeed a challenging application to develop.
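In rough outline, the CSV-to-XML step looks like the sketch below. It is shown in Python rather than the C# schema-class pipeline described above, and the element names are placeholders; real PIDX invoices must follow the published schemas.

```python
import csv, io
import xml.etree.ElementTree as ET

CSV_DATA = "InvoiceNumber,Amount\n1001,250.00\n"  # flat export, one row per invoice

def rows_to_xml(text):
    """Turn each CSV row into a small XML invoice document."""
    docs = []
    for row in csv.DictReader(io.StringIO(text)):
        inv = ET.Element("Invoice")
        ET.SubElement(inv, "InvoiceNumber").text = row["InvoiceNumber"]
        ET.SubElement(inv, "Amount").text = row["Amount"]
        docs.append(ET.tostring(inv, encoding="unicode"))
    return docs

xml_docs = rows_to_xml(CSV_DATA)
print(xml_docs[0])
```

In the production system this mapping is handled by serializing schema-generated C# classes, so the output is guaranteed to conform to the PIDX definitions rather than hand-built element names.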

Final Completion:

Upon final delivery of our solution we provided our client with a system that automatically submits their invoices and supporting documentation to their 3rd party processor, receives the Receipt Acknowledgments, and logs all the associated actions. Our client’s invoices are paid more quickly and accurately, and as an added bonus they no longer have to manually add the supporting documentation to their invoices, saving them at least 10 minutes for each invoice.

Sema Data Co-Op


WordPress, MySQL, WooCommerce, PHP


We were approached by “Client A” to create an automated CSV and ZIP file import process for product data. The import process would integrate as a WordPress plugin into an existing WooCommerce store and could be initiated manually by uploading a CSV file, or automatically by scanning a specific folder for CSV files. The product data would arrive as CSV files automatically uploaded to the client's local server via FTP transfer.

Upon a successful FTP transfer, our web application would scan the FTP folder for CSV and/or ZIP files and perform the import procedure. The import inserts the product data into the existing WooCommerce database structure, creating published product listing pages for successful imports.

Manual Process:
Step 1: Upload CSV File with Headers for Import Data
Description: User selects a CSV file for import in order to establish mapping of columns to WooCommerce fields
Response: The plugin displays the WooCommerce fields in one column; a second column shows drop-down lists of all headers found in the CSV file.

Step 2: Map column headers to WooCommerce Fields
Description: User selects the column they wish to map to a WooCommerce field, e.g. CSV_File_Advertised_Price mapped to the WooCommerce regular_price field.

Step 3: Upload Zip file with Images associated with the previous CSV File
Description: User selects the zip file containing images for the products contained in the CSV file selected in step 1. Products are matched by SKU numbers in the image filename and CSV File row

Step 4: Product Import Complete
Description: The product import completes in roughly 5-30 seconds, and the user is presented with a success or error message.

Automated Process:
Step 1: Setup FTP Settings
Description: User goes to the plugin settings page and enters the FTP server, login username, and password.

Step 2: Setup automated FTP uploads
Description: User goes to the data delivery portal they use and sets up automated uploads of product data to the server.

Step 3: Scan and Import
Description: The plugin performs scans every 5 minutes. Upon finding a valid CSV file it imports the product data and scans the image folder for matching images to associate to the products. Upon finding a valid zip file it extracts the images to the image directory and associates the images to products based on the SKU number in the filename.
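The image-to-product matching in the steps above can be sketched as below. The SKU pattern and product list are assumptions for illustration; the real plugin matched against SKUs from the imported CSV rows.

```python
import re

products = {"AB-1001": "White Floor Tile", "AB-2002": "Ceiling Fan"}

def match_image(filename):
    """Pull a SKU-shaped token out of the filename and look up its product."""
    m = re.search(r"[A-Z]{2}-\d{4}", filename)
    return products.get(m.group(0)) if m else None

print(match_image("AB-1001_front.jpg"))  # White Floor Tile
print(match_image("logo.png"))           # None
```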


In the process of implementing the solution we quickly realized that the import procedure would be severely bottlenecked by using PHP to read each row in the CSV file. Some files had over 20,000 rows, which could take 4-5 minutes to read and import, often with even slower response times. To address this we bulk-loaded the CSV data into a custom MySQL table structure and performed the data merge with stored procedures that insert the product data in large batches. This decreased import times to under one minute, even for very large files.
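The shape of the batch-insert fix is sketched below. The production version used MySQL's bulk CSV load plus stored procedures; sqlite3's `executemany` stands in here to show the idea of a few large batches instead of one round trip per row.

```python
import sqlite3

rows = [(f"SKU{i}", f"Product {i}", 9.99) for i in range(20_000)]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staging (sku TEXT, name TEXT, price REAL)")

BATCH = 5_000
for i in range(0, len(rows), BATCH):
    db.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows[i:i + BATCH])
db.commit()

count = db.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(count)  # 20000
```

Staging the raw rows first also lets the merge into the live product tables run entirely inside the database engine.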

In some cases CSV and ZIP files would fail during FTP transfer, leaving a partial, corrupted file in the FTP folder. To address this we implemented a try/catch style of scanning the directory: we attempt to import the CSV or extract the ZIP file; on success, the files are stored temporarily in a last-import directory; on failure, the files in question are moved to a failed-imports folder and a message is emailed to the site administrator so they can review the files themselves or reattempt the FTP transfer from their data provider.
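The scan-and-quarantine loop can be sketched like this. The directory names are illustrative, and the real plugin also emailed the site administrator on failure.

```python
import csv, shutil, tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
incoming, done, failed = root / "ftp", root / "last_import", root / "failed"
for d in (incoming, done, failed):
    d.mkdir()

(incoming / "good.csv").write_text("sku,name\nAB-1001,Tile\n")
(incoming / "truncated.csv").write_bytes(b"\xff\xfe broken")  # partial transfer

def scan():
    for f in sorted(incoming.glob("*.csv")):
        try:
            with f.open(newline="", encoding="utf-8") as fh:
                list(csv.DictReader(fh))               # raises on undecodable bytes
            shutil.move(str(f), str(done / f.name))    # keep the last good import
        except Exception:
            shutil.move(str(f), str(failed / f.name))  # quarantine for admin review

scan()
print([p.name for p in done.iterdir()], [p.name for p in failed.iterdir()])
```

Quarantining instead of deleting means a corrupted transfer can simply be re-sent and re-scanned without losing any data.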

Final Completion:

Upon final delivery of our solution we had a WordPress plugin that successfully imports tens of thousands of products into a WooCommerce store, saving the site owner hours of creating products manually. In addition, updates to pricing, product data, and images became an automated process, so the site owner could focus on store management and sales rather than data management.