The following are stories about real-world projects and solutions that I have produced.
In 2014 I was approached by the Veggie Co-Op to create a solution to replace their paper-based order system.
In a couple of weeks I produced a simple system to manage customers, orders and accounts
– a flexible system with professional accountability.
The customer is so happy with the result that they'd like to market it to their peers.
In 2010 I created a Test Farm to coordinate the running of automated test
scripts across many machines. This consists of a server, which directs the
clients as to what script to run next. Upon completion of a script, the results
are reported back to the server. This enabled my client to maximize test
throughput across the available test machines, allowing them to run their whole
test suite within 24 hours instead of a few weeks.
One fun feature was the screenshotting of the client machines while scripts ran, so an operator could spot any that might be hanging from a central console.
These screenshots were updated every 10 seconds.
When a test crashes, a screenshot is taken before resetting the machine ready for the next script.
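The server/client flow described above can be sketched as follows. This is a minimal illustration, not the original implementation; every class and function name here is hypothetical:

```python
# Minimal sketch of the test-farm protocol (all names hypothetical): clients
# ask the server for the next script, run it, report the result, and capture
# a screenshot if the script crashes before the machine is reset.

class FarmServer:
    """Stands in for the central server that hands out scripts."""
    def __init__(self, scripts):
        self.queue = list(scripts)
        self.results = []

    def next_script(self):
        return self.queue.pop(0) if self.queue else None

    def report(self, script, passed, screenshot=None):
        self.results.append((script, passed, screenshot))

def run_client(server, run_script, take_screenshot):
    """Poll the server for scripts until the queue is empty."""
    while (script := server.next_script()) is not None:
        try:
            passed = run_script(script)
            server.report(script, passed)
        except Exception:
            # On a crash, capture the screen before resetting the machine.
            server.report(script, False, take_screenshot())

# Example run with stubbed-out script execution.
server = FarmServer(["login_test", "checkout_test"])
run_client(server, run_script=lambda s: True, take_screenshot=lambda: b"png")
print(server.results)  # each entry: (script, passed, screenshot)
```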
In 2014 I extended this to bisect changes between a script's failure and the last known pass to find the exact change that introduced the failure. This vastly reduced the time to analyze the cause of failures.
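The bisection amounts to a binary search over the ordered list of changes between the last pass and the first failure. A minimal sketch, assuming some way to rebuild and re-run the script at a given change (the `fails_at` callback here is a stand-in for that):

```python
# Sketch of the bisection idea (hypothetical helper names): binary search
# for the first change at which the script fails, assuming it passed before
# some change and fails for every change after it.

def bisect_failure(changes, fails_at):
    """Return the index of the first change for which fails_at(change) is True."""
    lo, hi = 0, len(changes) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if fails_at(changes[mid]):
            hi = mid          # failure introduced at mid or earlier
        else:
            lo = mid + 1      # still passing; culprit is later
    return lo

changes = list(range(1, 11))   # change numbers 1..10
culprit = 7                    # the (invented) change that broke the script
idx = bisect_failure(changes, lambda c: c >= culprit)
print(changes[idx])  # → 7
```

Each probe costs one test run, so finding the culprit among n changes takes about log2(n) runs instead of n.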
I have extensive experience treating source code as data which can be analysed to produce new insights into code relationships, metrics and vulnerabilities. I have applied this experience to automate code changes.
In 2000 I wrote my first code automation.
This tool searched for and replaced antiquated coding idioms and known performance problems.
I applied 300,000 changes to 2.3 million lines of code in one day without disrupting active production.
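A tool like this can be sketched as a table of regex rewrites applied across the source. The patterns below are invented VB6-style examples for illustration, not the original rules:

```python
# Illustrative sketch (not the original tool): a table of regex rewrites for
# antiquated idioms, applied in order. The patterns are made-up VB6-style
# examples of the kind of idiom such a tool might target.

import re

REWRITES = [
    # Use & rather than + for string concatenation (a classic VB6 fix-up).
    (re.compile(r'(\w\$?)\s*\+\s*(")'), r'\1 & \2'),
    # Replace a slow empty-string comparison with Len().
    (re.compile(r'(\w+\$?)\s*=\s*""'), r'Len(\1) = 0'),
]

def apply_rewrites(source):
    """Apply every rewrite to the source; return new text and change count."""
    count = 0
    for pattern, replacement in REWRITES:
        source, n = pattern.subn(replacement, source)
        count += n
    return source, count

code = 'If name$ = "" Then greeting = prefix + "world"'
fixed, n_changes = apply_rewrites(code)
print(fixed)      # If Len(name$) = 0 Then greeting = prefix & "world"
print(n_changes)  # 2
```

Because each rewrite is mechanical and reviewable, thousands of files can be changed in one pass with confidence.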
Later I created a full VB6 parser which allowed us to explore the code more deeply and apply more advanced transformations. This was used to extract 80 core shared source files, included in around 240 projects, into a separate library. Each call that resolved into one of these core files was automatically updated. I later developed a strategy for upgrading this large VB6 code base to VB.NET and played a major role in migrating a Delphi server to .NET, including a code clean-up transformation and guiding the re-architecture of the newly transformed code.
In 2013 I undertook a few projects to automate architectural changes to a large codebase. I replaced service classes with interfaces, allowing for the implementation of mocking for unit testing. I automated the resolution of about 100,000 issues reported by ReSharper; the most challenging of these was the correction of about 40,000 identifier names within the codebase.
In 2014 I helped migrate a large Windows application to the web. My role was to automate as much of this as possible. The plan was to observe the running of the UI scripts and produce working web-service code and tests, as well as a new working web client. This automation drastically reduced an estimated 100,000 hours of manual effort to rewrite the user interface.
In 2012, I co-architected the transformation of 2.3 million lines of VB6 applications to C# libraries to provide
the back-end for a new Web-based UI, and mentored a team in China to outsource some of this work.
Through my experience, I have become recognized as a national expert in VB6 code migration and have run courses on this for Microsoft.
I have applied my experience in converting VB6 to .NET to migrate VBA-like Rational Robot and Test Partner UI test scripts to C#. This gives everyone access to run the test scripts, and they can be folded into developer-oriented test infrastructure such as MSTest and NUnit, thus consolidating the tool-set used across the company.
In 2012 I converted 1.5 million lines of Rational Robot UI test scripts to C#. I established a plan and trained a QA team to replace UI-interactive code with API calls to C# code as components were converted from VB6. This enabled a Test Driven Development approach, harnessing 14 years of business knowledge to mature source code being converted from VB6 to C#. The final result produced a suite of tests that execute in 5% of the time with no UI interaction.
In 2013 I converted 1 million lines of Test Partner test scripts to C#, freeing my client of exorbitant maintenance costs for a discontinued product.
For some years, I had primary responsibility for architecting and maintaining an automated build process and developing its associated tools. At the core of this build process was a database containing project-source relationships, change details, source code and, for VB6, a full call graph. This allowed the delivery of hotfixes that include only those applications that call the affected procedures – not merely those with changed source files. A front end was initially developed in ASP, but later re-written as a rich-client application. This application lets the user search and browse relationships in the build database, with each result page showing relevant details. There are also some specialist pages, such as a branch timeline: this graphical tool details every change and release in every branch, and can highlight a particular check-in as it progresses through the branches. Another page shows the delivery timeline in relation to changes; double-clicking on a check-in takes the user to a page with details of that check-in. Another tool allows browsing of the procedure call graph. This was developed mainly for prototyping a Hotfix & Service Packs project for delivering calculated patches, and is useful for exploring procedure relationships.
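The hotfix scoping can be sketched as a reachability query over the call graph: ship an application only if its entry point can reach an affected procedure. The graph, application names and procedure names below are invented for illustration:

```python
# Sketch of call-graph hotfix scoping (all data invented): given the
# procedures touched by a fix, find which applications can actually reach an
# affected procedure, rather than every application whose source files changed.

def affected_apps(call_graph, app_entry_points, changed_procs):
    """call_graph maps procedure -> list of procedures it calls."""
    changed = set(changed_procs)
    result = set()
    for app, entry in app_entry_points.items():
        stack, seen = [entry], set()
        while stack:                      # depth-first walk from the entry point
            proc = stack.pop()
            if proc in seen:
                continue
            seen.add(proc)
            if proc in changed:
                result.add(app)
                break
            stack.extend(call_graph.get(proc, []))
    return result

call_graph = {
    "OrderForm.Load": ["CalcTax"],
    "ReportForm.Load": ["FormatDate"],
    "CalcTax": ["RoundMoney"],
}
apps = {"Orders.exe": "OrderForm.Load", "Reports.exe": "ReportForm.Load"}
print(affected_apps(call_graph, apps, ["RoundMoney"]))  # {'Orders.exe'}
```

Only Orders.exe reaches the fixed procedure, so only it goes into the hotfix, even if a shared source file changed.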
For one client I established foundational DevOps architecture, embracing tools familiar to them. I formalized and augmented their existing processes into a cohesive high-level DevOps infrastructure.
My work culminated in a web site that automatically documents, in real time, what version is deployed where.
When I started, only 4 of these products had complete CI and most personnel were unaware of most of these components.
The build process was based on Visual SourceSafe, but had so much built around it that even Microsoft reviewers commented that it was state of the art and went beyond their own processes in some regards. We did not move away from VSS because other systems would not add enough functional value to warrant the investment in rearchitecting the build process. However, when Team System 2010 was released and licensed with MSDN subscriptions, I architected the migration of our VSS databases from the past five years into Team System, replicating the branching and so on – a proof of concept showing that if we had been using Team System for the past five years, this is what it would have looked and performed like. I also rearchitected the build processes and tools to take advantage of Team System features such as rich branching and merging, Team Build, gated check-ins (a check-in is not complete until the build and tests succeed; failure rejects the check-in) and check-in constraints. I also architected a migration path for the aging issue management system, which would synchronize into Team System until we were ready to abandon the old system.
In 2010, I visited a client in London for two weeks. I upgraded one of their solutions, which included a couple of simple VB6 projects, to Visual Basic 2010. I made the projects adhere to FxCop's coding standards and implemented a unit test to demonstrate how this could be used. I introduced an automated build process, which would log build breakages and coding-standards violations. This enabled them to hand over their source to a development unit in India, with the unit tests demonstrating how the solution works. The requirement could be set that the build is clean and the unit tests still pass when their work is complete, and that new functionality must be covered by further unit tests. I trained them in the changes to the development environment since Visual Basic 2003 and coached them in modern OO engineering practices and how to resolve coding-standards violations. The project manager was quite blown away with what I achieved in two weeks.
In 2009, my client wished to internationalize their product, primarily to reach
into the Chinese market. With such a huge legacy system, it was not feasible to
find each text that was embedded in the code and change it to use a resource;
this would also have made it difficult for developers to maintain the code.
Instead I architected a dictionary-based translation system. This combined
exact phrases and their translations with patterns that, when matched, produced
a translation encompassing the variable parts of a message.
I created a tool to extract captions and messages from the source code to be
consolidated into a dictionary that could be translated. Forms and messages
were translated just before they were shown, as were logged messages, which
remained in English in the database for easy support. The translation
infrastructure kept track of missing translations that arose from new messages
and from complex message assembly occurring across multiple procedures. These
missing translations could be reviewed and merged into the dictionary.
It was also possible for clients to customize their translations for region- or
company-specific purposes. We also had to deal with culture-aware date and
decimal formats for input and output, while keeping machine data in invariant
forms that could be passed between machines without corruption by incompatible
locale settings.
I provided a pragmatic and elegant solution that took 10% of the original estimate to develop.
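The dictionary lookup described above can be sketched as follows. The phrases, patterns and French translations here are invented for illustration; the real system was far richer:

```python
# Sketch of dictionary-based translation (all entries invented): exact phrases
# are tried first; otherwise patterns with capture groups pick out the variable
# parts of a message and substitute them into the translated template.
# Untranslated messages are recorded for later review and merging.

import re

EXACT = {"Save changes?": "Enregistrer les modifications ?"}
PATTERNS = [
    (re.compile(r"^(\d+) records updated$"), r"\1 enregistrements mis à jour"),
]

def translate(message, missing):
    if message in EXACT:
        return EXACT[message]
    for pattern, template in PATTERNS:
        if pattern.match(message):
            return pattern.sub(template, message)
    missing.append(message)   # track untranslated messages for review
    return message            # fall back to the original text

missing = []
print(translate("Save changes?", missing))       # exact-phrase hit
print(translate("42 records updated", missing))  # pattern hit, keeps the "42"
print(translate("Unknown message", missing))     # falls back, logged as missing
print(missing)  # ['Unknown message']
```

The fallback path is what made the system maintainable: developers kept writing plain English, and the missing-translation log drove the next round of dictionary updates.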
In 2011, another client wished to internationalize their product, primarily for France. I was able to apply my previous experience to fit into their architecture.
I've had a lot of involvement in analyzing running applications, particularly focusing on performance. With systems spanning multiple machines, traditional text-file logging was too hard to comprehend, so I created customized logging code that includes the computer name, process name and thread id. This is collated and displayed on an interactive timeline showing the procedure stacks, where the user can zoom in to the area of interest. Color highlighting helps visualize occurrences of procedures and database queries significant to the task at hand. This tool has been invaluable in analyzing performance across multiple machines. It has also presented the exact state, and clearly revealed the cause, of a deadlock condition caused by multiple processes. Using this tool, we identified performance-improvement candidates and could project the gains to be made by eliminating them. This delivered a 7% performance improvement across the product.
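The collation of per-machine logs into one timeline can be sketched like this. The record format, machine names and events are invented for illustration:

```python
# Sketch of multi-machine log collation (format and data invented): every
# record carries machine, process and thread tags, so logs from all machines
# can be merged into a single timeline ordered by timestamp, from which
# per-thread procedure stacks can be rebuilt.

from collections import namedtuple

Record = namedtuple("Record", "time machine process thread event detail")

def merge_timeline(*logs):
    """Merge per-machine logs into one list ordered by timestamp."""
    return sorted((r for log in logs for r in log), key=lambda r: r.time)

web = [
    Record(1.00, "WEB01", "w3wp", 12, "enter", "PlaceOrder"),
    Record(1.40, "WEB01", "w3wp", 12, "leave", "PlaceOrder"),
]
db = [
    Record(1.05, "DB01", "sqlproxy", 7, "enter", "InsertOrder"),
    Record(1.35, "DB01", "sqlproxy", 7, "leave", "InsertOrder"),
]

for r in merge_timeline(web, db):
    print(f"{r.time:.2f} {r.machine}/{r.process}:{r.thread} {r.event} {r.detail}")
```

Pairing the enter/leave events per (machine, process, thread) key reconstructs the call stacks; the interactive timeline is essentially a zoomable view over this merged stream.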
In 2010 I wrote my first installer using Windows Installer XML (WiX), a free installer platform from Microsoft that provides low-level access to Windows Installer features. This gave me a simple template for professional installers, suitable for technical or end-user consumption.
During development, prior to having installers available, my client's product had many bits and pieces that needed to be installed on development computers to make it work. I built a tool with a plug-in architecture for updating a machine with the latest components. This is now used on developer, tester and client machines through one intuitive wizard. While it runs, the progress page shows several items being refreshed simultaneously. I used standard controls with custom drawing for fun touches such as the pie-ball progress indicators in a standard list control.
In 2008 I reviewed a client's installation process, to make the product work on Windows Server 2008 R2, Windows 7 and 64-bit platforms. As part of this I created three intuitive wizards to automate several pages of manual installation instructions. One of these was the database maintenance application, which I upgraded from VB6, rewriting the user interface. I put the most important functionality up front, with the more technical options tucked out of the way of inexperienced users. Previously, users were confronted with a scary technical tool for upgrading the database, with which each database had to be upgraded individually. As the QA team had hundreds of databases, several hours of manual effort were required each time new database changes were made. I provided a solution that allowed the simultaneous upgrade of as many databases as desired. Database upgrades could be run by one person, taking 10 minutes of attention and an hour to run in the background. With this project taking around a month, the investment was returned within a few months for our QA team. My client's customers were most impressed with the new look and feel of the installation processes.