Windows Servicing Platform
My last four years at Microsoft were spent working on the servicing stack used to install and update Windows system binaries. The stack underpins everything from Windows Update to setup.exe to DLL registration. This team gets the credit for my philosophy of exponential over iterative improvements. I vividly remember being faced with one such task: a Windows 7 PC that had been installing updates for two years could see its repository of OS files grow from an initial 4GB to well over 200GB. With the move toward tablets, devices, and SSDs where disk space comes at a premium, we were tasked with getting that two-years-later OS footprint down to 10GB. A daunting task, to say the least. Bundled into the first update for Windows 8 was our new code, which brought most PCs down to under 8GB. Out of diligence to our customer base, we even back-ported the changes into an update for Windows 7, delighting customers worldwide who woke up to find hundreds of gigabytes of new free space, as if by magic.
Along with new challenges like reducing footprint, improving stability, and handling app-store installs comes the dreaded world of legacy support. For over a year I owned Windows' Fusion technology, which promotes the use of assembly manifests to avoid DLL hell during registration. The test footprint for this space was so enormous that our labs couldn't handle the execution. In response, I put myself on the leading edge of coverage-based smart testing. Using code-coverage runs, I cataloged which tests executed which code blocks; a test would run if and only if one of the blocks it exercised had changed. The result was an 80% reduction in test execution for the legacy framework, with only a 6% delay in detecting regression bugs. A huge win for this space.
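The selection logic itself is simple enough to sketch. Here is a minimal Python version, assuming a coverage map and a changed-block set as inputs; the real system was built on internal coverage tooling, so the data shapes and block IDs below are illustrative, not the shipped format.

```python
# Coverage-based test selection: run a test only if a code block it
# exercises has changed. The coverage map and changed-block inputs are
# hypothetical stand-ins for the internal tooling described above.

def select_tests(coverage_map: dict[str, set[str]],
                 changed_blocks: set[str]) -> set[str]:
    """Return the tests whose covered blocks intersect the changed set.

    coverage_map: test name -> set of code-block IDs it executes,
                  built from instrumented code-coverage runs.
    changed_blocks: code-block IDs touched by the current change.
    """
    return {test for test, blocks in coverage_map.items()
            if blocks & changed_blocks}

if __name__ == "__main__":
    coverage_map = {
        "test_manifest_parse": {"fusion/parse.c:blk1", "fusion/parse.c:blk2"},
        "test_sxs_install":    {"fusion/install.c:blk7"},
        "test_dll_redirect":   {"fusion/parse.c:blk2", "fusion/redirect.c:blk3"},
    }
    changed = {"fusion/parse.c:blk2"}
    # Only the two tests that exercise the changed block are selected.
    print(select_tests(coverage_map, changed))
```

The 6% detection delay comes from what a map like this can't see: tests whose outcome depends on code they happened not to execute during the coverage run.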
Microsoft Software Licensing
Also known as the impossible task of trying to stop the physical and administrative owner of a computer from hacking your bits - and my first big-time software job. While in Microsoft's Software Licensing org, I helped develop countless technologies to harden software against tampering and to detect and prevent piracy, including anti-tampering measures, obfuscation, inbox bit signatures, and validation pipelines.
Amongst my favorites was the pkey2009 algorithm, which became the new company-wide encryption standard for product keys. Our goal was both to stay ahead of decryption technologies and to speed up product key acceptance during activation - a process that could take over a minute in certain situations under the existing pkey2005 algorithm. As an SDET, this was an interesting problem: it needed a test framework that could serve both penetration testing and performance testing.
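In rough terms, the framework had to answer two questions about the same entry point: is it fast, and does it reject garbage? A minimal sketch of that dual harness, assuming a hypothetical validate_key() entry point (the shipped pkey2009 implementation and its API were internal):

```python
import random
import string
import time

# validate_key() is a hypothetical stand-in for the product-key
# validation entry point; the real pkey2009 API was internal.
def validate_key(key: str) -> bool:
    return False  # placeholder so the harness runs standalone

def time_validation(keys, runs=100):
    """Performance side: report the worst average acceptance latency."""
    worst = 0.0
    for key in keys:
        start = time.perf_counter()
        for _ in range(runs):
            validate_key(key)
        worst = max(worst, (time.perf_counter() - start) / runs)
    return worst

def fuzz_validation(trials=10_000):
    """Penetration side: malformed keys must be rejected, never crash.

    With a real key space, a random collision with a valid key is
    astronomically unlikely, so any acceptance here is a bug.
    """
    alphabet = string.ascii_uppercase + string.digits + "-"
    for _ in range(trials):
        key = "".join(random.choices(alphabet, k=random.randint(0, 40)))
        assert validate_key(key) is False

if __name__ == "__main__":
    sample = ["XXXXX-XXXXX-XXXXX-XXXXX-XXXXX"]
    print(f"worst latency: {time_validation(sample):.6f}s")
    fuzz_validation()
```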
Another interesting challenge was our hardware ID software suite. This was the API base used to determine whether the same product key was being used on multiple machines. Key problems in test included virtualizing hardware for automated testing of a system designed to detect tampering through virtualized hardware, and building a physical test environment that emulated a large enough sample of the real-world hardware found across the entire Windows install base (hundreds of millions of PCs). While it's acceptable for some software to ship with bugs out of the box, anti-piracy technologies must be bug-free, or there will be hacks before the first box leaves the store shelf.
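While I can't share the shipped algorithm, the general shape of such a hardware ID is worth sketching: hash each hardware component separately so two fingerprints can be compared component by component, and call it the same machine if enough components still agree. The component list and match threshold below are illustrative assumptions, not the real values.

```python
import hashlib

# Illustrative component list and threshold; the real set, weighting,
# and tolerance used by Windows activation are internal details.
COMPONENTS = ["cpu_id", "disk_serial", "mac_address", "bios_version"]
MATCH_THRESHOLD = 3  # assumed: tolerate one changed component (e.g. a new disk)

def hardware_id(machine: dict[str, str]) -> list[str]:
    """Hash each component separately so fingerprints can partially match."""
    return [hashlib.sha256(machine[c].encode()).hexdigest()[:8]
            for c in COMPONENTS]

def same_machine(a: list[str], b: list[str]) -> bool:
    """Same machine if enough per-component hashes agree."""
    return sum(x == y for x, y in zip(a, b)) >= MATCH_THRESHOLD

if __name__ == "__main__":
    before = hardware_id({"cpu_id": "GenuineIntel-06-1E", "disk_serial": "WD-123",
                          "mac_address": "00:1A:2B:3C:4D:5E", "bios_version": "A07"})
    after = hardware_id({"cpu_id": "GenuineIntel-06-1E", "disk_serial": "WD-999",
                         "mac_address": "00:1A:2B:3C:4D:5E", "bios_version": "A07"})
    print(same_machine(before, after))  # True: only the disk changed
```

Per-component fingerprinting is also exactly what made testing hard: a virtualization layer that normalizes any one of these identifiers alters the fingerprint, which is the very tampering the system is built to notice.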