Freight Exchange is currently carving out pieces of its core Java-based platform and turning them into Lambda-based microservices.
"We do that mainly because it makes us inherently scalable - we don't really have to worry about scaling anymore; it just takes care of that for us," Hann said.
"We have these functions that do one thing and that's all they do, and we can then call them in lots and lots of different ways depending on what we're using them for."
A basic example given by the CTO is code dealing with PDFs of labels that need to be sent to a carrier.
The labels can be sent in several different ways: via email, uploaded onto Freight Exchange's servers, or pushed to a carrier's server via its API.
"That's a bit of code that we access from lots and lots of different places," Hann said.
That code has been stripped out and turned into a standalone function, so when the labels are ready, the function can send them via the appropriate method for the relevant carrier.
"The system that's generating the document doesn't need to know anything about that - it just says send this label."
The microservices approach has also allowed the company to use different languages to build different parts of its platform. Although the main website and core application are Java-based, Freight Exchange has also used Python and Node.js.
Hann said that in the year and a half or so since the company began getting "material volumes" of bookings through its site, the number of transactions each month has increased by a factor of 100. But, with the increasing reliance on serverless computing, hosting costs have remained about the same.
"I think what's quite important about it is that it actually starts linking your IT costs to your transaction costs," the CTO added.
"Instead of having your service in there churning away 24/7 and it may handle one job or it may handle a million jobs - it makes no difference to your costs - we are now saying we know what it costs to process one job in terms of the resources it consumes."
"You never quite get there - like our core website can't be serverless, it has to be up 24/7 so it doesn't lend itself to that," the CTO added.
"The key thing is: We haven't had to increase the number of servers we run, despite the growth of the business. We have the same number of instances we had now as we had when we started - which is two."
Hann said he believes the cost model of serverless architecture will be a compelling driver of broader adoption by businesses.