
Serverless architecture delivers scale, savings for Freight Exchange

Rohan Pearce | Oct. 23, 2017
Despite a sharp increase in the use of its online platform, the logistics company's CTO says it still requires only two server instances, thanks to its extensive use of Amazon's Lambda service.

"Most businesses have quiet time, whether it's overnight, weekends, during the day," he said.

The CTO said he has seen three broad stages of evolution in the journey towards serverless architecture.

"If you go back, you had on-premises servers: You went off, you bought a new server or servers and you had to persuade the CFO to hand over a load of money, and then you hoped it was going to do you for the next five years or so until you'd depreciated it. If you're a growing business like us, that's kind of difficult to do because you just don't know where you're going to be in two, three, four years' time.

"Businesses then went, 'Okay now let's go into the cloud'. We picked up our servers and moved them off to someone else's data centre. But you still have to plan for capacity; you still have to decide how many [instances], what size, etc. - things like that.

"The upside is, of course, that you can scale them very, very easily. You're not sinking a bunch of money into it, but you've still got to plan and monitor your capacity and make decisions about when's the right time to upgrade."

"As soon as you go serverless you effectively say, 'I now don't care anymore. I'm just going to let the system run as often as it needs to and increase and decrease as I need it to,'" the CTO said.

"That's where you're running functions as code, which is what we do a lot of, but there's also the service element," he added. For example, Freight Exchange uses AWS's NoSQL database service DynamoDB.
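The "functions as code" model the CTO describes can be illustrated with a minimal AWS Lambda handler. This is a sketch, not Freight Exchange's actual code: the freight-quote payload, field names, and rate are hypothetical, and only the `lambda_handler(event, context)` entry-point convention comes from the Lambda platform itself.

```python
import json

def lambda_handler(event, context):
    """Entry point that AWS Lambda invokes once per request.

    Lambda spins execution environments up and down with traffic,
    so the function scales automatically and costs nothing when idle.
    """
    # 'event' carries the request payload; its shape depends on the
    # trigger (API Gateway, SQS, etc.). A hypothetical quote request:
    payload = event.get("body") or {}
    if isinstance(payload, str):
        payload = json.loads(payload)

    weight_kg = float(payload.get("weight_kg", 0))
    rate_per_kg = 0.42  # illustrative rate, not a real tariff

    return {
        "statusCode": 200,
        "body": json.dumps({"quote": round(weight_kg * rate_per_kg, 2)}),
    }
```

Because each invocation is independent, a burst of requests simply means more concurrent environments rather than a capacity-planning exercise.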

"We just consume a database product," Hann said. "How it gets delivered to us is irrelevant; we don't care. We just know it's going to be there. From our point of view we don't have dedicated resources - we simply pay for, crudely speaking, the number of queries we run or the capacity we consume, whichever way you want to look at it.
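The pay-for-what-you-consume billing Hann describes can be sketched as simple arithmetic. The default prices below are illustrative only (roughly AWS's published us-east-1 on-demand rates for DynamoDB request units at the time of writing); actual pricing varies by region and changes over time.

```python
def dynamodb_on_demand_cost(reads, writes,
                            read_price_per_million=0.25,
                            write_price_per_million=1.25):
    """Estimate a monthly bill under a pay-per-request model.

    'reads' and 'writes' are request counts for the month; prices
    are dollars per million requests (illustrative defaults).
    """
    return (reads / 1e6) * read_price_per_million \
         + (writes / 1e6) * write_price_per_million

# e.g. 10 million reads and 2 million writes in a month:
# 10 * 0.25 + 2 * 1.25 = 2.50 + 2.50 = $5.00
```

The point of the model is visible in the formula: a quiet month costs proportionally less, with no idle provisioned capacity to pay for.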

"People now are really getting away from having to worry about any aspect of the infrastructure. Getting it into someone else's data centre was great; you stopped having to worry about power, Internet and stuff like that, but you still had to worry about capacity. Now you're getting away from that."

The downside, he said, is that the move to serverless isn't a lift and shift exercise.

"You actually have to architect things to work in a particular way," he said. "You can't just magically make SAP run serverless - it doesn't work like that; that's just not a thing."

From that perspective it's easiest for organisations that are developing their own applications, Hann said. It's also a great approach for breaking down monolithic legacy systems, he added.
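The monolith-decomposition approach Hann alludes to can be sketched as follows. In a monolith, every route lives in one process that must be scaled as a unit; a serverless re-architecture deploys each responsibility as its own handler. The function names and payloads here are hypothetical.

```python
# In a monolith, both of these would be routes inside one process.
# As serverless functions, each is deployed and scaled independently,
# so a spike in quote requests does not require scaling bookings.

def handle_quote(event, _context):
    """Stands alone: can be carved out of the monolith first."""
    return {"action": "quote", "ok": True}

def handle_booking(event, _context):
    """Migrated separately, on its own release cadence."""
    return {"action": "booking", "ok": True}
```

Peeling endpoints off one at a time like this is why serverless suits the gradual break-up of a legacy system better than a lift-and-shift.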
