Legacy technology is a subject of constant debate. Recently, we’ve seen high-profile privacy intrusions (Cambridge Analytica and Facebook) and system malfunctions (the mass flight cancellations across Europe in March 2018), yet we keep amassing ever more data into increasingly large black boxes (how much data did Equifax hold again?). Decentralized and distributed systems have become topical and popular. When will the supposedly unbreakable black box finally clear the way for new systems?
Like any large-scale shift, the direction is far easier to see than the timing is to predict with any accuracy. Legacy, monolithic, colossal systems are clearly not the future, and more cracks in their foundations will be made public in the months and years to come. However frail these systems may be, shifting any system at scale requires not only years of R&D but also years of validation. But it is happening.
Short of overhauling their information systems outright, we see organizations launching greenfield projects built on modern technologies, often a step removed from their core infrastructure. For instance, look at what Goldman Sachs did with Marcus, which their technology team has been quite open about. They then rolled the Marcus platform into GS Bank, but only after taking it to market as a relatively standalone product. This makes sense – why weigh a novel approach down with archaic systems at the start, when its value is still unproven, if you can simply link it back in later, should it actually prove valuable?
Yet the more greenfield projects an organization launches, the more crucial the question of interconnectedness and interoperability becomes. Tying together these new projects, products, and services, we see a new infrastructure