Duco processes one billion records every four days and hits customer satisfaction highs.
London, 12 December 2019 – Duco, the global provider of self-service data integrity and reconciliation services, has announced it is now checking the integrity of one billion data records every four business days for its clients. As we head into the ‘decade of data’, Duco says this significant increase is a clear sign that the finance sector is now fully embracing the potential of a cloud-based service to aggregate, validate and reconcile data, and is increasingly using intelligent systems for on-demand data integrity and insight.
Launched in April 2013, Duco is a trailblazing cloud-based service that processes data for 80+ financial institutions and insurance companies. In 2018, the firm announced a $28m Series B investment. Duco now reports that 14 of the top 30 banks by total assets have signed up, as well as asset managers totalling $5.5 trillion of assets under management, and many challenger FinTech payment services.
The only fully cloud-based, Software as a Service (SaaS) offering in its space, Duco empowers business users to work with complex data directly. It cleans up and reconciles the data using machine learning and ensures it is safe, reducing operational risk and cost, and enabling firms to focus on using data for revenue growth. A relentless focus on end-user empowerment has resulted in Duco recently reaching a Net Promoter Score (NPS) of 65. NPS is an industry-standard customer satisfaction measure, and scores of 65+ are more typically seen in the most loved consumer brands than in enterprise software.
“When Duco first started in April 2013, we were told that a cloud-based service could never get off the ground in finance, let alone succeed. It took us 20 months to earn trust and receive the first billion records of data,” said Christian Nentwich, CEO of Duco. “But our end user community has exceeded all expectations: we have now received that amount in just under four days, with the rate of submission increasing fast. This is generating a flywheel effect for our machine learning systems, which get better with every line of data. The result is a better work experience for our users, and significant time savings.”