Sales Insider - Archives with Kieran Seaward - Datactics
https://www.datactics.com/category/blog/sales-insider/

Building a Data Fabric with Datactics Self-Service Data Quality
https://www.datactics.com/blog/how-data-quality-can-help-build-a-data-fabric/
Tue, 19 Apr 2022


If you consider data as the lifeblood of your organization, trying to manage it in a static, distributed fashion seems like a challenging and almost futile exercise. Adopting a data fabric or mesh approach is important as it enables better management of data in-motion as it flows throughout the organization. Moreover, it allows for potential to add value through a greater variety of use cases.

Any organisation that values its data as an asset would benefit from a holistic approach to data management. By considering a data fabric implementation, businesses can unlock a more efficient, secure and modernised approach to data analysis and management.

What is a data fabric, and how does it differ from a data mesh?

Data fabric has been defined by Gartner as,

“…a design concept that serves as an integrated layer (fabric) of data and connecting processes. A data fabric utilizes continuous analytics over existing, discoverable and inferenced metadata assets to support the design, deployment and utilization of integrated and reusable data across all environments, including hybrid and multi-cloud platforms.”

Gartner, 2021

Specifically, Gartner’s concept means that both human and machine capabilities can be leveraged so that data can be accessed where it resides.

The term "data fabric" was coined by Noel Yuhanna of Forrester, while "data mesh" was coined by Zhamak Dehghani of the technology consultancy Thoughtworks. Essentially they are two similar but notably different ways of expressing how firms approach, or should approach, their data architecture and data management estate, which usually comprises bought and built tools for data governance, data quality, data integration, data lineage and so on.

Both approaches describe ways of solving the problem of managing data in a diverse, often federated and distributed environment or range of environments. If this seems like a very conceptual problem, perhaps a simpler way is to say that they are ways of providing access to data across multiple technologies and platforms.

In the case of a data fabric, it can assist with the migration, transformation and consolidation required to coalesce data where this meets the business need (for example, in migrating to a data lake, or to a cloud environment, or as part of a digital transformation programme). In its thorough research piece, targeted at data and analytics leaders exploring the opportunity to modernise their data management and integration approach, Gartner has detailed some benefits in a theoretical case study based on a supply chain leader utilising a data fabric:

“…(they) can add newly encountered data assets to known relationships between supplier delays and production delays more rapidly, and improve decisions with the new data (or for new suppliers or new customers).”

Gartner, 2021

Importantly, Gartner does not believe that a data fabric is something that can be built in its entirety, or likewise bought off-the-shelf as a complete solution. In fact, it is quite adamant that data and analytics leaders should be pursuing an approach that pairs best-of-breed solutions commercially available in the market with the firm’s own in-house solutions.

“No existing stand-alone solution can facilitate a full-fledged data fabric architecture. D&A leaders can ensure a formidable data fabric architecture using a blend of built and bought solutions. For example, they can opt for a promising data management platform with 65-70% of the capabilities needed to stitch together a data fabric. The missing capabilities can be achieved with a homegrown solution.”

Gartner, 2021

Besides Gartner, other industry experts have written on the differences between data fabric and data mesh as being primarily about how data is accessed, and by whom. James Serra of EY has said that data fabrics are technology-centric, while data meshes target organisational change.

A data fabric might therefore overlay atop various data repositories, and bring some unification in management of the data. It can then provide downstream consumers of the data – stewards, engineers, scientists, analysts and senior management – with meaningful intelligence.

Data meshes, however, are more about empowering groups of teams to manage data as they see fit, in line with a common governance policy. At the moment, lots of companies employ Extract, Transform and Load (ETL) pipelines to try to keep data aligned and consistent. Data meshes advocate the concept of “data as a product”: rather than simply a common governance policy, data can be shaped into products for use by the business.

The Datactics view on the benefits of a Data Fabric approach

In our experience, there is a wide range of business benefits to adopting a data fabric. Generally, organisations can benefit from a unified data approach as it fundamentally simplifies access to enterprise data and reduces the number of data silos. Having distributed data across an organisation can hinder efficient operations, but by making data accessible to stewards and data engineers across the organisation, businesses can benefit from greater interoperability and, as a result, make better decisions.

In the context of data quality specifically, a data fabric implementation provides the optimum architecture to apply data quality controls to a large volume of critical data assets, helping you achieve a more unified view of your data quality. Monitoring data in transit (as opposed to data at rest) helps teams react more quickly to data quality issues and is a step towards a more proactive data quality approach.

However, a data fabric can create an enterprise-wide demand for uniformity of technologies, which may or may not suit the business needs or business model.
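To make the idea of monitoring data in transit concrete, here is a minimal sketch in Python. The record schema, check names and feed are invented for illustration; this is not the Datactics implementation. A pipeline stage validates each record as it flows past and tallies quality metrics, without blocking or landing the data first:

```python
from collections import Counter
from typing import Callable, Dict, Iterable, Iterator

# Each check is a named predicate applied to a record as it flows through.
Checks = Dict[str, Callable[[dict], bool]]

def monitor_in_motion(records: Iterable[dict], checks: Checks, metrics: Counter) -> Iterator[dict]:
    """Yield records unchanged, tallying pass/fail counts per check as a side effect."""
    for record in records:
        for name, check in checks.items():
            metrics["processed", name] += 1
            if not check(record):
                metrics["failed", name] += 1
        yield record  # data keeps flowing; quality is observed, not blocked

# Illustrative checks on a hypothetical customer feed.
checks: Checks = {
    "lei_present": lambda r: bool(r.get("lei")),
    "country_iso2": lambda r: len(r.get("country", "")) == 2,
}

feed = [
    {"lei": "5493001KJTIIGC8Y1R12", "country": "GB"},
    {"lei": "", "country": "USA"},
]

metrics: Counter = Counter()
downstream = list(monitor_in_motion(feed, checks, metrics))
```

The key design point is that the generator passes every record on, so quality measurement rides along the data flow rather than requiring a separate batch pass over data at rest.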

The Datactics view on the benefits of a Data Mesh approach

Because data meshes prioritise organisational change over the adoption of more technology, the approach is typically favoured by organisations that prefer bottom-up, agile working methodologies to top-down governance. That doesn't mean no new technology will be required to design and deploy a data mesh: each function will have to be able to create and deliver data-as-a-product to an agreed level of quality and in compliance with internal and external standards. A data mesh will also suit teams who do not have their own coders and instead rely on business and subject matter expertise allied to no-code tools for a wide range of data management and data quality operations.

In this case, there is less call for technology uniformity, and more freedom for distributed teams to build systems that meet their own needs, albeit with cross-team and cross-function governance provisions.

Data Fabric and Integration

Gartner explains that a robust data fabric must facilitate traditional methods of data integration, such as data processing and ETL. It must also be capable of supporting all users, from data stewards to business users wanting to self-serve in their data analytics.

By leveraging machine learning, a data fabric can also monitor existing data pipelines and analyse metadata in order to connect multiple data sources from across an organisation. This makes it much easier for a data scientist to interpret the information and improve data analytics.

By its very nature, a data fabric needs to support integration and this is where the Datactics data quality solution can add value when building a data fabric framework.

Data Mesh and Integration

There’s less of a priority on data integration for data meshes; however, interoperability of the distributed data management environments is an absolute must. If components of a data management platform do not interoperate, or have no API connectivity (for example), then it is time to explore alternatives that do!

How the Datactics solution complements Data Fabrics and Data Meshes

As highlighted in this year’s Gartner Magic Quadrant, Datactics is a ‘best of breed’ Data Quality tool – we do Data Quality exceptionally well (ask our clients!). However, Datactics recognizes the fact that Data Quality is only one piece of the overall data management puzzle and data integration is a key component in our delivery process.

In order to help our clients build a data fabric architecture, we must connect easily with other tools. Being able to integrate with other areas of the data management ecosystem is something Datactics does well. Our solution integrates seamlessly with solutions ranging from Data Governance to Data Lineage and Master Data Management.

Integration is fundamental to the design of our platform, which offers frictionless connectivity to other vendor tools via API and other means. We don’t plan on adding data catalogue or data lineage capabilities to the Datactics platform. However, we will connect with existing ‘best in breed’ tools using an open metadata model. This therefore creates an integrated system of best of breed data management capabilities.

Datactics are no strangers when it comes to connecting with a variety of data sources and systems. The very nature of Data Quality means that Datactics needs to connect to data from across a client’s entire estate – including cloud platforms, data lakes, data warehouses, business applications and legacy systems. Connecting to these data sources and systems needs to be robust in order to perform data quality measurement and remediation processes.

How does Datactics approach integration with specialist data management tools?

We appreciate that, when developing or enhancing its data management programme, an organisation will want to integrate a new solution seamlessly with (potentially) multiple other data systems and vendors. This is helped by the abundance of connectivity options available in the Datactics platform for integrating with existing systems and vendors, making it easier for businesses to establish a sustainable Data Fabric.

A good example of where integration can add real business value is the combination of Data Quality and Data Lineage. The automated technical lineage information from Manta provides Datactics with the ‘coordinates’ to point Data Quality rules at a larger volume of critical data elements within a data set. As a result, data quality is rolled out more effectively across an organisation.

Similarly, as Datactics measures data quality in-motion across multiple source systems & business applications, DQ metrics can be visually represented in the excellent metadata model visualisation provided by Solidatus. This allows users to identify the root cause of a data quality issue very quickly and trace the downstream impacts on a client’s business processes.     

Another natural area of integration is between Data Quality and Data Governance systems. Data ownership metadata & data quality rules definitions housed in these systems can be pulled into Datactics via REST API. Meanwhile, metadata on the rules input and data quality metrics on the data assets can be pushed back into the Governance or Catalog system.
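As a hedged illustration of that pull/push pattern (the rule schema and field names below are hypothetical, not an actual Datactics or governance-vendor API), declarative rule definitions pulled from a governance catalogue can be compiled into executable data quality checks, and the resulting metrics shaped into a payload ready to push back:

```python
import json
import re
from typing import Callable, List

# Hypothetical rule definitions as they might arrive from a governance
# catalogue's REST API (the JSON shape here is invented for illustration).
RULE_DEFINITIONS = json.loads("""
[
  {"rule_id": "R1", "element": "isin",  "type": "regex",    "pattern": "^[A-Z]{2}[A-Z0-9]{9}[0-9]$"},
  {"rule_id": "R2", "element": "email", "type": "not_null"}
]
""")

def compile_rule(defn: dict) -> Callable[[dict], bool]:
    """Turn a declarative governance rule into an executable data quality check."""
    if defn["type"] == "regex":
        pattern = re.compile(defn["pattern"])
        return lambda rec: bool(pattern.fullmatch(str(rec.get(defn["element"], ""))))
    if defn["type"] == "not_null":
        return lambda rec: rec.get(defn["element"]) not in (None, "")
    raise ValueError(f"unknown rule type: {defn['type']}")

def run_rules(records: List[dict], defns: List[dict]) -> List[dict]:
    """Apply each rule to every record; return a metrics payload to push back."""
    payload = []
    for defn in defns:
        check = compile_rule(defn)
        passed = sum(check(r) for r in records)
        payload.append({"rule_id": defn["rule_id"], "passed": passed, "failed": len(records) - passed})
    return payload

records = [
    {"isin": "GB0002634946", "email": "ops@example.com"},
    {"isin": "BAD", "email": ""},
]
metrics_payload = run_rules(records, RULE_DEFINITIONS)
```

In a real integration the `RULE_DEFINITIONS` list would come from a REST GET against the governance system and `metrics_payload` would be POSTed back; the round trip shown here is only the pure transformation in between.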

Other systems Datactics connects with are Business Intelligence and visualisation tools, ticketing systems and Master Data Management systems. For instance, the software ships with out-of-the-box connectivity to off-the-shelf tooling such as Qlik, Tableau, and Power BI on the visualisation side, and Jira and ServiceNow on the ticketing front.

Next steps

If you are developing a data management framework, exploring data fabric or data mesh architecture, or simply seeking to understand open integration of best-of-breed data quality technologies and would like to hear more about our integration capabilities, please reach out to Kieran Seaward or contact us.

Meet Datactics at the DIT New York RegTech Roadshow!
https://www.datactics.com/blog/marketing-insights/from-uk-to-us-dit-regtech-roadshow/
Fri, 19 Feb 2021


Datactics is part of the cohort of companies joining the DIT's virtual RegTech-themed roadshow this week.

The DIT’s RegTech Roadshow takes place virtually with the aim of providing ten innovative UK-based RegTech companies with the opportunity to meet industry stakeholders, regulators and potential partners. 

Datactics will be visiting New York virtually alongside Acin, AMPLYFI, AutoRek, ComplyAdvantage, Finreg-E, FNA, Solidatus, Suade Labs, and TAINA Technology Limited. The delegation this year is one of the most diverse ever, with representation from every region across the UK, including three female-founded companies! On top of this, the delegation has serviced nearly every Tier 1-2 financial institution and raised a staggering $100 million in funding combined. Finally, each of the 10 companies offers a unique solution built to help firms conduct their business more safely, easily, and equitably.

We thought we would sit down with Kieran Buchanan, Business Development Executive at Datactics, to find out more about the roadshow and why Datactics is delighted to be involved. 

As a US RegTech roadshow, this event will bring together US-based industry stakeholders, regulators and potential partners – this is a market Datactics is keen to establish itself in even further, isn't it?

It most certainly is; I have seen some large banking organisations with a presence at this roadshow, so it's a priority for us to introduce our services, as we already have some use cases in the US market. This will let them know what we provide in the space of data quality and data management infrastructure as a whole. The US is a significant market, bursting with opportunities worth investing time and effort in.

What conversations do you hope to have with the industry stakeholders, regulators, and potential partners? 

There are a lot of innovation-focused attendees, so I think there will be a natural introduction to who Datactics is and how we can bring value. We're particularly looking forward to demonstrating how our solution can help organisations that are keen to develop a good data quality culture. Our Self-Service Data Quality (SSDQ) platform will be fantastic for organisations that want to get ahead of the curve by ensuring they have the proper building blocks in place when it comes to the quality of their valuable data assets.

Throughout the week, what would be the best way for attendees to get in touch with you? 

I have a meeting set up on the event platform, Meetaway, so I'll be contactable via video call there. My profile is on there, fully populated! You can also get in touch with me via email: kieran.buchanan@datactics.com and, of course, via LinkedIn.

There are some great panel talks happening this week! Are there any talks you are keen to go to? 

There’s a talk on the future of financial services, RegTech and compliance which I am keen to attend. Another interesting one is ‘How US regulators are adapting to new technologies‘, as it reflects the work we do with SSDQ. We are currently helping business users at financial services organisations get their data right for a wide range of regulatory reports, whether that be BCBS 239, MiFID II, or country-specific deposit guarantee scheme reporting.

If you are attending the event, feel free to reach out to Kieran at any time via email, LinkedIn or Meetaway.

Data will power the next phase of the economy: DMS USA lookback – Part 1
https://www.datactics.com/blog/sales-insider/data-powered-economy/
Wed, 04 Nov 2020


Last September, Kieran Seaward, our Head of Sales, delivered a keynote at the virtual DMS USA on a data-powered economy. In his keynote he unpacked:

  • The impact of COVID on shifting business priorities, focussing on how existing data management processes and data architectures have changed – as well as problems encountered along the way 
  • Case studies demonstrating best practice approaches to managing data-driven processes and how to create impact and add value in a post COVID era

We are still living with the many challenges presented by COVID-19, including a wide range of changes to the way business can be conducted. At Datactics we have been really encouraged that engagement with the market is still strong; since March, and the start of many lockdowns, we've conducted many hundreds of calls and meetings with clients and prospects to discuss their data management and business plans. This article is based on the key findings from these calls and reflects the priorities of many data-driven firms.

Data will power the next phase of the economy – good or bad?

As global economies look to get back on their feet, it’s clear that data quality is more important than ever before. Whether it’s data for citizen health, financial markets, customer or entity data, or any other type, economies and firms will either be powered by a solid platform of high quality, reliable data, or they are going to grow more slowly, built on poor data quality. It’s not an overstatement to say that what the next phase of economic growth looks like will rely entirely on the decisions and actions we take now.

Kieran’s keynote was underpinned by the fact there’s never been a greater need to get your data quality in order. With some firms really grasping this opportunity, change is visible. However, many have encountered the same old problems with the state of their data, and this can inhibit change.

Is it necessary to get data quality under control?

Kieran highlighted that, pre-pandemic, MIT Sloan Management Review reported that poor-quality data costs firms on average between 15% and 25% of revenue. This figure reaffirms that there is no better time than now to improve the data quality foundation. The long-term future looks bleak if it is built on the quality of data that we have right now!

What is the importance of a foundation of good data quality?

Both before and since the financial crisis of 2008, there have been many conversations reiterating the importance of data quality foundations. Moving forward particularly after COVID-19, a data quality foundation can not only help you get off to a positive start amidst uncertainty, it can also ensure resilience for the future.

Kieran referred to the many conversations that he has had with wealth management, asset management and insurance firms this year. Following hot on the heels of the most innovative investment banks, more and more of these firms are seriously considering what a data governance and data quality framework should deliver.

Redesigning those data architectures that aren’t delivering

Kieran went on to detail that a large number of firms have told him that the ‘Waterfall’ process of logging tickets with software development and IT to improve data quality simply isn’t moving quickly enough for them, or with the right level of accuracy; as a result, they’re evaluating DataOps processes and tooling. Stuart Harvey, CEO at Datactics, spoke in depth on this at the European Summit, about having every part of the data management journey at the table on data issues, and this is something one client we signed during the UK lockdown is now putting in place as their data management foundation.

On the growth of DataOps-style approaches, Kieran said:

We’ve been encouraged by the number of firms we’ve spoken to who are keen to start moving away from script-based rules. They’re federating data quality management out from centralised IT and into the business teams so that business users – who already know the data and what good looks like – can self-serve for high-quality, ready-to-use data in their line of work. This covers everything from building rules themselves, through interrogating the data itself in dashboards, right down to making or approving fixes in the data within their domain.

What comes first? Data Governance or Data Quality?

Kieran recently wrote a blog on this topic, as he noted that it comes up in every client engagement! To illustrate the importance of data quality, he gave an analogy:

Imagine you are a kid, and your mother has asked you to tidy your bedroom. Your mum returns a few hours later and the room is still a mess – would you get away with saying “Yes Mum, but look – I’ve made an inventory of where everything should be”? I imagine the response you’d get would be something along the lines of “well done, but can you please tidy your room now?!”

Kieran used this story to draw attention to the fact that it is vital to consider data quality first and foremost, as having a large inventory with disorganised data will lead to key data being either inaccessible or difficult to find.

Taking a holistic approach to data management

There are a number of building blocks that make up a holistic approach to data management, including data quality, master data management, business glossary/data dictionary, metadata management, and so on. As Kieran reiterated in his keynote, using intelligent integration via APIs it is now possible to build and orchestrate the next generation of data management platform by leveraging ‘best of breed’ technology components from a multitude of capable vendors. For example, recently we have explored a more orchestrated approach with vendors like Solidatus, where Datactics provides the Data Quality and Master Data Management pieces, and Solidatus provides the Governance and Lineage pieces.

Start small, think big

Kieran’s session reinforced that if you are exploring introducing new capability or functionality, completing a proof of concept is a low-risk, well-proven means of demonstrating the efficacy of the software and associated professional services. If the scope is well considered and defined around a specific use case that is causing pain, the quick turnaround of results has the potential to create real impact, which ultimately makes the business case a lot stronger.

This is how Datactics has engaged with the vast majority of our clients, and we have successfully delivered remote proof-of-value projects during lockdown that are now being rolled into production.

To return to the analogy about tidying your room: once you start to clean your desk or wardrobe, cleaning the rest of the room quickly seems less daunting, if you break it into chunks.

In conclusion, at Datactics, we believe that it doesn’t matter where you start, so long as you make the start!

It is necessary to get your data quality under control, and the importance of data quality foundations has never been greater. Our platform can help you take away the hassle of internal “roadblocks” in IT administration, and we can remove the headache of manually reviewing data records that fail or break rules.

Kieran Seaward, Head of Sales at Datactics


Our platform can help you – get in touch today.

In the next blog, we will unpack more themes from Kieran's keynote, 'A Data-Driven Restart': taking a long-term view, seeking impact that matters, and finally the budget and implementation approach.

If you want to watch Kieran's keynote in full, you can do so by checking out this link.

Data Governance or Data Quality: not always a ‘chicken & egg’ problem
https://www.datactics.com/blog/sales-insider/market-insights-data-governance-or-data-quality-not-always-a-chicken-egg-problem/
Thu, 18 Jun 2020

In this blog with Datactics' Head of Sales, Kieran Seaward, we dive into market insights and the sometimes-thorny issue of where to start.


Data Governance or Data Quality is a problem data managers and users will fully understand, and Kieran’s approach to this is influenced by thousands of hours of conversation with people at all stages of the process, all unified in the desire to get the data right and build a data culture around quality and efficiency. 

Following hot on the heels of banks, we are seeing a lot of buy-side and insurance firms on the road to data maturity, taking a more strategic approach to data quality and data governance, which is great. Undertaking a data maturity assessment internally can surface some much-needed areas for improvement in an organisation's data, from establishing a data governance framework to updating existing data quality initiatives and improving data integrity.

From what I hear, the “data quality or governance first?” conundrum is commonly debated by most firms, regardless of what stage they are at in a data programme rollout.

Business decisions are typically influenced by the need to prioritise either 'top-down' data governance activities, such as creating a data dictionary and business glossary, or 'bottom-up' data quality activities, such as measurement and remediation of company data assets as they exist today in source systems. However, achieving a data-driven culture relies on both of these initiatives existing concurrently.

In my opinion, these data strategies are not in conflict but complementary and can be tackled in any order, so long as the ultimate goal is a fully unified approach.  

I could be biased and say that market insights derived from data quality activities can help form the basis of the definitions and terms typically stored in governance systems:


Figure 1 – Data Quality first

However, the same can be said inversely: data quality systems can benefit from having critical data elements defined, and metadata definitions to help shape the measurement rules that need to be applied:


Figure 2 – Data Governance first

The ideal complementary state is that of Data Governance + Data Quality working in perfect unison, i.e.:

  • a Data Governance system that contains all identified critical data elements, as well as definitions that help determine which Data Quality validation rules are applied to ensure the elements meet those definitions;
  • a Data Quality platform that validates data elements and connects to the governance catalogue to understand who the responsible data scientist or data steward is, in order to push data to them for review and/or remediation of data quality issues. The quality platform can then push data quality metrics back into the governance front-end, which acts as the central hub/visualisation layer – either rendering the visuals itself or through connectivity to third parties such as Microsoft Power BI, Tableau, or Qlik.


Figure 3 – The ideal, balanced state
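A minimal sketch of the steward hand-off described in the second bullet above (the catalogue mapping and failure records are made up for illustration, since every governance system has its own schema): failing records are routed to whichever steward the governance catalogue names as responsible for the affected data element.

```python
from collections import defaultdict
from typing import Dict, List

# Hypothetical governance catalogue: critical data element -> responsible steward.
CATALOGUE: Dict[str, str] = {"lei": "alice@example.com", "sector": "bob@example.com"}

def route_failures(failures: List[dict]) -> Dict[str, List[dict]]:
    """Group data quality failures by the steward responsible for the element."""
    queues: Dict[str, List[dict]] = defaultdict(list)
    for failure in failures:
        steward = CATALOGUE.get(failure["element"], "unassigned")
        queues[steward].append(failure)
    return dict(queues)

failures = [
    {"record_id": 7, "element": "lei", "issue": "missing value"},
    {"record_id": 9, "element": "lei", "issue": "checksum invalid"},
    {"record_id": 3, "element": "currency", "issue": "not ISO 4217"},
]
queues = route_failures(failures)
```

Elements the catalogue does not cover fall into an "unassigned" queue, which is itself a useful governance signal: it flags critical data elements that still need an owner.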

In the real world, this decision can’t be made in isolation of what the business is doing right now with the information they rely on:

  • Regulatory reporting teams have to build, update and reconfigure reports in increasingly tighter timeframes.
  • Data analytics teams are relying on smarter models for prediction and intelligence in order to perform accurate data analysis.
  • Risk committees are seeking access to data for client, investor, and board reporting.

If the quality of this information can’t be guaranteed, or breaks can’t be easily identified and fixed, all of these teams will keep coming back to IT asking for custom rules, sucking up much-needed programming resources.

Then, when an under-pressure IT team can't deliver in time, or the requests conflict with one another, the teams will resort to building in SQL or trying to do it via everyone's favourite DIY tool, Excel.

Wherever firms are on their data maturity model or data governance programme, data quality is of paramount importance and can easily run first, last or in parallel. This is something we are used to helping clients and prospects with at various points along that journey, whether it’s using our self-service data quality & matching platform to drive better data into a regulatory reporting requirement, or facilitating a broad vision to equip an internal “data quality as-a-service” function.

My colleague Luca Rovesti, who heads up our Client Services team, goes into this in more depth in Good Data Culture.

I’ll be back soon to talk about probably the number one question thrown in at the end of every demo of our software:

What are you doing about AI?

Click here for the latest news from Datactics, or find us on LinkedIn, Twitter or Facebook.

Unlocking Data Quality and Value to Power Innovation & Insight
https://www.datactics.com/blog/sales-insider/data-management-summit-virtual-unlocking-data-value-with-kieran-seaward/
Wed, 29 Apr 2020

At last week’s Virtual Data Management Summit, the A-Team’s CEO Angela Wilbraham sat down for a Q&A on all things data quality and unlocking data value with our Head of Sales Kieran Seaward.

Topics covered in this interview are:

– The biggest issues in data management right now (0:31)

– Why data quality is such a significant issue in the current climate (3:00)

– The impact of AI on the landscape of data quality, especially in AML & KYC (5:13)

– Being optimistic on the outlook for 2020 (8:25)

The Data Management Summit explores how financial institutions are shifting from defensive to offensive data management strategies, to improve operational efficiency and revenue-enhancing opportunities. 

Putting the business lens on data and deep-diving into the data management capabilities needed to deliver on business outcomes.

Topics include: 

  • Shifting from defensive to offensive – aligning data with business strategy for revenue optimisation and operational efficiency
  • Establishing trust – Embedding data ethics into your data strategy
  • How to turn data lineage from a regulatory response into a business advantage
  • Reviewing the regulatory landscape  and future of regulatory reporting in Europe
  • Migrating to the cloud to create new capabilities for the business
  • The promise and potential of AI
  • Unlocking Data Value
  • Client onboarding – how developments in tech and automation can help optimise entity data management and improve KYC
  • DataOps methodology – why it’s the next big thing and significant for financial services

A natural communicator, Kieran Seaward has over 15 years’ experience in technical sales, including at First Derivatives, across a wide range of ERP, data and automation solutions. Kieran is rightly known for forming strong and lasting customer relationships. He began his career by undertaking a BSc (Hons) degree in Technology and Design from Ulster University. 

Having won numerous awards in a sales role in jewellery, it didn't take long for Kieran to realise he had a skill for sales – as he grew in his expertise and his patter, he ended up at Datactics. One of the most exciting things about his role is the huge amount of enlightenment it brings to clients. He enjoys working towards solving a real issue and offering the client a solution that will transform the way they view data.

Click here for the latest news from Datactics, or find us on LinkedIn, Twitter or Facebook.

Kieran Seaward, Sales

