Brendan McCarthy, Author at Datactics https://www.datactics.com/author/brendanmccarthy/

Uncovering the Root Causes of Data Quality Issues https://www.datactics.com/blog/uncovering-the-root-causes-of-data-quality-issues/ Mon, 25 Sep 2023

We all know data quality issues when we see them. They can impair an organization's ability to work efficiently and comply with regulations, and they make it harder to generate any real business value from messy data. 

Rather than simply measuring and patching up issues, we help our clients understand why issues are surfacing by identifying the root cause and fixing it at the source. 

To some, this concept may seem like a pipe dream. But many of our clients are recognizing the true value that this brings. 

Recently we have been exploring industry opinion on this, with Kieran Seaward taking to the stage at FIMA US in Boston earlier this year to host two roundtables on the topic: “Uncovering the Root Causes of Data Quality Issues and Moving Beyond Measurement to Action”. 

 During these roundtable discussions, data management practitioners from diverse backgrounds and industries (and with considerable experience in the field) shared their insights on dealing with poor data quality. Participants had the opportunity to learn from each other’s experiences and explore the actions they have taken to address this challenge. 

We were grateful for such candid and open conversation around what is a challenging topic. We wanted to share some of the key themes and insights that resonated with us during the sessions to help you get started:

1. Proactive (not reactive) Data Quality Management

Historically, data quality management has been viewed as a reactive measure: fixing bad data that has negatively impacted a report or a decision. Now, with advances in capabilities and technology, firms should look to become proactive and try to prevent issues from occurring in the first place – this will help restrict downstream impact on critical data elements. 

In other words, prevent the fire from starting rather than stopping the spread. 

But how can you achieve this? There are a number of key steps. 

  • Firstly, define data quality metrics, establish baseline measurements, set targets for each metric, and implement a monitoring process. 
  • Then, conduct regular assessments to measure progress and develop improvement processes. 
  • Finally, implement reporting and visualization mechanisms to communicate data quality measurements – highlighting the impact (and ROI) to business teams and senior leadership – and continuously iterate and refine these as necessary. 
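These steps can be sketched in miniature. The metric name, records, and target thresholds below are hypothetical – invented for illustration, not part of any Datactics product – but they show the shape of measuring a completeness metric against a baseline target:

```python
# Hypothetical sketch: measure a completeness metric against baseline targets.
# Records, metric names, and thresholds are all illustrative.

records = [
    {"id": "1", "email": "a@example.com", "country": "US"},
    {"id": "2", "email": "", "country": "US"},
    {"id": "3", "email": "c@example.com", "country": ""},
]

def completeness(rows, field):
    """Share of rows where the given field is populated."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

# Targets agreed with the business form the baseline for each metric.
targets = {"email": 0.9, "country": 0.9}

# Compare each measurement against its target to flag metrics needing attention.
report = {
    field: {
        "score": completeness(records, field),
        "meets_target": completeness(records, field) >= goal,
    }
    for field, goal in targets.items()
}
```

In practice, a report like this would feed the reporting and visualization layer that communicates impact to business teams and senior leadership.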
2. Automation of manual processes

Automation plays a vital role in modernizing approaches to data quality management. Gone are the days when data scientists must spend 80% of their time wrangling with data to ensure it is fit for purpose. By using advanced techniques such as artificial intelligence, machine learning, and statistical modeling, practitioners can reduce the manual effort of boring, repetitive tasks and become more proactive in how they manage data quality. 

Some technologies in the market offer automated profiling and recommend data quality rules for validation, cleansing, and deduplication based on the column headers (metadata) as well as the underlying values. Performed manually, these tasks often require complicated programming scripts, scale poorly, and can take considerable time. By automating this process, technical resources can be reallocated to more value-adding activities.
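As a rough illustration of the idea – not a description of any specific vendor's implementation – profiling heuristics might combine column-name metadata with sampled values. The rule names and patterns here are invented for the sketch; production tools layer ML on top of heuristics like these:

```python
import re

# Hypothetical sketch: recommend data quality rules from column metadata and values.
# The heuristics and rule names are invented for illustration.

def recommend_rules(column_name, sample_values):
    rules = []
    name = column_name.lower()
    # Metadata heuristic: the column header hints at the expected format.
    if "email" in name:
        rules.append("matches_email_pattern")
    if "date" in name:
        rules.append("parses_as_date")
    # Value heuristic: if every sampled value looks numeric, suggest a numeric check.
    if sample_values and all(re.fullmatch(r"-?\d+(\.\d+)?", v) for v in sample_values):
        rules.append("is_numeric")
    return rules
```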

3. Root cause analysis of Data Quality issues

With an effective data quality measurement and monitoring process in place – which is by no means a trivial exercise to implement – you can start to identify trends of data quality breaks and act upon them. 

As a reference point, it's helpful to consider the Five Ws: 

  • What data quality break has occurred? 
  • Where has the issue occurred or surfaced? Has it occurred at multiple points in the journey or propagated through other systems? 
  • When is the break occurring? 
  • Who is responsible for this element of information? Who is the data steward or data owner? 
  • Why is it occurring? Hopefully, the previous four questions have shed some light on the reasons for the issue. 

If you can accurately answer each of these questions, you are in a good position to resolve that data quality issue. 

AI can also help users to continuously monitor data quality breaks over time. By doing so, you can generate a rich set of statistics that enables analysis of data quality breaks and identify relationships between issues. This helps users predict future breaks, predict break resolution times, and understand the downstream impact of breaks.
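A minimal sketch of that monitoring idea, with a break-log format invented for illustration: simply aggregating logged breaks per rule makes recurring offenders – the likeliest candidates for root cause analysis – easy to spot, before any predictive modelling is applied:

```python
from collections import Counter
from datetime import date

# Hypothetical sketch: aggregate a log of data quality breaks to surface trends.
# The break-log format is invented for illustration.

break_log = [
    {"rule": "email_format", "day": date(2023, 9, 1)},
    {"rule": "email_format", "day": date(2023, 9, 2)},
    {"rule": "email_format", "day": date(2023, 9, 2)},
    {"rule": "missing_country", "day": date(2023, 9, 1)},
]

# Breaks per rule: a rule that keeps firing points at an upstream root cause.
breaks_per_rule = Counter(b["rule"] for b in break_log)
worst_rule, worst_count = breaks_per_rule.most_common(1)[0]
```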

4. Remediation

Remediation is uniquely important in the data management process because it does something about the problems being reported. With a comprehensive understanding of where and why breaks are occurring, you have the opportunity to put out that fire and fix your broken data. 

Some people are understandably hesitant about fixing data, but without acting, the rest of the process remains passive. 

We do not believe in handing off the responsibility to another team or system – but instead taking action to deal with and fix the breaks that have surfaced. 

We are currently working with a customer to fix those breaks at source, using a federated approach to solving data quality issues in the business by utilizing SME knowledge of what good looks like. 

This part of the story, where you are doing something proactive about making the data better, is the element that is often missing from solutions or processes that spend all their time noticing breaks or passively monitoring systems. 

 

Our recent engagement with industry experts at FIMA US in Boston reinforced the significance of proactive data quality management. With advancements in capabilities and technology, firms can now take a proactive approach. By defining data quality metrics, automating manual processes, conducting root cause analysis, and implementing remediation strategies, businesses can enhance the quality of their data and maximize its impact. 

We believe that taking ownership of data quality and embracing a proactive approach is the key to harnessing the full potential of your data for business success. In a world where data is a critical asset, it's essential to move beyond merely noticing data quality breaks and to start actively working towards making data better. 

 

Kieran will be discussing data remediation at the Data Management Summit in New York on 28th September, speaking on a panel entitled ‘How to scale enterprise data quality with AI and ML’. You can also contact us to learn more here. 

 

Why should you care about data quality? https://www.datactics.com/blog/why-should-you-care-about-data-quality/ Wed, 27 Apr 2022

Why you should care about data quality

Data quality may not be viewed as a particularly attractive topic of conversation at a dinner party or a company event, yet it is a critical step for organisations to maximise the value of their information and subsequent decision-making. It is therefore imperative that data leaders make every attempt to raise awareness and educate on why data quality is so important to an enterprise’s success – and ultimately make people care about data quality.

To begin, what exactly is meant by data quality?

High-quality data is defined as data that is fit for purpose, reliable and trustworthy. While organisations may maintain different quality standards, the Data Management Association UK (DAMA) proposes that, in order to be considered high quality, data must satisfy the following six dimensions: accuracy, completeness, uniqueness, consistency, timeliness and validity. These are often viewed as the foundations of data quality; however, it can be argued that technical measurements and standards are determined by the use case.
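Three of these dimensions lend themselves to a quick illustration. The column values and the validity pattern below (ISO-style two-letter country codes) are made up for the sketch; in practice the format standard would come from the use case:

```python
import re

# Hypothetical sketch of three of the six DAMA dimensions for a single column.
# The values and the validity pattern are illustrative.

values = ["GB", "US", "US", "", "Z9"]
populated = [v for v in values if v]

# Completeness: proportion of values that are populated.
complete = len(populated) / len(values)

# Uniqueness: proportion of distinct values among the populated ones.
unique = len(set(populated)) / len(populated)

# Validity: proportion of populated values matching the agreed format.
valid = sum(1 for v in populated if re.fullmatch(r"[A-Z]{2}", v)) / len(populated)
```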

Data problems can destroy business value. Recent research from Gartner shows that organisations estimate the average cost of poor data quality at over $10 million per year, a figure which will likely increase as the modern business environment becomes increasingly digitalized and unstructured data becomes harder to decipher. It’s also estimated that in 2021, eight out of ten companies admitted to struggling with data quality issues.

Senior leadership often neglect data quality as an organizational priority unless they are provided with an immediate reason to address the issue. Only when bad data quality is demonstrably proven to have a negative impact on the business will action be taken – bad data that causes complications for the initiatives and processes that senior management care about and that are essential to the business.

Data quality lies at the foundation of all data management, data governance and data lineage processes; it is therefore imperative to get it right and ensure your business embraces a culture that cares about data quality.

In order to get your organization talking about the importance of good data quality, here are some recommendations:

First step: directly expose the pain caused to business operations by bad data quality;

  • This can be communicated to senior leaders and stakeholders through presenting a problem statement or a business case which they can own and align with the current strategic objectives of the firm. Connecting the business case with the organizational trajectory will help senior leaders understand the operational and financial benefits of addressing data quality issues. Also, this can help you answer the inevitable “why do we need to do this?” question.
  • Business leaders can also engage with end users to discuss their experiences, share anecdotes and help senior management emotionally connect with those at the firm who are regularly impacted by the consequences of poor quality data.
  • It may also be of merit to present historical events where firms have suffered from bad data quality, resulting in organizational disasters or heavy fines from financial regulators.

Completing these actions will help focus your business case on improving the health and quality of the data that matters to the stakeholders who care about the problem and are prepared to help instigate an institutional change in organizational attitudes to data quality.

Second step: Shift the company culture to one that cares about data quality management. Present key metrics which illustrate the tangible impacts of poor data quality to the organization and outline the resources that are required to make a change;

  • The key processes and process owners (needed to deliver on outcomes) must be identified and will likely span across multiple business areas. This will alleviate any concerns on siloed thinking between functional areas and elevate the role that data quality plays across the firm.
  • Work with key process owners to determine the key indicators which will be most critical to the newly identified business processes. This will help you define your critical data elements and the data quality associated with them e.g. quality of Customer Contact master data.
  • Data profiling and analysis of critical data elements can help to demonstrate their impact on business performance. This can be carried out via in-house or external data management tools, or using scripting languages such as Python or SQL. Results must be communicated and explained to business stakeholders to show how data quality limitations on critical data elements can hinder organizational performance and how improvements can contribute to superior results.
  • Identify other areas of the business where good data quality is imperative as the business scales, e.g. data science, advanced analytics and machine learning. Good data quality is fundamental to generating business value from these areas and therefore must be addressed.
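As a small illustration of the profiling step above (the customer records and field names are hypothetical), a few lines of plain Python are enough to quantify how populated and how distinct a critical data element is:

```python
# Hypothetical sketch: profile a critical data element with plain Python.
# The customer records and field names are invented for illustration.

customers = [
    {"name": "Acme Ltd", "phone": "+44 28 9023 0000"},
    {"name": "Acme Ltd", "phone": ""},
    {"name": "Beta Inc", "phone": "+1 212 555 0100"},
]

def profile(rows, field):
    """Count rows, populated values, and distinct populated values for one field."""
    vals = [r[field] for r in rows]
    populated = [v for v in vals if v]
    return {
        "rows": len(vals),
        "populated": len(populated),
        "distinct": len(set(populated)),
    }

phone_profile = profile(customers, "phone")
```

Numbers like these give stakeholders a concrete, communicable picture of a data element's health.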

By completing these two steps, your key business stakeholders will have developed an empathetic and rational understanding of the day-to-day operational benefits of data quality improvement. By creating an organizational culture that cares about data quality, business users will hopefully see data quality issues propelled to the forefront of the firm's IT and data management strategy. Ideally, this may lead to additional funding and available resources to tackle data quality issues.

To have further conversations about the drivers and benefits of a Self-Service Data Quality platform, reach out to Brendan McCarthy.

And for more from Datactics, find us on LinkedIn, Twitter, or Facebook.

New York With Invest NI’s Graduate To Export https://www.datactics.com/blog/new-york-with-invest-ni-graduate-to-export/ Mon, 11 Apr 2022

Brendan McCarthy is one of our Business Development Executives who joined Datactics in March 2021 as part of Invest Northern Ireland’s ‘Graduate to Export’ program. Partly funded by Invest NI, the scheme helps companies recruit a graduate into their business for 18 months, during which time the candidate spends several months in a foreign market to help the business compete internationally. The Graduate to Export programme offers candidates the opportunity to develop their careers whilst studying for a postgraduate diploma and working in a local Northern Irish business.

Brendan’s main focus has been on identifying growth opportunities for Datactics in the United States and Canadian markets, primarily through sales activities, developing in-market relations and partnerships and attending trade shows. Just recently, he enjoyed his first trip across the pond.

Around the time of his one-year anniversary with Datactics, Brendan travelled to New York and Washington, marking his first stint in the United States with Grad to Export. We caught up with him to understand how his journey has been, from onboarding to landing in ‘The Big Apple’.


“When I arrived at Datactics in March 2021, I only had a limited knowledge of data management practices and how to apply them in real-life situations. My first few months at the company offered a soft landing into a career in tech and data; through training I was able to gain a thorough understanding of the Datactics platform and an appreciation of why data quality is such an integral piece of the data management puzzle.

These actions have been supplemented by the work we are doing with Ulster University and the Institute of Export, which has helped us to develop a robust market research and entry strategy to gain a better understanding of the North American market, prior to travel.

In September 2021, I attended my first trade show with Datactics and the Graduate to Export scheme, as we made the short trip to London for ‘Big Data LDN’. This was an excellent opportunity to further expand my understanding of how ‘Big Data’ is being used to develop insights and influence decision-making, and further appreciate how the Datactics solution can offer a remedy to poor quality data.


Attending trade shows, speaking to industry leaders and working alongside my exceptionally talented colleagues has helped me to develop the skills and knowledge needed to tackle the North American market; something that seemed a daunting prospect on day one of my new job in the tech industry.

With the help of Invest NI, we have found success in generating new business opportunities virtually from the comfort of our Belfast headquarters; however once travel restrictions were lifted, live events resumed and businesses returned to offices, we were very excited to make the trip across the Atlantic and be present in-market.

Naturally, the week of St. Patrick’s Day offered an excellent opportunity to do just that. Therefore, last week I made my first trip to the U.S. with Datactics as I travelled to New York to meet partners and attend a number of events which were happening in the city.

Bank of Ireland kindly invited me to visit their brand new ‘NYC Hub’. Consisting of an innovative co-working space, the Hub will support Irish start-ups and scale-ups as they seek to build their business and enter the U.S. market. This is a space that Datactics will look to utilise as we continue to grow the business in New York.

I then attended the bank’s St. Patrick’s Day celebrations in the exquisite Gotham Hall in Manhattan; an evening of networking with Bank of Ireland staff and clients, the Irish-American diaspora and other FinTechs in New York – special mention to the folks from DailyPay!


It was also great to spend some time in Washington, as I travelled to D.C. to attend the RegTech22 Data Summit with the Data Coalition. We heard from industry leaders such as Tammy Roust and Kirsten Dalboe as they spoke about how government and federal bodies in the U.S. can use RegTech solutions to improve customer experience with financial regulators, ESG reporting and transparency in rulemaking. This was another great opportunity to network and learn from experts in the world of data innovation in the public sector.

The trip concluded with a visit to the world-famous Dead Rabbit NYC in the Financial District in Manhattan. This was a particularly pertinent experience for me, as someone with a long history in the hospitality sector and a passion for Irish innovation; this establishment is a perfect example of Irish talent exceeding expectations and winning accolades on a global stage, something we aspire to do too at Datactics.

Over the next six months, I will look to travel as often as possible to the U.S. and Canada, as Datactics attempts to establish a full-time presence in the region. Graduate to Export has been a wonderful initiative so far, and with the additional help and support from Invest Northern Ireland, we are confident that we can achieve this in the short-term.”

And for more from Datactics, find us on LinkedIn, Twitter, or Facebook.

Gartner Blog 3 – Key Insights and Takeaways https://www.datactics.com/blog/marketing-insights/gartner-blog-3-key-insights-and-takeaways/ Tue, 25 Jan 2022


The previous two editions of this blog series provided an overview of the Gartner Magic Quadrant from the perspective of someone relatively new to the world of data management, defining what it really means for a scaling business like Datactics to be recognised by Gartner, as well as drawing attention to the core strengths of the Datactics product, as highlighted by Gartner analysts. With this knowledge, this blog will focus on the key insights we derived from this research, also highlighting market trends from the buyer and seller side. 

The data quality solutions market continues to mature at a rapid pace, with vendors from across the space innovating their offerings by making more impactful use of metadata and AI to solve customer problems and cater for increasingly complex use cases. 

Data quality had traditionally been mandated for adherence to regulatory compliance and governance requirements and for reducing operational risks and costs. However, as referenced in Gartner’s research, senior executives in businesses across the globe are recognising the necessity of data quality when amplifying analytics for more accurate insights and data-driven decision-making.  

As per Gartner surveys, by the year 2023 it is estimated that over 60% of organisations will leverage machine-learning enabled data quality technology to increase automation of tasks and provide accurate recommendations, significantly reducing bottlenecks and manual effort often associated with tasks for Data Quality improvement. Additionally, by 2024, over 50% of businesses will implement modern Data Quality solutions to better support enterprise-wide digital transformation and business initiatives. Firms are engaging with vendors from the Magic Quadrant to gain competitive advantage and ultimately achieve their goals. 

One of the key trends taken from this year’s Magic Quadrant is the shift towards Self-Service Data Quality. The era of requiring heavy programming and IT resources to perform Data Quality tasks is changing, and no-code platforms will continue to rise due to their accessibility and usability. Datactics were the only vendor in this year’s Magic Quadrant accredited with this feature as a key strength, as they continue to champion the movement of no-code functionality in Data Quality. Additionally, firms are seeking to centralise Data Quality controls and improve interoperability by simplifying the ease of integration with adjacent software tools such as MDM, metadata and data governance.  

This research indicates that to fulfil the market demand for simplified data quality management, despite the increasingly complicated data landscape, vendors must offer a product that goes beyond simply fixing data errors to help clients actively manage their data right across the enterprise.  

If you would like to open a conversation about any of the topics discussed in the previous three blog articles, feel free to reach out to me on LinkedIn or send me an email at brendan.mccarthy@datactics.com. 

How did Datactics rank in the Gartner Magic Quadrant? https://www.datactics.com/blog/marketing-insights/gartner-how-did-datactics-rank-in-the-magic-quadrant/ Mon, 20 Dec 2021


In the previous edition of this series, we explored the Gartner Magic Quadrant from the perspective of a newcomer to the data management industry. We defined what the Magic Quadrant really is, what it means for an aspiring scale-up like Datactics to be recognised by Gartner, as well as highlighting the critical capabilities and criteria that analysts use to qualify and position vendors in the quadrant. This blog will examine the specific feedback that Gartner analysts provided for Datactics, highlighting the core strengths of the platform and the reasons why we were recognised as a Niche Player in the Data Quality Solutions market. 

Gartner asserts that augmented data quality, powered by metadata and AI, is a key dynamic driving the data quality solutions market. This was recognised as one of the key strengths of the Datactics platform; our AI Server contains a diverse range of AI functionalities, including prebuilt models for entity resolution, ML-based matching and outlier detection. These innovations significantly reduce manual effort and increase the accuracy of results, whilst maintaining transparency and some degree of human intervention. 

One of the compelling differences between Datactics and other vendors in the quadrant, as detailed in the Gartner Peer Insights forum, is our platform’s ease of use and implementation. Reviewers praised Datactics for their sophisticated and user-friendly workflow which requires minimal consulting services to implement and configure. It is a no-code platform that requires no programming experience to become proficient in rule development and management, reducing reliance on IT resources. Analysts also noted the ability of the Datactics platform to seamlessly integrate with third-party services for quick and easy connection. 

In conjunction with this, Gartner analysts noted Datactics’ Self-Service approach to data quality. Our platform addresses the increasing industry demand for business-led data management initiatives, accommodating non-technical users to become skilled and accomplished with the tool. Datactics was the only vendor in the quadrant recognised for this attribute, exemplifying why we are displacing industry giants in some environments.  

Additionally, Gartner also commented on Datactics’ current rate of market visibility, growth and vertical focus. Datactics is scaling rapidly, having reached 60 staff in 2021 and with a long-term goal to be the data quality tool of choice across a range of industries and geographical locations. In 2022, we expect to have feet on the ground in the APAC and North American markets and continue to have conversations with sectors outside of financial services and government. 

The final instalment of this series will focus on some of the key findings and insights that we took from the Magic Quadrant, as well as highlighting key trends that Gartner have predicted for the future of the data quality solutions market. 

Follow up with Brendan on LinkedIn here.

Gartner Magic Quadrant Blog 1 – What is the Gartner Magic Quadrant? https://www.datactics.com/blog/marketing-insights/gartner-magic-quadrant-blog-1-what-is-the-gartner-magic-quadrant/ Fri, 10 Dec 2021


Last month, we at Datactics were delighted to announce that we had been recognised in this year’s edition of the Gartner Magic Quadrant for ‘Data Quality Solutions’.

This is a remarkable achievement for everyone associated with the company, as we make our first appearance on the quadrant, gaining industry recognition for our sophisticated technology and acknowledgement for the people who have helped drive Datactics to the fore of the market. Datactics continues to punch above its weight and thrive in the role of the underdog, challenging industry giants and displacing them in certain environments.

But what really is ‘The Magic Quadrant’?

To answer this question, we turned to Brendan McCarthy to give the view of someone relatively new to the data management ecosystem and to the part that technology analysts such as Gartner play in influencing buying decisions and technology development.

For those who are unfamiliar, the Gartner Magic Quadrant provides a graphical depiction of different types of technology providers and their position in fast-growing markets. Gartner research identifies and analyses the most relevant providers based on specific criteria that analysts believe to be crucial for inclusion.

For ‘Data Quality Solutions’, Gartner has highlighted the importance of features such as: the delivery of core data quality functions for profiling, cleansing and matching data; support for multiple data domains and use cases; and a geographically diverse customer base.

Between ten and fifteen of the most prominent vendors in each given market are then selected and judged on product capabilities, diversification of services and market presence. Based on these criteria, selected firms are allocated into four distinct categories: Leaders, Visionaries, Niche Players and Challengers.

Datactics represented one of the ‘Niche Players’ in this edition of the quadrant, due to our specialism in pure-play data quality and matching, and the value we add to clients in specific industries and verticals. As we continue to innovate our product and scale internationally, we expect Datactics to appear in the ‘Challenger’ and ‘Leader’ segments of the quadrant over the coming years.

The next installment in this series will focus on specific feedback that we received from Gartner, showcasing the core competencies of the Datactics platform and the key strengths that our users have highlighted.

Follow up with Brendan on LinkedIn here.
