William McKnight
President at McKnight Consulting Group
Plano, United States
William McKnight is an internationally recognized authority in information management. His consulting work has included many of the Global 2000 and numerous midmarket companies. His clients have reaped tremendous ROI and turned data into a real corporate asset. Many have gone public with their success stories.
William is the #1 global influencer in master data management, #1 in data warehousing, #3 in data management, #7 in information management and #14 in information architecture.
He is president of McKnight Consulting Group, which provides clients with action plans, architectures, strategies, complete programs and vendor-neutral tool selection to manage information. MCG is #1001 on the 2018 Inc. 5000 list of the fastest-growing companies in the US and #743 on the 2017 list.
He is the author of the books "Integrating Hadoop", "Information Management: Strategies for Gaining a Competitive Advantage with Data" and "90 Days to Success in Consulting".
William teaches Data Platforms, Data Maturity, NoSQL, Graph Databases, Business Intelligence, Data Quality, Project Management, Data Governance, Data Architecture, Data Modeling, Data Integration, Data Return on Investment, Agile Methodology, Big Data, Organizational Change Management and Master Data Management for the Data Warehousing Institute (since 1998) and other events globally. He is a frequent international keynote speaker.
William has hundreds of articles and 50+ white papers in publication and is a prolific sponsored blogger.
An Ernst & Young Entrepreneur of the Year Finalist and frequent best-practices judge, William is a former Fortune 50 technology executive and database engineer. William has taught at Santa Clara University, UC Berkeley and UC Santa Cruz.
He has consulted in 14 countries.
Current Services Offered, Presentation Calendar, White Papers, Articles and Press Quotes can be found at mcknightcg.com.
Cloud Data Warehouse vs. Cloud Data Lakehouse: A Snowflake vs. Starburst TCO and Performance Comparison
GigaOm
September 03, 2023
Recently, several architectural patterns have emerged that decentralize most components of the enterprise analytics architecture. Data lakes are a large part of that advancement.
A field test was devised to determine the differences between two popular enterprise data architectural patterns: a modern cloud data warehouse based on a Snowflake architecture and a modern data lakehouse with a Starburst-based architecture. The test was created with multiple components to determine the differences in performance and capability, as well as the amount of time and effort required to migrate to these systems from a legacy environment.
Edge Bare Metal Benchmark: Lumen vs AWS: An Exploration of Edge Cloud Services: From Bare Metal to Private Cloud
GigaOm
August 08, 2023
This report focuses on the latency of two top bare-metal server providers: Lumen Edge Cloud Services (Lumen) and Amazon Web Services (AWS). For our testing, we defined a high-performance model, mixing applications that can process 1,000 transactions per second and demand a maximum latency of 5ms or less, while the NGINX reverse proxy handles HTTP requests.
Actian Beats Snowflake and BigQuery in GigaOm TPC-H Benchmark Test
Actian
July 07, 2023
In the benchmark, the Actian Data Platform outperformed both Snowflake and BigQuery in 20 of the 22 queries, clearly illustrating Actian’s powerful decision support capabilities. Leveraging decades of data management experience, the Actian platform provides data warehouse technology that uses in-memory computing along with optimized data storage, vector processing, and query execution that exploits powerful CPU features. These capabilities significantly improve the speed and efficiency of real-time analytics.
Transaction Processing & Price-Performance Testing: Distributed SQL Databases Using Cloud Managed Services Evaluation: Azure Cosmos DB for PostgreSQL, CockroachDB Dedicated & YugabyteDB Managed
GigaOm
July 07, 2023
To evaluate distributed databases, we conducted a Transactional Field Test derived from the industry-standard TPC Benchmark C (TPC-C). We compared fully-managed as-a-service offerings of cloud-distributed databases:
Azure Cosmos DB for PostgreSQL
CockroachDB Dedicated
YugabyteDB Managed
Cloud Analytics Top Database Performance Testing report: Exasol outstrips Snowflake and another major cloud data warehouse for performance and price-performance
Exasol
July 07, 2023
You should never have to sacrifice budget control to improve the performance of your analytics database.
Exasol’s no-compromise analytics database takes care of this for you, delivering significant productivity gains, cost savings and flexibility, without any trade-offs. Don’t just take our word for it: read the latest report on cloud analytics database performance testing, conducted by McKnight Consulting Group against Exasol, Snowflake, and another major cloud data warehouse (April 2023).
The report proves that Exasol wins on:
Query response times:
3.6x faster than Snowflake
20x faster than the unnamed data warehouse
Price-performance:
Snowflake is nearly 17x more expensive than Exasol
Unnamed data warehouse is over 20x more expensive
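In benchmark reports like this, price-performance is typically measured as the cost to execute the workload: the hourly price of the tested configuration multiplied by the workload's runtime, which is why a platform that is cheaper per hour can still be far more expensive overall. A minimal sketch of that arithmetic, using illustrative placeholder numbers rather than figures from the report:

```python
# Price-performance = cost to execute the benchmark workload once:
# (hourly price of the configuration) * (total workload runtime in hours).
# All figures below are illustrative placeholders, not numbers from the report.

def price_performance(hourly_cost_usd: float, runtime_hours: float) -> float:
    """Cost in USD to run the full query workload once."""
    return hourly_cost_usd * runtime_hours

# Hypothetical configurations: platform -> (hourly cost in USD, runtime in hours)
platforms = {
    "A": (16.0, 0.50),  # faster engine, higher hourly rate
    "B": (8.0, 1.80),   # cheaper per hour, but slower overall
}

costs = {name: price_performance(rate, hours)
         for name, (rate, hours) in platforms.items()}

# The slower platform costs more to run the workload despite its lower rate:
# B = 8.0 * 1.80 = 14.4 USD vs. A = 16.0 * 0.50 = 8.0 USD, a 1.8x ratio.
ratio = costs["B"] / costs["A"]
print(costs, round(ratio, 2))
```

This is the mechanism behind claims like "17x more expensive": the multiplier compounds both the price gap and the runtime gap.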
Digital Analytics and Measurement Tools Evaluation: Google Analytics 4 and Snowplow
Snowplow
July 02, 2023
We performed a field test to assess how Snowplow compares to GA4 for a retail organization that recognizes the mandate for digital analytics. To make the test as fair as possible, we used both Google Analytics and Snowplow in vanilla e-commerce implementations. This means that for both solutions, we used the out-of-the-box e-commerce events.
Total cost of ownership: NATS vs Kafka
Synadia
May 05, 2023
A total cost of ownership (TCO) comparison of NATS and Kafka for streaming across several real-world, comparable workloads. Users want to know whether NATS’s “simplicity” involves a trade-off in functionality or performance.
Procurement Efficiency with the Microsoft Commercial Marketplace
GigaOm
March 27, 2023
The rapidly growing adoption of public clouds by IT organizations is frequently motivated by a desire to be more adaptable, agile, and…
New Microsoft Teams Performance Benchmark
GigaOm
March 27, 2023
This GigaOm Benchmark report was commissioned by Microsoft. Microsoft Teams (Teams) is a collaboration platform that combines workplace chat, video meetings, file…
SQL Transaction Processing and Analytic Performance Price-Performance Testing: Microsoft SQL Server Evaluation: Azure vs. Amazon Web Services
GigaOm
March 03, 2023
This report outlines the results from two Field Tests (one transactional and the other analytic) derived from the industry-standard TPC Benchmark E (TPC-E) and TPC Benchmark H (TPC-H). The tests compare two IaaS cloud database offerings running Red Hat Enterprise Linux (RHEL), configured as follows:
RHEL 8.6 with Microsoft SQL Server 2022 Enterprise on r6idn.8xlarge Amazon Web Services (AWS) Elastic Cloud Compute (EC2) instances with gp3 volumes.
RHEL 8.6 with Microsoft SQL Server 2022 Enterprise on a E32bdsv5 Azure Virtual Machine (VM) with Premium SSD v2 disks.
Security Information and Event Management: A MITRE ATT&CK Framework Competitive Evaluation
GigaOm
January 25, 2023
Security information and event management (SIEM) technology supports threat detection, compliance, and security incident management through the collection and analysis (near real-time…
SQL Transaction Processing and Analytic Performance Price-Performance Testing
GigaOm
January 25, 2023
The fundamental underpinning of any organization is its transactions. They must be done well, with integrity and performance. Not only has transaction…
ABAC vs RBAC: The Advantage of Attribute-Based Access Control over Role-Based Access Control
GigaOm
January 01, 2023
Data security has become an undeniable part of the technology stack for modern applications. No longer an afterthought, protecting application assets—namely data—against cybercriminal activities, insider threats, and basic human negligence needs to happen early and often during the application development cycle and beyond. This benchmark report captures the number of policy changes required to manage ever-evolving data security policies seen in a modern data-driven enterprise. The more policy changes required, the more likely a required change will not take place or an error is made when implementing the change. With this study, we show the impacts of data security governance policy management.
CrowdStrike Falcon LogScale Benchmark Report: Log Management and Analytics Platform
GigaOm
January 01, 2023
Real-time observability and enterprise systems monitoring have become critical functions in information technology organizations globally. As organizations continue to digitize and automate key functions, they are introducing more complex systems, hypervisors, virtual machines, Kubernetes, devices, and applications—all of which are generating more log and event data. While the amount of usable log data is growing, there is not an attendant growth in the tools, skilled professionals, and other resources to capture, manage, and analyze this complexity.
Confluent Cloud: Fully Managed Kafka Streaming, An Ease-of-Use Comparison
GigaOm
December 13, 2022
This report focuses on real-time data and how autonomous systems can be fed at scale reliably. To shed light on this challenge, we assess the ease of use of a fully managed Kafka platform—Confluent Cloud—and a self-managed open-source Apache Kafka solution.
Cloud Parallel File Systems
GigaOm
December 13, 2022
We benchmarked the usability, effort, and performance of the WEKA Data Platform against Amazon FSx for Lustre on AWS. In this hands-on benchmark, we found that WEKA provided comparable or superior usability and outperformed FSx for Lustre at similar capacities, in some tests by 300% or more. On some of our tests, WekaFS I/O latency was less than 30% that of FSx for Lustre. Our usability tests also found WEKA to be a mature solution that is easily deployed and operated in AWS specifically.
Dealing with Data System Complexity in Your Applications
GigaOm
December 13, 2022
In conventional information architectures, enterprise needs require two different database technologies: online transactional processing (OLTP) database management systems (DBMS) to handle transactional workloads and online analytical processing (OLAP) DBMS to perform analytics and reporting. Data types also drive multiple technologies, since many databases specialize in types like time series, geospatial, graph, and JSON. If a single database can be used to avoid that overhead, it is worthwhile to evaluate it for complete application management.
Advantages of DataStax Astra Streaming for JMS Applications
GigaOm
December 13, 2022
Competitive markets demand rapid, well-informed decision-making to succeed. In response, enterprises are building fast and scalable data infrastructures to fuel time-sensitive decisions, provide rich customer experiences, enable better business efficiencies, and gain a competitive edge. In our comparative study, we used the Starlight for JMS feature included in DataStax Astra Streaming along with self-managed open-source Apache ActiveMQ Artemis JMS instances. We found several notable differences and benefits for modernizing a JMS-based data streaming stack.
API and Microservices Management Benchmark: Product Evaluation: Kong Enterprise, Apigee X and MuleSoft Anypoint
GigaOm
December 13, 2022
Application programming interfaces, or APIs, are a ubiquitous method and de facto standard of communication among modern information technologies. The information ecosystems within large companies and complex organizations encompass a vast array of applications and systems, many of which have turned to APIs for exchanging data as the glue that holds these heterogeneous artifacts together. APIs have begun to replace older, more cumbersome methods of information sharing with lightweight, loosely-coupled microservices. This change allows organizations to knit together disparate systems and applications without creating technical debt from tight coupling with custom code or proprietary, unwieldy vendor tools.
This report reveals the results of performance testing we completed on these API and microservices management platforms: Kong Enterprise, Google Cloud Apigee X, and MuleSoft Anypoint Flex Gateway.
Transactional and Analytical Workloads: How Transactional and Analytical Performance Impacts the TCO of Cloud Databases
GigaOm
December 13, 2022
Competitive data-driven organizations rely on data-intensive applications to win in the digital service economy. These applications require a robust data tier that can handle the diverse workload demands of both transactional and analytical processing while serving an interactive, immersive customer experience. The resulting database workloads demand low-latency responses, fast streaming data ingestion, complex analytic queries, high concurrency, and large data volumes.
This report outlines the results from a Field Test derived from three industry standard benchmarks—TPC Benchmark H (TPC-H), TPC Benchmark DS (TPC-DS), and TPC Benchmark C (TPC-C)—to compare SingleStoreDB, Amazon Redshift, and Snowflake.
Cloud Analytics Platform Total Cost of Ownership v2.0
GigaOm
December 02, 2022
Organizations today need a broad set of enterprise data cloud services with key data functionality to modernize applications and utilize machine learning. We analyzed four leading machine learning platforms and learned that the cloud analytic framework selected for an enterprise, and for each enterprise project, matters in terms of cost.
Log and Telemetry Analytics Performance Benchmark
GigaOm
December 01, 2022
This report focuses on the performance of the cloud-enabled, enterprise-ready, popular log analytics platforms Microsoft Azure Data Explorer (part of Azure Synapse Analytics), Google BigQuery, and Snowflake. Due to cost limitations, we could not run our tests on Elasticsearch or AWS OpenSearch. Microsoft invited GigaOm to measure the performance of the Azure Data Explorer engine and compare it with its leading competitors in the log analytics space. The tests we designed intend to simulate a set of basic scenarios to answer fundamental business questions that an organization from nearly any industry might encounter in its log analytics.
In this report, we tested complex workloads with a volume of 100TB of data and concurrency of 1 and 50 concurrent users. The testing was conducted using comparable hardware configurations on Microsoft Azure and Google Cloud.
What to Know About the Differences in Analytic Architecture Patterns
Information Week
June 05, 2023
Enterprise data architects should take into consideration the data lakehouse, data mesh, and data fabric when constructing the analytic environment today. Here are the main ideas these patterns offer.
High Performance Application Security Testing
Gigaom
November 01, 2021
Data, web, and application security have evolved dramatically over the past few years. Just as new threats abound, the architecture of applications—how we build and deploy them—has changed. We’ve traded monolithic applications for microservices running in containers and communicating via application programming interfaces (APIs)—and all of it deployed through automated continuous integration/continuous deployment (CI/CD) pipelines. The frameworks we have established to build and deploy applications are optimized for time to market—yet security remains of utmost importance.
Enterprise Analytic Solutions 2021 v4.0
Gigaom
September 03, 2021
Enterprises from every industry and at every scale are working to leverage data to achieve their strategic objectives—whether those are to become more profitable, effective, risk tolerant, prepared, sustainable, and/or adaptable in an ever-changing world. An enterprise’s data maturity must grow at pace with the business and its needs to achieve agility and resilience—otherwise it will be hamstrung or tripped up by limited data capabilities. A mature analytic data management strategy includes the ability to adapt.
NoSQL databases have seen a lot of growth in the past several years. The realities of the pandemic have exposed how important it is to have a reliable, agile technology infrastructure in place with proven software like the open source Apache Cassandra — which is crucial for business continuity, low latency, and being able to effectively support increased data traffic. Demand for data at all companies will only increase in the coming years, and the reality is, much of this is unexposed cost that may create an unsustainable burden for many organizations.
Integrating Hadoop leverages the discipline of data integration and applies it to the Hadoop open-source software framework for storing data on clusters of commodity hardware. It is packed with the need-to-know for managers, architects, designers, and developers responsible for populating Hadoop in the enterprise, allowing you to harness big data and do it in such a way that the solution:
Complies with (and even extends) enterprise standards
Integrates seamlessly with the existing information infrastructure
Fills a critical role within enterprise architecture
Integrating Hadoop covers the gamut of the setup, architecture and possibilities for Hadoop in the organization, including:
Supporting an enterprise information strategy
Organizing for a successful Hadoop rollout
Loading and extracting of data in Hadoop
Managing Hadoop data once it's in the cluster
Utilizing Spark, streaming data, and master data in Hadoop processes, with examples provided to reinforce concepts
Information Management: Strategies for Gaining a Competitive Advantage with Data (The Savvy Manager's Guides)
Morgan Kaufmann
January 09, 2014
Information Management: Strategies for Gaining a Competitive Advantage with Data is about making smart decisions to make the most of company information. Expert author William McKnight develops the value proposition for information in the enterprise and succinctly outlines the numerous forms of data storage. Information Management will enlighten you, challenge your preconceived notions, and help activate information in the enterprise. Get the big picture on managing data so that your team can make smart decisions by understanding how everything from workload allocation to data stores fits together.
McKnight Associates, Inc.
Conversion Services International
May 09, 2015
William founded McKnight Associates, Inc. and grew it from 1998 to 2005 to placement in the Inc. 500, the Dallas 100 (twice) and the Collin County 60. He sold the company to a public firm in 2005.
Ernst & Young Southwest Entrepreneur of the Year Finalist
Ernst & Young
May 01, 2015
Ernst & Young Southwest Entrepreneur of the Year Finalist is an award that recognizes the accomplishments of entrepreneurs in the Southwest region. The award honors businesses that have achieved remarkable success by demonstrating strategic vision, financial performance, innovation, and community and industry impact.
The Evolution of the Data Platform and What It Means to Data Warehousing
Astera
May 25, 2021
Business landscapes are in hyperdrive. Data volumes are exploding, modern sources are rapidly taking over old legacy systems, and organizations are continually seeking advanced analytics solutions to deliver unparalleled customer experiences and tap into new revenue streams. Amidst all the data chaos, you need an agile, well-knitted, responsive data warehouse architecture that provides information clarity, tackles complexity, and delivers accurate, trusted insights at lightspeed for game-changing decision making.
Opening a new dimension of agility, automation, and simplicity, Astera Software is introducing a next-gen data warehouse automation solution that allows businesses to go from data ingestion to BI & analytics in a matter of hours, all through a single platform.
Whether you want to build a new data warehouse or modernize a legacy architecture, we are bringing a solution that holds the key to unlock the agility and efficiency needed to drive your initiative.
Information - The Next Natural Resource | William McKnight | TEDxUTD
YouTube
May 26, 2015
Big Data is ubiquitous and continuously growing. William reflects on the trends of Big Data and demonstrates that it's not a question of how much information we have, but how wisely we use it.
William functions as Strategist, Lead Enterprise Information Architect, and Program Manager for sites worldwide utilizing the disciplines of data warehousing, master data management, business intelligence and big data. Many of his clients have gone public with their success stories.
He is author of the book “Information Management: Strategies for Gaining a Competitive Advantage with Data.”
189: From Data Chaos to Data Maturity – McKnight Consulting Group
Spotify
April 04, 2023
William McKnight, President at McKnight Consulting Group discusses the importance of data maturity in achieving digital transformation and the crucial role of data architects in helping organizations become data-driven.
Gary Fowler and William McKnight: How To Combat Infobesity
YouTube
March 28, 2023
GSD Presents
How To Combat Infobesity with William McKnight
Guest:
William McKnight, Founder & President at McKnight Consulting Group
William McKnight has been recognized as the #1 global influencer in big data, cloud, and data center by Thinkers360 and the #1 global influencer in master data management by Onalytica in 2022.
He is the Founder and President of McKnight Consulting Group, which advises many of the Global 2000 companies, including Pfizer, Verizon, UnitedHealth Group, Dell, Oracle, and Scotiabank, on ways to grow their businesses faster with big data.
The firm has twice been recognized as one of the Inc. 5000 fastest-growing private companies in the US, and its clients in 14+ countries have reaped tremendous ROI and turned data into a real corporate asset.
My Career in Data Episode 22: William McKnight, President, McKnight Consulting Group
YouTube
March 01, 2023
Welcome back to a new episode of My Career in Data – a DATAVERSITY Talks podcast where we sit down with professionals to discuss how they have built their careers around data.
This week we're happy to be chatting with William McKnight, the President and Founder of McKnight Consulting Group, about the importance of commitment and how his athletic pursuits have informed his perspective.
CyberSide Chats An Interview With William McKnight
Cyberside Chats
June 14, 2023
While we are all appropriately focused on #AI, there are a number of issues that must be considered to maximize the value of AI. In this podcast, I spoke to William McKnight, of McKnight Consulting Group, about how companies should think about and organize #data to begin to maximize the value of AI.
Maximizing the Power of Enterprise Data
Terminal Value
February 08, 2023
Doug and William spoke about how maximizing the value of enterprise data is essential for large companies to make better decisions; it involves organizing data from different systems and extracting insights from it. This process can be costly, requiring specialized tools and skilled personnel, but it is necessary.
Business owners should pay attention to analytics to gain insights and inform strategic decisions, and having the right strategy and vision is important for knowing what to look for and what to ask.
Data Architecture for the CEO
Data Leadership for Everyone
June 17, 2021
William has advised many of the world’s best-known organizations. His strategies form the information management plan for leading companies in various industries. He is a prolific author and a popular keynote speaker and trainer. He has performed dozens of benchmarks on leading database, data lake, streaming and data integration products. William is the #1 global influencer in data warehousing and master data management and he leads McKnight Consulting Group, which has twice placed on the Inc. 5000 list.
President of the Data Warehousing Institute Dallas chapter
TDWI
January 01, 2016
Represented the organization in my local community and promoted its values and mission. As President, I worked with a team of volunteers to organize and lead local events such as seminars, meetings, and social networking opportunities. Additionally, I was responsible for promoting TDWI’s educational activities, such as certification programs, and strengthening the local chapter by networking and developing relationships with vendors and other organizations.
Architecture, Products, and Total Cost of Ownership of the Leading Machine Learning Stacks
Dataversity
March 09, 2023
Organizations today need a broad set of enterprise data cloud services with key data functionality to modernize applications and utilize machine learning. They need a comprehensive platform designed to address multi-faceted needs by offering multi-function data management and analytics to solve the enterprise’s most pressing data and analytic challenges in a streamlined fashion.
In this research-based session, I’ll discuss what the components are in multiple modern enterprise analytics stacks (i.e., dedicated compute, storage, data integration, streaming, etc.) and focus on total cost of ownership.
A complete machine learning infrastructure cost for the first modern use case at a midsize to large enterprise will be anywhere from $3 million to $22 million. Get this data point as you take the next steps on your journey into the highest spend and return item for most companies in the next several years.
BUSINESS IS BUSINESS! AND IT’S BETTER WITH ANALYTICS
DM Radio
February 16, 2023
This webinar will explore how analytics can help businesses make better decisions and improve their bottom line. We will discuss how analytics can be used to identify trends, uncover opportunities, and make more informed decisions. We will also discuss how analytics can be used to measure performance, identify areas of improvement, and create strategies for success. Finally, we will discuss how analytics can be used to create a competitive advantage and drive growth. Attendees will leave with a better understanding of how analytics can be used to improve their business.
Showing ROI for Your Analytic Project
Dataversity
February 09, 2023
Analytics play a critical role in supporting strategic business initiatives. Despite the obvious value to analytic professionals of providing the analytics for these initiatives, many executives question the economic return of analytics as well as data lakes, machine learning, master data management, and the like.
Technology professionals need to calculate and present business value in terms business executives can understand. Unfortunately, most IT professionals lack the knowledge required to develop comprehensive cost-benefit analyses and return on investment (ROI) measurements.
This session provides a framework to help technology professionals research, measure, and present the economic value of a proposed or existing analytics initiative, no matter what form the business benefit takes. The session will provide practical advice about how to calculate ROI, the formulas involved, and how to collect the necessary information.
23 things I’ve learned about data quality in 23 years of consulting by 2023
TechTarget
February 06, 2023
Data quality is a difficult concept to quantify while still being vital enough to sabotage any project, strategic initiative, or even an entire business. What's going on with the quality of your data? Is the data well-suited for all of its uses? Is your data everywhere – in operational databases, data warehouses, data lakes, master data management, etc.?
Information management expert William McKnight will go over 23 lessons he’s learned about data quality, frequently the hard way, in this presentation. Any problem with data quality, and quite possibly with data, that you may be experiencing is likely the result of someone failing to take one of these lessons to heart. Focusing on one or more of these could be the key to achieving understanding between parties and, ultimately, finding a solution to your data quality, and consequently data, problems.
In this presentation, you’ll discover 23 lessons about data quality, as well as fixes for the issues that the absence of data quality causes.
Analytics ROI Best Practices
Dataversity
January 18, 2023
Analytics plays a critical role in supporting strategic business initiatives. Despite the apparent value of providing the data infrastructure for these initiatives, many executives question the economic feasibility of business intelligence and analytics. This requires information professionals to calculate and present the business value in terms business executives can understand.
Unfortunately, most IT professionals lack the knowledge required to develop comprehensive cost-benefit analyses and return on investment (ROI) measurements.
This session provides a framework to help IT professionals research, measure, and present the economic value of a proposed or existing analytics initiative. The session will provide practical advice about how to calculate ROI, the formulas in use, and how to collect necessary information.
2023 Trends in Enterprise Analytics
Dataversity
January 12, 2023
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the enterprise mission will be executed and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and data architecture. William will kick off the fifth year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
Data Architecture Best Practices for Advanced Analytics
Dataversity
January 10, 2023
Many organizations are immature when it comes to data and analytics use. The answer to this immaturity lies in delivering a greater level of insight from data, straight to the point of need.
Many Data Architecture best practices have accumulated from years of practice. In this webinar, William will look at Data Architecture best practices that he believes have emerged in the past two years and are not yet worked into many enterprise data programs. These are keepers that organizations will need to adopt, by one means or another, so it’s best to work them into the environment mindfully.
MLOps – Applying DevOps to Competitive Advantage
Dataversity
December 08, 2022
MLOps is a practice for collaboration between data science and operations to manage the production machine learning (ML) lifecycle. As an amalgamation of “machine learning” and “operations,” MLOps applies DevOps principles to ML delivery, enabling the delivery of ML-based innovation at scale to result in:
Faster time to market of ML-based solutions
More rapid rate of experimentation, driving innovation
Assurance of quality, trustworthiness, and ethical AI
MLOps is essential for scaling ML. Without it, enterprises risk struggling with costly overhead and stalled progress. Several vendors have emerged with offerings to support MLOps: the major offerings are Microsoft Azure ML and Google Vertex AI. We looked at these offerings from the perspective of enterprise features and time-to-value.
This session will be informative and helpful in uncovering some of the challenges and nuances of MLOps program development and platform selection.
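One core MLOps idea, applying DevOps principles to model delivery, can be sketched as a CI/CD-style promotion gate (a hypothetical illustration, not material from the session; the thresholds and scores are invented):

```python
# A hypothetical sketch of a CI/CD-style promotion gate: a candidate model
# reaches production only if it clears automated quality checks, just as
# code must pass tests before deployment. Thresholds here are invented.

def promote_if_qualified(candidate_score: float,
                         production_score: float,
                         min_score: float = 0.80) -> bool:
    """Promote the candidate model only if it clears an absolute quality
    bar and beats the model currently in production."""
    return candidate_score >= min_score and candidate_score > production_score

print(promote_if_qualified(0.86, 0.82))  # True: promoted
print(promote_if_qualified(0.79, 0.75))  # False: below the quality bar
```

Real MLOps platforms wrap this pattern in model registries, versioning, and automated retraining, but the gating principle is the same.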
Setting the Budget for the ML Stack for Analytics
BrightTalk
December 07, 2022
Organizations today need a broad set of enterprise data cloud services with key data functionality to modernize applications and utilize machine learning. They also need a platform designed to address multi-faceted needs by offering multi-function data management and analytics to solve the enterprise’s most pressing data and analytic challenges in a streamlined fashion. Above all, they need a worry-less experience with the architecture and its components.
A complete machine learning infrastructure cost for the first modern use case at a midsize to large enterprise will be anywhere from $2M to $14M. Get this data point as you take the next steps on your journey.
Using Automation to Build a Sustainable Data Warehouse
Astera
November 29, 2022
Join us in this webinar and watch industry experts share insights on building a sustainable data warehouse architecture to meet the data requirements of modern enterprises.
Assessing New Database Capabilities – Multi-Model
Dataversity
November 15, 2022
Today’s enterprises have an unprecedented variety of data store choices to meet the needs of their varied workloads, because there is no one-size-fits-all data store. Putting in place data stores to support a modern enterprise that is now reliant on data can lead to confusion and chaos.
Enterprises have many needs for databases, including for cache, operational, data warehouse, master data, ERP, analytical, graph data, data lake, time series data, and numerous other specific needs.
While vendor offerings have exploded in recent years, in due time frameworks will integrate components into what amounts to, for practical purposes, a single offering for multiple workloads, perhaps even for the enterprise.
A multi-model database is a database that can store, manage, and query data in multiple models, such as relational, document-oriented, key-value, graph (triplestore), and column store.
An enterprise will find reduced overhead and other synergies from choosing a single vendor for these workloads.
This session will explore the multi-model option and some criteria that decision makers should evaluate when choosing a multi-model solution.
Graph Database Use Cases
Dataversity
October 20, 2022
Graph databases may be the unsung heroes of data platforms. They are poised to expand dramatically in the next few years as important analytics increasingly centers on understanding relationships. We live and work today in a highly connected world where individuals and their relationships shape brand perceptions, consumer behaviors, and many other business success factors. Where patterns are involved in relationships, it is imperative to understand them. Graph databases are the technology best suited to determining and understanding data relationships.
This code-lite session will help you determine why, how, and where to apply graphs in your enterprise.
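In the same code-lite spirit, a hypothetical illustration of why graphs fit relationship questions (the toy graph below is invented): a "who is within two hops" query is a simple traversal, where a relational database would need a chain of self-joins.

```python
# A hypothetical, code-lite illustration: finding everything within a few
# hops of a node is a natural graph traversal. The toy graph is invented.
from collections import deque

def within_hops(edges, start, max_hops):
    """Breadth-first traversal returning all nodes reachable from start
    within max_hops edges (excluding start itself)."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for neighbor in edges.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen - {start}

# Toy relationship graph: A knows B, B knows C, C knows D
edges = {"A": ["B"], "B": ["C"], "C": ["D"]}
print(sorted(within_hops(edges, "A", 2)))  # ['B', 'C']
```

A graph database executes this kind of neighborhood query natively over stored relationships rather than over an in-memory dictionary as sketched here.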
Measuring Data Quality Return on Investment
Enterprise Analytics Online
October 19, 2022
Data Quality is an elusive subject that can defy measurement and yet be critical enough to derail any project, strategic initiative, or even a company. The data layer of an organization is a critical component because it is so easy to ignore the quality of that data or to make overly optimistic assumptions about its efficacy. Having Data Quality as a focus is a business philosophy that aligns strategy, business culture, company information, and technology in order to manage data to the benefit of the enterprise. It is a competitive strategy. However, you can’t improve what you can’t measure. We need a means for measuring the quality of our data. Abstracting quality into a set of agreed-upon data rules and measuring the occurrences of quality violations provides that measurement.
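The rule-based measurement approach described here can be sketched in a few lines (a hypothetical illustration; the records, rules, and resulting score are all invented):

```python
# A hypothetical illustration of rule-based data quality measurement:
# abstract quality into agreed-upon rules, then measure the violation rate.
# The records and rules below are invented.

records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": 41},
    {"id": 3, "email": "c@example.com", "age": -5},
]

# Each rule is a named predicate a good record must satisfy.
rules = {
    "email_present": lambda r: bool(r["email"]),
    "age_plausible": lambda r: 0 <= r["age"] <= 120,
}

def quality_score(records, rules):
    """Fraction of record-rule checks that pass."""
    checks = [rule(r) for r in records for rule in rules.values()]
    return sum(checks) / len(checks)

print(f"Quality score: {quality_score(records, rules):.0%}")  # Quality score: 67%
```

Tracking this score over time turns data quality from an elusive notion into a measurable, improvable number.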
Assessing New Databases: Translytical Use Cases
Dataversity
October 13, 2022
Organizations run their day-in-and-day-out businesses with transactional applications and databases. On the other hand, organizations glean insights and make critical decisions using analytical databases and business intelligence tools.
The transactional workloads are relegated to database engines designed and tuned for high transactional throughput. Meanwhile, the big data generated by all the transactions requires analytics platforms to load, store, and analyze volumes of data at high speed, providing timely insights to businesses.
Thus, in conventional information architectures, this requires two different database architectures and platforms: online transactional processing (OLTP) platforms to handle transactional workloads and online analytical processing (OLAP) engines to perform analytics and reporting.
Today, a particular focus and interest of operational analytics includes streaming data ingest and analysis in real time. Some refer to operational analytics as hybrid transaction/analytical processing (HTAP), translytical, or hybrid operational analytic processing (HOAP). We’ll address if this model is a way to create efficiencies in our environments.
Is Our Information Management Mature?
Dataversity
June 09, 2022
Maturity frameworks have varying levels of information management maturity. Each level corresponds to not only increased data maturity, but also increased organizational maturity and bottom-line ROI. There are recommended targets to achieve an effective information management program. The speaker’s maturity framework sequences the information management activities for your consideration. It is based on real client roadmaps. This webinar promises to offer a wealth of ideas for key quick wins to benefit the organization’s information management program.
Attendees can self-assess their current information management capabilities as we go through Data Strategy, organization, architecture, and technology, yielding an overall view of the current level of information management maturity.
This webinar provides a foundation for enhancing current data and analytic capabilities and updating the strategy and plans for achievement of improved information management maturity, aligned with major initiatives.
This is always a hot topic when the speaker gives it, so be sure to come plot your shop on the curve.
Methods of Organizational Change Management
Dataversity
September 23, 2021
The disparity between expecting change and managing it — the “change gap” — is growing at an unprecedented pace. This has put many information management shops into traction as they initiate large, complex projects needed to stay competitive.
Information management professionals and business leaders must concern themselves with the organization’s acceptance of these efforts. To be successful in achieving the larger enterprise goals, these initiatives must transform the organization. However, it takes more than wishful thinking to bridge the gap.
The complexities of engaging behavioral and enterprise transformation are too often underestimated at great peril because the “soft stuff” is truly hard.
What Is My Enterprise Data Maturity 2021
Dataversity
September 22, 2021
Maturity frameworks have varying levels of Data Management maturity. Each level corresponds to not only increased data maturity but also increased organizational maturity and bottom-line ROI. There are recommended targets to achieve an effective information management program. The speaker’s maturity framework sequences the information management activities for your consideration. It is based on real client roadmaps. This webinar promises to offer a wealth of ideas for key quick wins to benefit the organization’s information management program.
Attendees can self-assess their current information management capabilities as we go through Data Strategy, organization, architecture, and technology, yielding an overall view of the current level of information management maturity.
This webinar provides a foundation for enhancing current data and analytic capabilities and updating the strategy and plans for the achievement of improved information management maturity, aligned with major initiatives.
Using Data Platforms That Are Fit-For-Purpose
Dataversity
August 19, 2021
We must grow the data capabilities of our organization to fully deal with the many and varied forms of data. This cannot be accomplished without an intense focus on the many and growing technical bases that can be used to store, view, and manage data. More of these have merit in organizations today than ever before.
This session sorts out the valuable data stores, how they work, what workloads they are good for, and how to build the data foundation for a modern competitive enterprise.
2045: A World Shaped by Artificial Intelligence
Dataversity
August 09, 2021
How will technology and society change in the next 25 years? The year 2045 may seem far away, but we already have predictions about the technological innovations that will be prevalent by then. Hint: Artificial Intelligence will have a huge impact.
Platforming the Major Analytic Use Cases for Modern Engineering
Dataversity
May 13, 2021
We’ll describe a broad range of modern use cases that need a platform, along with the popular, proven technology stacks enterprises use to accomplish them: customer churn, predictive analytics, fraud detection, and supply chain management.
In many industries, to achieve top-line growth, it is imperative that companies get the most out of existing customer relationships. Customer churn use cases are about generating high levels of profitable customer satisfaction through the use of knowledge generated from corporate and external data to help drive a more positive customer experience (CX).
Many organizations are turning to predictive analytics to increase their bottom line and efficiency and, therefore, competitive advantage. It can make the difference between business success or failure.
Fraudulent activity detection is exponentially more effective when risk actions are taken immediately (i.e., stop the fraudulent transaction), instead of after the fact. Fast digestion of a wide network of risk exposures across the network is required in order to minimize adverse outcomes.
Supply chain leaders are under constant pressure to reduce overall supply chain management (SCM) costs while maintaining a flexible and diverse supplier ecosystem. They will leverage IoT, sensors, cameras, and blockchain. Major investments in advanced analytics, warehouse relocation, and automation, both in distribution centers and stores, will be essential for survival.
High Performance Cloud Data Warehouse Vendor Evaluation
GigaOm
April 09, 2021
This free one-hour webinar brings together GigaOm analyst William McKnight and special guest Bill Zhang, Cloudera’s Director of Data Warehouse Product Management, to discuss the intriguing results of an in-depth Analytic Field Test derived from the industry-standard TPC-DS benchmark, comparing leading cloud data warehouse offerings: Amazon Redshift, Azure Synapse, Snowflake, Google BigQuery, and Cloudera Data Warehouse.
Data-driven organizations rely on analytic databases to load, store, and analyze volumes of data at high speed to derive timely insights. Data volumes within modern organizations’ information ecosystems are rapidly expanding, placing significant demands on legacy architectures. Today, to fully harness their data for competitive advantage, businesses need modern, scalable architectures with high levels of performance and reliability to provide timely analytical insights. Companies are attracted to fully managed cloud services.
How To Reduce Your Total Cost of Ownership for Cassandra
DataStax
April 06, 2021
Everybody wants data without limits: infinite scale, zero downtime, and software velocity, all without going over budget. The weapon of choice is Apache Cassandra, but how do you control data costs and complexity without sacrificing performance? The answer is serverless Cassandra: no more nodes, no more servers, no more idle, wasted capacity.
Join us as GigaOm presents the results of their study that empirically validates how serverless Cassandra saved 76% on TCO compared with self-managed Cassandra. Learn:
How to model expected costs of Cassandra workloads
How serverless Cassandra reduces infrastructure and operational costs
The criteria and tradeoffs for migration decisions
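A back-of-envelope sketch of the kind of cost modeling involved (all rates and quantities below are invented placeholders, not figures from the GigaOm study):

```python
# A hypothetical sketch of Cassandra TCO modeling. All rates and
# quantities are invented placeholders, not figures from the GigaOm study.

def self_managed_monthly(nodes, node_cost, ops_hours, hourly_rate):
    """Self-managed cluster: infrastructure plus operational labor."""
    return nodes * node_cost + ops_hours * hourly_rate

def serverless_monthly(read_millions, write_millions,
                       read_rate, write_rate, storage_gb, gb_rate):
    """Serverless: pay per million operations plus storage, no idle nodes."""
    return (read_millions * read_rate + write_millions * write_rate
            + storage_gb * gb_rate)

sm = self_managed_monthly(nodes=9, node_cost=600, ops_hours=80, hourly_rate=75)
sl = serverless_monthly(read_millions=500, write_millions=100,
                        read_rate=0.30, write_rate=0.60,
                        storage_gb=800, gb_rate=0.25)
print(f"self-managed ${sm:,.0f}/mo vs serverless ${sl:,.0f}/mo")
```

The study's actual comparison used measured workloads and real pricing; the point of a model like this is to plug in your own operation counts, node costs, and labor hours.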
Modernize Data Warehousing – Beyond Performance
Vertica
April 06, 2021
Configuration, management, tuning and other tasks can take away from valuable time spent on business analytics. If a platform leads to coding workarounds, non-intuitive implementations and other problems, it can make a big impact on long-term resource usage and cost. A lot of enterprise analytics platform evaluations focus on query price-performance to the exclusion of other features that can have a huge impact on business value, and can cause major headaches if you don’t take them into consideration.
In this webinar, we’ll go beyond price-performance, and focus on everything else needed to modernize your data warehouse.
Location: Worldwide
Date Available: May 09th, 2019
Submission Date: May 09th, 2019
Fees: Depends on scope
Service Type: Service Offered
There is a demand for information to be an asset that fuels organizational growth. Greater volumes of data are generated, and service level expectations continue to rise. Tolerance for poor quality has lessened, and the underlying complexity and load on the systems are amplified. Discussions around data warehouses, data integration, big data, business intelligence, and data management occur more frequently. You’ve probably implemented one or more of these projects, but at the end of the day you need to take your information management capabilities to the next level.
MCG’s 2- to 8-week Information Management Action Plan (IMAP) is designed for organizations that want to advance their analytical capabilities across the spectrum of information management. Information management comprises the disciplines of data warehousing, big data, business intelligence, master data management and data governance. These disciplines are incredibly interrelated. Operating them in silos will not lead to success.
MCG will analyze your needs and provide expert advice across the process, people, and technology that drive your organization and are so critical to its success. MCG has built over 40 Action Plans for our clients and contributed to business successes worldwide.