All sessions are at the Massachusetts Institute of Technology, Tang Building (E51), MIT East Campus, 70 Memorial Drive, Cambridge, MA 02139, USA.

July 17, 2013
  • 7:30 am - 8:50 am: Continental Breakfast and Registration - Ting Foyer
  • 9:00 am - 10:30 am: Plenary Session - Wong Auditorium
    Opening Remarks
    General Chair: Peter Anlyan
    Co-Chairs: Yang Lee, Stu Madnick, Willa Pickering
    Vendor Chair: Robert Lutton

    Welcome
    Deborah Nightingale, Director, MIT Sociotechnical Systems Research Center; Professor of the Practice of Aeronautics and Astronautics and Engineering Systems; Co-Director, MIT Lean Advancement Initiative

    2013 Symposium Co-Chairs
    Business & Finance Sector: Joe Maguire, Steve Sarsfield
    Health Care Sector: Allen Juris, Yang Lee, Michael Nix
    Public Sector: Annette Pence, Douglas Whall

    Awards Ceremony

    Announcements: IQ Societies, ICIQ, etc.

    Keynote: Information Quality Management in a Big Data World: Common Sense Approach to Complex Problems
    Dat Tran, Deputy Assistant Secretary for Data Governance and Analysis, Department of Veterans Affairs
  • 10:30 am - 10:45 am: Coffee Break - Ting Foyer
  • 10:45 am - 12:00 pm: Plenary Session - Human Factors in Information Quality - Wong Auditorium
    Chair: Joe Maguire, Data Quality Strategies
    • Mark Temple-Raston, Senior Vice President, Data Management, Enterprise Architecture and IT Governance, Citigroup
    • Peter Kaomea, CIO, Sullivan & Cromwell
    • Maria Villar, Global Vice President, Data Management & Governance, Global Customer Operation, SAP
  • 12:00 pm - 12:30 pm: Bag Lunch - Ting Foyer
  • 12:30 pm - 1:30 pm: Keynote - Wong Auditorium
    The New Data Scientist
    Michael Rappa, Institute for Advanced Analytics, NC State University
  • 1:30 pm - 3:00 pm: Parallel Sessions

    Session 3A E51-149
    Annette Pence and Elie Hayeck
    • Role of Authoritative Data Source in Data Quality

    Session 3B E51-395
    Chair: Michael Nix
    3B-1 David Levine, MD, Vice President of Informatics/Medical Director of Comparative Data and Informatics, UHC
    Improving Risk Models for Academic Medical Centers Based on Predictive Analytics
    Abstract:
    For almost 20 years, UHC has been risk-adjusting inpatient mortality, length of stay, and cost for academic medical centers to help benchmark and drive performance improvement. UHC has made major modifications to the candidate variables. A major change in the mortality models involved using specific secondary diagnoses based on standardized ICD-9 disease codes and procedures. In addition, all candidate variables are now required to be present on admission, whereas previously they only had to be present at some point during the admission; this helps identify potentially preventable complications. Other recent changes include converting the cost models from total cost to direct cost and expanding the secondary cancer diagnosis variables from three to more than 20. With the availability of clinical data, the use of specific lab values to enhance and/or replace some of the administrative data is being explored. UHC continues to work with its membership to make its models meaningful and actionable to drive improved performance.

    3B-2 Bruce Davidson, PhD, MPH – Vice-president, Performance Improvement, Hoag Health System
    The Role of IQ Management in Achieving Organizational Performance Excellence
    Abstract:
    This case study analyzes the key role that information quality management plays in organizational success, using the Malcolm Baldrige Performance Excellence criteria. The Baldrige program has become an international standard for performance evaluation across industries, providing a comprehensive framework of factors that make an organization successful, competitive and socially responsible. Dr. Davidson brings a unique perspective to this analysis, having been among the first to receive the Information Quality Certified Professional (IQCP) certification and having been appointed to the 2012 Board of Examiners of the Baldrige Performance Excellence Program. This case study includes an overview of the Baldrige framework and a description of the importance of data and IQ in supporting "Management by Fact", a Baldrige core value. We'll also cover the data quality dimensions of the Baldrige program and considerations in approach, deployment, organizational learning, and integration. We address the question of whether an organization can achieve excellence without addressing IQ and offer several examples of how the Baldrige framework is applied in the healthcare industry.

    Session 3C E51-376
    Chair: Joe Maguire
    3C-1 Jim Newman, Founder, Linnean Solutions
    Energy Use Reporting: DQ Problems in the Physical World
    Abstract:
    More than 100,000 privately owned buildings in the U.S. are now required to report energy and water use to municipal authorities, but getting accurate and usable information has been a challenge. Definitions and reporting requirements are often poorly understood by participants, and building owners who consider the information a low priority don't invest much effort in quality control. There are problems on the regulatory side as well: municipal records primarily identify tax plots, which may not correspond to actual buildings, and many owners meter energy use for multiple buildings through a single meter. This session will discuss each of these issues and their importance to the overall goals of the regulatory efforts. The presenter will also review several alternative approaches being tested in Massachusetts.

    3C-2 Ethan Goldman, Energy Informatics Architect, Vermont Energy Investment Corporation
    Big Data Doesn’t Just Happen: DQ Problems at the Source
    Abstract:
    Smart Grid technologies are bringing big data to the energy efficiency industry, but how do we ensure that the data can be used to maximum benefit? While the current emphasis is on data standards and increased storage and analytic capabilities for relatively homogeneous utility meter data, much of the value will not be realized until we can fuse building-operation data from other diverse sources that can lead to insights about how energy is used. This talk will provide observations from early efforts at data-driven efficiency analysis and the role of data quality in this work.

    Session 3D E51-372
    CDOs in the C-Suite - Who Does the CDO Report To?
    Abstract: CDO is a relatively new title, and one that potentially causes cultural clashes not only with the corporate CIO but also with the Chief Compliance Officer, Chief Privacy Officer, Chief Information Security Officer and even the Chief Marketing Officer. Where should the CDO "sit" in the organization? Is the title even necessary, or sufficient? With perspectives from private industry (in both the healthcare and financial services domains), the standards world, the government world and an IT products & services company, this panel will address how the CDO's requirements best fit into the executive suite, who the CDO should report to, and how the CDO can best carry out the data governance and data exploitation roles in today's corporate environments.
    Chair: Richard Soley, Chairman & CEO Object Management Group
    Panelists
    • Eugene Kolker, Chief Data Officer, Seattle Children's
    • Jim Stikeleather, Chief Innovation Officer & Executive Strategist, Dell Services
    • Lt. Col. Michael Servaes, SO1 Plans/Change Management, British Army Recruiting and Training Division
  • 3:00 pm - 3:30 pm: Sponsor Exhibits and Coffee Break
  • 3:30 pm - 5:00 pm: Parallel Sessions

    Session 4A E51-149
    Chair: Annette Pence
    • Dave Becker, Principal Information Systems Engineer, MITRE Corporation
    Big Data, Big Data Problems

    Abstract: “Big data” refers to data sets of extreme size that are beyond the ability of manual techniques and commonly used software tools to capture, manage, and process within a tolerable timeframe. We assume big data will be prone to the same quality problems that plague traditionally sized data sets: accuracy, precision, completeness, consistency, timeliness, lineage, and relevance. However, these quality dimensions are not yet well understood for big data and may require completely different strategies and tools. This briefing provides an overview of initial findings from a MITRE Corporation survey of a diverse set of primarily government big data initiatives, conducted to better understand the specific data quality problems typically encountered with big data. Using these case studies and a Data Quality Framework from prior MITRE research, we evolve a new Big Data Quality Framework for use in data management of current and future big data initiatives.

    Session 4B E51-395
    4B-1 Hongyun Zhang, PhD, School of Management, Xi’an Jiaotong University
    Emerging Role of Chief Data Officers in the Era of Big Data
    Abstract: Many organizations have already established, or are beginning to consider, a new position called the Chief Data Officer (CDO). Is this genuinely a new role, or simply a new name for “CIO”? A recent study attempted to identify the roles and responsibilities of the CDO, describe the emerging practice of CDOs and determine how the CDO will manage enterprise data in a world of big data. The nine cases conducted so far have identified three key drivers and triggering events for the rise of the CDO: big data, organizational transformation and senior executive support. The study provides empirical evidence that the CDO's role is indeed different from the CIO's and suggests that the two roles should have a peer relationship.

    4B-2 W. Zong, Feng Wu and Nouman Muhammad, School of Management, Xi’an Jiaotong University
    Research on Improving Data Quality in Production Management Module During ERP Implementation Based on IPMAP

    4B-3 Gaoliang Tian, Yi Si and Xing Yang, School of Management, Xi’an Jiaotong University
    Cost-Benefit Analysis of the Accounting Information Quality Ensuring Project

    4B-4 Yuan Yuan and Gaoliang Tian, School of Management, Xi’an Jiaotong University
    Enterprise Accounting Information Quality Evaluation Based on the Accounting Information Product-Map

    Session 4C E51-376
    Chair: Steve Sarsfield
    4C-1 Michael Nicosia, VP F&A Strategy, Planning, Data & Process Governance – TIAA-CREF
    Preventing Accidents and Mayhem: A Practical Business Approach to Data and Process Governance
    Abstract: Financial services companies struggle with the quality of large data sets, demands for continual improvement and regulatory oversight. It turns out that a different approach to governance can provide relief. TIAA-CREF has instituted a policy of having the business take accountability for data and has implemented some simple “rules of the road” that significantly improve the organization's ability to manage its data and processes effectively. This session tells how they did it.

    4C-2 Shaun Brady, MITRE.
    Managing Information as a Strategic Enterprise Asset
    Abstract: This case study describes the need for data provenance among government regulators of the financial system. Our work is motivated by the needs of regulators like the Office of Financial Research (OFR), an organization within the U.S. Department of the Treasury with the mission of collecting, integrating, and analyzing diverse data in order to better track and analyze systemic financial risk. We then present an architecture for a Financial Modeling and Analysis Environment that addresses the provenance-capture challenge by integrating a provenance manager, an open-source data transformation tool, and a novel simulation execution environment. We briefly describe a prototype implementation and our future directions.

    Session 4D E51-372
    • Ahmed Abukhater, PhD, GISP, Global Director of Product Management, Pitney Bowes Software USA
    Turning Data into Operational Intelligence
    Abstract: This presentation will provide an overview of the concept of operational intelligence and offer future directions and examples of how it can be used as a platform to take advantage of the “big data” available to us in the pursuit of organizational productivity and efficiency. Data is not useful in its own right unless it is used to conduct meaningful analysis and reveal actionable insight. Insight capable of delivering business value can only be reached when data is operationalized in a way that produces measurable business impact. This can be accomplished when data is combined with location intelligence to support various business workflows and operations. In this way, stranded location value can be exposed and hidden nuances revealed, enabling us to connect people, places and things. Monetizing the value of data is a function of location data management best practices and operational efficiency, which creates opportunities to extend existing markets or create new ones.
  • 5:00 pm - 6:30 pm: Symposium Reception Sponsored by Global IDs at E51-345
July 18, 2013
  • 7:30 am - 8:50 am: Sponsor/Vendor Exhibits and Continental Breakfast & Registration - Ting Foyer
  • 8:50 am - 9:05 am: Annual MIT IQ and Data Science Survey - Wong Auditorium
    Yang Lee, Northeastern University; Associate Director, MIT IQ & Data Science Program
  • 9:10 am - 10:30 am: Plenary Session - CDO in Action: Success Stories and Lessons Learned - Wong Auditorium
    Chair: Derek Strauss, CDO, TD Ameritrade
    Panelists:
    • Mario Faria, CDO, Boa Vista, Brazil
    • Bruce Davidson, Vice-president for Performance Improvement, Hoag Health System
    • Frank Ponzio, Chairman & CEO, Symbolic Systems
  • 10:30 am - 10:45 am: Coffee Break
  • 10:45 am - 12:15 pm: Parallel Sessions

    Session 6A E51-057
    Chair: Annette Pence
    6A-1 James C. Meng, Deputy Assistant Secretary of the Navy for Business Architecture, Cost Standards & Intelligence Integration, Office of the ASN(FMC)
    Navy Data Standardization and Integration for Business Intelligence

    6A-2 Micheline Casey, Chief Data Officer at the Board of Governors of the Federal Reserve System
    Becoming a Data-Centric Organization: Why the Role of Chief Data Officer is Critical Now
    Abstract: CIOs are stewards of the organization's data, but few actually spend much time managing data. Instead, they’re focused on infrastructure and the portfolio of applications, which has grown in proportion to the amount of data. Truly data-centric organizations need to leverage data resources across the enterprise to support the business, and that demands high-level responsibility.

    This presentation provides a brief historical perspective on the role of the CIO and outlines what is needed now for the top data job. It covers organizational realignment, data-centric development, and the roles and responsibilities of the top data job and how it aligns with the rest of the organization.

    Session 6B E51-149
    Chair: Elisa Horbatuk
    6B-1 Maury DePalo, Director & Principal Consultant, Edgewater Healthcare Consulting Practice
    Achieving Accountable Care – Integrating IQ with Operations
    Abstract: The growth of accountable care organizations (ACOs) presents new challenges in data quality. ACOs agree to tie reimbursements to documented reductions in the cost of health care, but this requires them to combine clinical, operational, financial and research data from numerous sources. They must develop expertise in such areas as identifying unique patients; attributing care providers by their role in clinical encounters; baselining and tracking outcomes and expenditures across a range of settings; and equitably allocating shared risks and rewards to participating entities. This demands a robust program of information quality management and governance.

    This presentation explores these IQ challenges with a particular emphasis on how solutions are being devised in decentralized operating models.

    6B-2 David Harriman, Director of Performance Evaluation & Improvement, Newport Hospital
    Applying IQ Thinking to Configuring Clinical Information Systems
    Abstract: The healthcare professionals who use electronic health records (EHR) on the front lines are often faced with bewildering complexity and little understanding of how the systems they rely upon actually work. Forced to trust systems that they don't understand and can't control, users may experience a learned helplessness that leads to poor EHR practices and compromised care. Clinical knowledge workers who feel they must change their clinical practice to compensate for badly designed information systems will, at best, fail to realize the full benefits of such systems and, at worst, actively resist using them. This presentation offers practical examples of how education about the data supply chain, combined with a focus on patient care, can create a sense of urgency and a willingness to engage in process change.

    Session 6C E51-145
    Chair: Steve Sarsfield
    6C-1 David Hay, President, Essential Strategies, Inc. and Tom Redman, President, Havasink Enterprises
    Channels of Information and Financial Regulation
    Abstract: Just as a pilot depends on an understanding of the natural laws of aerodynamics to fly a plane, a corporate manager or regulator needs to understand communications and feedback loops in order to continually adapt and change a business. This is the science of cybernetics, and data quality is an essential element of it. This presentation explores ways to make communications channels as effective as possible, so that messages are received clearly and important information isn't lost because it's buried in a five-pound report. Understanding the natural laws of cybernetics is crucial to building a sustainable business. We explore their use in corporate and regulatory environments.

    6C-2 Paul Brown, Chief Architect, Paradigm4
    Avoiding Big-Data-Scale DQ Problems with the Array Database Model
    Abstract: Data quality problems are more than just typos and misspellings. Many quality problems originate in data models and database meta-models, and Big Data initiatives are particularly susceptible to them because Big Data tools generally offer relaxed meta-models amenable to “schemaless” implementations. From Paradigm4, maker of the massively scalable array database SciDB, Paul introduces the array data model, describes how it accommodates complex analytics on massively scalable data sets, and discusses how it avoids the data quality problems that arise in typical Big Data initiatives.

    Session 6D E51-151
    Chair: Dan Schutzer, Financial Services Roundtable & BITS
    • John Bottega, Chief Data Officer, Bank of America
    • Justin Magruder, President & Senior Managing Director, Noetic Partners
    • Michael Atkin, Managing Director, EDM Council
    Data Quality in Financial Services – Its Relationship to Risk Management, Better Decision-making and Regulatory Reporting
    Abstract: Financial industry regulators are demanding transaction transparency, financial stability and cross-asset market surveillance, which is driving firms to reexamine their data management processes and implement data governance programs. This session provides an overview of the data challenges associated with the changing global regulatory landscape and the need to ensure that the data used to provide transparency and support systemic risk analysis is comparable, appropriately classified and fit for purpose. Participants will learn how the financial industry is putting in place the steps needed to meet data quality and data governance obligations, and how data attributes can be separated from risk aggregation processes to ensure consistency and apples-to-apples alignment.

  • 12:15 pm - 12:45 pm: Pick Up Box Lunch - Ting Foyer
  • 12:45 pm - 1:45 pm: Plenary Panel Session - Data Quality Implications and Opportunities in National Health Care Transformation - Wong Auditorium
    Abstract: As the Baby Boom generation accelerates its migration into retirement, the health care delivery system of the United States faces significant challenges. It is estimated that 44 million new members will be added to the rolls of Medicare and Medicaid alone; currently, these programs handle approximately 22 million members. US providers and payers face additional challenges in cost-effectively provisioning the infrastructure and capacity necessary to serve this rapidly expanding population. It is widely recognized that major changes are needed in both the volume and method of medical service delivery, and organizations from the Federal Government down to home health care providers are looking to Information Technology and automation to help provide solutions. This includes significant expectations for advanced analytics that leverage increasingly large sets of high-quality data to help ensure that care outcomes, costs, and quality of service delivered are trending in the right direction. This lively panel discussion will shed some light on what is currently being done to ensure data integrity from three points of view – Federal Government, Private Sector, and State Government – sharing insights on emerging best practices, opportunities, and needs.
    Chair: Mark I. Johnson, CEO / Chief Business Development Executive, Gavroshe USA, Inc.
    Panelists:
    • Farzad Mostashari – Chief of the Office of the National Coordinator for Health Information Technology – United States Federal Government
    • Anthony Donofrio – Chief Technology Officer for Truven Health Analytics
    • Donald George, Chief Architect, Advanced Analytics Technology, PRISM Communication Systems, Inc.; former CTO, State of Georgia
  • 1:45 pm - 3:15 pm: Parallel Sessions

    Session 7A E51-149
    Chair: Annette Pence
    7A-1 Frank Ponzio, President & CEO, Symbolic Systems, Inc., and David Becker, Principal Information Systems Engineer, MITRE Corporation
    QA on Data Quality

    7A-2 Calla Knopman, Founder, Knopman IT Consulting, LLC
    Big Data in the Cloud
    Abstract: While big data is quickly becoming a ‘must have’ for most organizations to enable predictive analytics and reduce time to value, many are not able to deploy a solution effectively. This presentation will identify requirements for a successful implementation, point out some of the most common risks and mitigation strategies, and propose the cloud as an opportunity to get a jump start on your big data initiatives. Utilizing cloud services can be a tactical first step in testing the big data waters, calculating your potential ROI and defining your strategic goals and vision in the big data world.

    Session 7B E51-145
    Chair: Allen Juris
    7B-1 Naeem Hashmi, Founder & Chief Research Officer, Information Frameworks
    Information-Driven Health Ecosystem & Patient Safety
    Abstract: Healthcare is one of the world's most fragmented industries, and merely integrating clinical content from electronic health records (EHR) doesn't solve the quality problem. In fact, EHRs can now spread bad data faster than ever. As healthcare decisions are increasingly based on data derived from many sources, data quality has never been more important. In this session, the speaker shares his experiences tackling these challenges. Topics include:
    --Understanding the healthcare data domain and establishing controls;
    --Building a framework for clinical content harmonization and integration;
    --Embedding measurements to detect and address streaming content quality issues;
    --Improving patient safety through content syndication services

    7B-2 Peter Aiken, Founding Director, Data Blueprint
    Data – It Shouldn't Be This Hard: Lessons From the Trenches
    Abstract: When healthcare organizations fail to address data challenges, poor organizational data management maturity is often the root cause. This session provides four lessons from healthcare and non-healthcare industries that help establish a realistic data management plan:
    1. Organizational thinking must change and data management practices like quality control and governance must align to business needs.
    2. Organizations need to first understand and implement these prerequisites in order to build upon a solid foundation.
    3. Business practices and data governance are usually more important than tools.
    4. A process of continuous planning and feedback (agile) ensures that value is maximized throughout the development process.

    Session 7C E51-151
    Chair: Steve Sarsfield
    7C-1 Arun Sukuman, Sheffield Business School; Markus Helfert, Information Systems, Dublin City University; and Tony O’Brien, Sheffield University
    The Challenge to Tackle Information Risk: European Case Study
    Abstract: There are many frameworks, maturity models and metrics that look at the role of information quality in decision-making, but most follow a top-down approach. This session evaluates an integrated framework for bottom-up quality management, using a large UK-based organization as an example. We explore the influence of information quality on organizational performance and evaluate the leverage obtained from a focus on data quality.

    7C-2 Bob Schmidt, Data Steward for Capital Markets, Wells Fargo
    Data Valuation Methods from Wells Fargo
    Abstract: Businesses demand ROI, but putting a dollar value on data quality has long been a challenge. This session presents a valuation method patented by Wells Fargo that is designed to motivate good behavior by those who maintain databases. It rejects many traditional metrics in favor of activities that are broadly accepted as good for the business, aiming to provide a practical, measurable approach to valuation that puts a premium on good practices.

    Session 7D E51-345
    New Trends and Directions in Data Science
    Abstract: Data management and analytics now play a large role in organizations. But what is going to disrupt this market soon, and where do we go from here? This session will touch on those points and show some innovations you should be aware of.
    Chair: Mario Faria, MIT Data Science Initiative
    • J. Andrew Rogers, CTO, SpaceCurve
    • Matt Piekarczyk, CEO, Cortix Systems

  • 3:15 pm - 3:30 pm: Sponsor/Vendor Exhibits & Coffee Break
  • 3:30 pm - 5:00 pm: Multiple 30-Minute Case Studies

    Session 8A Public/Business Client Case Studies by Sponsors E51-149
    Chair: Robert Lutton, Sandhill Consultants

    8A-1. RISE(ing) to Make New Standards in Quality Education by Informatica
    Abstract: How the State of Colorado RISE program uses MDM, Data Quality and Big Data Analytics to drive better outcomes for its students.

    8A-2. Transforming the Enterprise Data Architecture in the Corporate Environment Using EM-SOS!™ by Sandhill Consultants
    Abstract: A major US corporation is in the process of transforming and globalizing its data architecture. Learn how it is implementing a robust data management life cycle, with associated standards, procedures and technology, to achieve high information quality and interoperability. A new set of standards and a life cycle framework have been developed using Sandhill Consultants’ Enterprise Modeling Set of Standards, EM-SOS! Benefits are already being realized in internal operations through improved collaboration between Business and IT, one of the key success factors for the transformation, and more benefits are expected to follow.

    Session 8B Health Care Client Case Studies by Sponsors E51-145
    Chair: Lewis Broome, Data Blueprint

    8B-1. Data Quality Visualization for Healthcare Master Data
    Speaker: Dr. Arka Mukherjee, Founder & CEO, Global IDs.
    Abstract: Most Healthcare organizations run their core business on a set of data objects, collectively called the Master Data Portfolio. This presentation describes how the Master Data Portfolio can be systematically measured, governed and visualized.

    8B-2. Valuating Data Initiatives in Health Care by Data Blueprint
    Abstract: The Health Care space offers the same opportunities as other types of businesses to demonstrate return on dollars invested in Data Management practices: increasing revenues or decreasing costs, investments that directly impact a facility’s or organization’s financial position. But the Health Care space also affords a unique opportunity to demonstrate other very tangible types of return on investment, or value outcomes. In this case study, Data Blueprint will describe multiple types of value produced through data-centric development and management practices. In a session that will appeal to both clinicians and administrators, financial returns, and returns in the form of lives saved through increased rates of bone marrow donor matches, will be used to justify Data Management and Quality initiatives.

    8B-3. Big Data in Healthcare – Pitfalls to Avoid by Dr. Satyam Priyadarshy, VP, Data Science, Acxiom Global Consulting
    Abstract: Healthcare certainly fits the 7 V’s model of Big Data, but the value to healthcare depends on the maturity of its Big Data practice. In this talk we will discuss aspects of a Big Data Maturity Model that can guide us in avoiding serious pitfalls when implementing and leveraging Big Data in healthcare. Use cases will be discussed.

    Session 8C Business & Finance Client Case Studies by Sponsors E51-151
    Chair: Mark Johnson, CEO, Gavroshe

    8C-1. SAP Governance Journey - Tina Tang, SAP
    Abstract: Get a glimpse of how a leading business software vendor practices what it preaches. In this session, we will describe how an internal program called One SAP for Data Quality has helped the company realize €31 million in business benefits since 2010. By instilling a culture of data ownership and quality at SAP, our goal is to increase employee productivity and drive margin improvement. Partnering with business leads, One SAP strives to achieve these goals by increasing the quality of enterprise master data across all lines of business. To drive information governance, SAP balances a centralized data governance council and center of expertise with decentralized regional execution by line-of-business and business-process owners, covering data ownership models, business requirements definition, and benefit delivery.

    8C-2. Our Journey to Improved Data Quality by Gavroshe & TIAA-CREF
    Abstract: This case study discusses the approach TIAA-CREF is using to improve information and data quality in ways that lead to better business outcomes. The journey begins with defining the opportunity and establishing a transformation program, then progresses through the framework and approach, which includes maturing Data Governance & Stewardship and developing a Playbook and Enterprise Data Services. During the session we'll discuss the roles of Metadata, Data Profiling and a Data Quality Dashboard as the technologies underpinning the Enterprise Data Services, and show how these capabilities are leveraged by Data Stewards and Project teams following the Playbook methodology.

    8C-3 Creating the Optimum Value from your Data Asset with DQ Scorecarding
    Speaker: Christiana Klingenberg, Uniserv GmbH
    Abstract: Companies focusing on data quality express a need to establish a data quality key performance indicator (KPI). Such a KPI makes it possible to measure and monitor data quality, as defined by individual business rules, across the entire company. Using the example of an international building materials manufacturer, we describe the specific data quality challenges of customer data and demonstrate how the DQ KPI is used to improve data quality. We explain the setup of the DQ KPI and show how the measured results of a single business rule can influence the final DQ KPI.
  • 5:00 pm - 6:30 pm: Partner Exhibit Reception - Ting Foyer
July 19, 2013
  • 7:30 am - 8:50 am: Sponsor/Vendor Exhibits and Continental Breakfast & Registration - Ting Foyer
  • 9:00 am - 10:15 am: MIT: The State of the CDO - E51-345
    Chair: Genia Neushloss, Gavroshe
    Rich Wang, Stuart Madnick, Yang Lee
  • 10:15 am - 10:30 am: Coffee Break - Ting Foyer
  • 10:30 am - 12:00 pm: Parallel Sessions

    Session 9A E51-057
    Chair: Annette Pence
    • Doug Whall, Senior Principal Architect, LinQuest
    A Framework for Achieving Better Data Quality By Aligning Data Governance Approaches in Big Data Environments
    Abstract: This presentation addresses the impact of big data on information quality and the broader IT environment. It outlines a framework for ensuring that governance principles keep pace with a world in which the four Vs of data - volume, velocity, variety, and veracity - are growing faster than ever. The speaker will explain how organizations facing big data, new technologies, new regulations, and new expectations about the value of enterprise data can achieve the right balance between planning and control while still enabling innovation and speed.

    Session 9B E51-149
    Chair: Michael Nix
    • Larry Mandelkehr, Director of Performance Improvement, UNC Medical Center
    Meaningful Use – View From the Inside
    Abstract: "Meaningful use" is a set of standards that are intended to motivate healthcare providers to make better use of electronic medical records, but achieving meaningful use goals can be challenging. Efforts to apply these standards can turn up differences in terminology, definitions and code sets, divergence between quality initiatives and difficulties in capturing and validating a hospital’s performance. This session first presents the challenges of collecting data that accurately describes patients and their experiences. It then uses examples to cover the importance of consistent data definitions and coordination between information services and clinical process owners in their application of meaningful use standards. We'll also describe examples of good project planning, variance-handling and audit preparation.

    Panel: Meaningful Use
    Chair: Michael Nix, Director of Analytics of the James M. Jeffords Institute for Quality and Operational Effectiveness, Fletcher Allen Health Care
    Panelists:
    • Adam Buckley, MD, Chief Medical Information Officer, Fletcher Allen Health Care
    • Larry Mandelkehr, Director of Performance Improvement, UNC Healthcare System
    • David Harriman, MA, CPHQ, Enterprise Information Management, Core Reporting Team, Lifespan Health Systems

    Session 9C E51-145
    David Eddy, Principal, Legacy Software, Ltd.
    David will present a sprint through a collection of events in the history of commercial information quality, touching on:
    - Mercantile pricing
    - Henry Ford’s calipers
    - General Georges Doriot’s “S” curve
    - Wang Laboratories
    - Consistent labeling
    - Data element naming conventions
    - Big Data

    Session 9D E51-151
    New Directions in Financial Services Standards: Feeding the CDO's Soul
    Abstract: With industry compliance and privacy directives flooding the infrastructure requirements of financial services (and other) organizations, IT standards have to mold themselves to fill the immediate needs of data governance and data exploitation. Standards have been delivered piecemeal -- XBRL has transformed corporate reporting; ISO 20022 and the LEI are starting to have the same effect for messaging and contract linkage; and the EDM Council and OMG's FIBO is likely to tie it all together, when it finally lands. This panel, with perspectives from the financial services industry, standards development organizations, IT vendors and government regulators, will explore a "big picture" of how IT standards can integrate and simplify the CDO's role.
    Chair: Said Tabet, Senior Technologist and Industry Standards Strategist, Corporate CTO Office, EMC Corporation
    • Richard Soley, CEO, OMG
    • David Saul, Chief Scientist, State Street Bank
    • David Blaszkowsky, Senior Policy Advisor, Office of Financial Research, U.S. Department of the Treasury