Data Lifecycle Management

Preconfigured templates, Auto Assignment of Class & Characteristics, ISO 8000 and UNSPSC compliant

About Data Lifecycle Management

Data Quality Management aims to automate the standardization, cleansing & management of unstructured/free-text data by utilizing ASA (Auto Structured Algorithms) built on ICS's taxonomy and the catalog repositories of master data records.

Data Quality Management capabilities include, but are not limited to:
  • Analyze the source data content for completeness, consistency, redundancy, standardization, richness, etc. (a minimal profiling sketch follows this list)
  • Auto Assignment of Class & Characteristics from ICS's Taxonomy to each record
  • Extract the characteristic values & UOMs from the source descriptions for each record
  • Extract reference data from the source descriptions, such as Part#/Model#/Drawing#/Mnfr/Vendor, etc., for each record
  • Bulk review of materials (QC Tools & DQ Assessment)
  • Auto mapping of source data with ICS repositories & other reliable sources
  • Assign the data sets to relevant user groups based on various criteria
  • Capture additional information or validate processed/structured data
  • Provision to collect and update field data (Physical Walkdown)
  • Auto-generation of Short & PO text based on user-configured rules
  • Identification of redundant records
  • Export the data to be migrated to the target system(s)
  • Integrate in real-time with other systems
  • Data Quality assessment & progress reports
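
As a minimal illustration of the completeness and redundancy checks mentioned above, the Python sketch below profiles a handful of source records; the field names, sample data, and normalization rule are hypothetical and stand in for the actual ASA-driven assessment.

```python
from collections import Counter

# Hypothetical source records; field names and values are illustrative only.
records = [
    {"material_no": "100001", "description": "BEARING, BALL, 6204 2RS, SKF", "uom": "EA"},
    {"material_no": "100002", "description": "BEARING, BALL, 6204-2RS SKF",  "uom": ""},
    {"material_no": "100003", "description": "", "uom": "EA"},
]

def completeness(records, field):
    """Share of records with a non-empty value in the given field."""
    filled = sum(1 for r in records if r.get(field, "").strip())
    return filled / len(records)

def potential_duplicates(records):
    """Flag records whose normalized description occurs more than once."""
    normalize = lambda s: "".join(ch for ch in s.upper() if ch.isalnum())
    counts = Counter(normalize(r["description"]) for r in records if r["description"])
    return [r["material_no"] for r in records
            if r["description"] and counts[normalize(r["description"])] > 1]

for field in ("material_no", "description", "uom"):
    print(f"{field}: {completeness(records, field):.0%} complete")
print("Potential duplicates:", potential_duplicates(records))
```

In practice, checks of this kind would run over the full source extract and feed the DQ assessment and progress reports; records flagged as potential duplicates are reviewed rather than removed, as described under Cataloging Rules.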

Spares Data Acquisition

  • MM: Count 2,182 classes completed; Validation of ERP fields; Verification of ERP fields
  • VM: Validation and verification of General, Taxation and Banking details
  • CM: Validation and verification of General, Taxation and Banking details
  • SM: Validation and verification of ERP fields
  • PM: Count 2K+ classes completed; Validation and verification of ERP fields
  • HR: Validation and verification of Address fields

Data Quality Standards

The International Organization for Standardization (ISO) has approved a set of standards for data quality as it relates to the exchange of master data between organizations and systems. These are primarily defined in the ISO 8000-110, -120, -130, -140 and the ISO 22745-10, -30 and -40 standards. Although these standards were originally inspired by the business of replacement-parts cataloguing, they potentially have a much broader application. The ISO 8000 standards are high-level requirements that do not prescribe any specific syntax or semantics. The ISO 22745 standards, on the other hand, are a specific implementation of the ISO 8000 standards in extensible markup language (XML) and are aimed primarily at parts cataloguing and industrial suppliers. ICS Data Harmonization processes & methodologies comply with the ISO 8000 & ISO 22745 standards.

ICS utilizes the ICS Preferred Ontology (IPO) when structuring and cleansing Material, Asset/Equipment & Services Master records, ensuring that data deliverables comply with the ISO 8000 methodology, processes & standards for Syntax, Semantics, Accuracy, Provenance and Completeness.

Taxonomy

Throughout its 25 years of experience in master data solutions across different industries, ICS has developed the ICS Preferred Ontology (IPO), a Technical Dictionary that complies with the ISO 8000 standard. The IPO is a well-defined, industry-specific dictionary covering industry verticals such as Petrochemical, Iron & Steel, Oil & Gas, Cement, Transport, Utilities, Retail, etc.

ICS's Taxonomy consists of pre-defined templates. Each template consists of a list of classes (object-qualifier or noun-modifier combination) with a set of predefined characteristics (properties/attributes) per class. ICS will make the IPO (class/characteristics/abbreviations) available for general reference via the Data Harmonization Solution (DHS) and Master Data Ontology Manager (MDOM) tools.
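
To make the template idea concrete, the sketch below shows one way a class and its predefined characteristics could be represented, and how a short text might be auto-generated from populated characteristic values; the class name, characteristics, and generation rule are assumptions for illustration, not actual IPO content or DHS/MDOM behaviour.

```python
# Illustrative class template in the spirit of a noun-modifier taxonomy;
# the class and characteristics are assumed, not actual IPO content.
template = {
    "class": "BEARING, BALL",  # object-qualifier (noun-modifier) combination
    "characteristics": ["TYPE", "BORE DIAMETER", "SEAL TYPE", "MANUFACTURER"],
}

# A structured record with values captured per characteristic.
record = {
    "TYPE": "DEEP GROOVE",
    "BORE DIAMETER": "20MM",
    "SEAL TYPE": "2RS",
    "MANUFACTURER": "SKF",
}

def short_text(template, record, max_len=40):
    """Build a short text: class name followed by populated characteristic
    values in template order, truncated to a target-system field length."""
    parts = [template["class"]]
    parts += [record[c] for c in template["characteristics"] if record.get(c)]
    return ",".join(parts)[:max_len]

print(short_text(template, record))
# -> BEARING, BALL,DEEP GROOVE,20MM,2RS,SKF
```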

ICS reserves the right to make the following changes to the dictionary:

  • Change existing classes and/or characteristics of the IPO where necessary.
  • Register new classes and/or characteristics in the OTD, and add them to the IPO.
  • If changes are made to the IPO dictionary, ICS will only update the changes/additions in the MDPM software tool; no additional approval is required from the Client to incorporate the changes, as ICS manages the IPO dictionary according to industry standards.
  • The IPO dictionary is the intellectual property of ICS. In no way may it be edited, copied, compared, mapped, transmitted, imported/exported into other software/systems, or printed/published without the prior written permission of ICS. The IPO includes concepts, classes, terms, definitions, languages, abbreviations, data requirements, equivalences, images, data types, translations, and any data structures or relationships of the content stored within it.

Data Cleansing

Cleansing and structuring a material master is a highly specialized field, requiring the use of international standards such as eOTD, USC, EAN, ISO 8000, etc.; ICS Master Data Project Management is used for this purpose. Effective cleansing and structuring of a material master requires consistent and correct application of these standards across large volumes of data, which in turn requires specialized processes, methodologies and software tools.

The material master forms the basis for a myriad of business objectives. ICS understands the complex task of translating selected business objectives into master data requirements and subsequently designing a project that is focused on delivering optimal results in a cost effective way.

For a large number of line items, effective cleansing of the material master also requires cleansing and standardization of the associated manufacturers and/or suppliers. A vendor/supplier cleanup and standardization is therefore a logical step in the process.

ICS has its own specialized data refinery, ICS Data. ICS has developed superior technology and methodologies that are aimed at delivering the best possible quality, consistently and cost effectively.

In answering the market need for the cataloguing of services in a consistent and repeatable manner, ICS developed the world's first internationally proven standard for services cataloguing, the USC. Although this has now been accepted as part of the eOTD, the specific methodologies required to implement it successfully remain with ICS.

The material master, as well as other master data tables, requires standardized base tables for, amongst others, unit of measure, unit of purchase, material types and material groups. This is also a specialty of ICS.
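
As a small, assumed example of such a base table, the sketch below harmonizes raw unit-of-measure spellings to a single standard code; the mappings are illustrative, not ICS's actual base tables.

```python
# Minimal sketch of a unit-of-measure base table; mappings are illustrative only.
UOM_BASE_TABLE = {
    "EACH": "EA", "PC": "EA", "PCS": "EA",
    "MTR": "M", "METER": "M", "METRE": "M",
    "KGS": "KG", "KILOGRAM": "KG",
}

def standardize_uom(raw: str) -> str:
    """Map a raw UOM spelling to its standard code, or flag it for review."""
    key = raw.strip().upper()
    if key in UOM_BASE_TABLE:
        return UOM_BASE_TABLE[key]
    return key if key in UOM_BASE_TABLE.values() else "REVIEW"

print(standardize_uom("Pcs"))    # -> EA
print(standardize_uom("metre"))  # -> M
print(standardize_uom("drum"))   # -> REVIEW
```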

Data Cataloging

Data cataloging is classified into the following methods:

Basic Structuring (reference data extraction & allocation of an IPO-OTD class)

  • Reference values, such as manufacturer name and part number, drawing number, supplier references, and any other reference values of the item, are identified and captured. These are used to describe the item during purchasing.
  • Each item is assigned to an IPO class. This allows all items to be grouped, as defined by the dictionary, and to have corresponding templates assigned to them.

Advanced Structuring (Value extraction)

  • Allocation of IPO-OTD properties (template), extraction & population of attribute/property values and UOMs, and cleaning & structuring of free text, if any (a rough extraction sketch follows)
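
As a rough illustration of value extraction during Advanced Structuring, the sketch below pulls attribute values, UOMs, and a part-number-style reference out of a free-text description using simple patterns; the attribute names and regular expressions are assumptions, since production structuring is driven by the ASA engine and the IPO class templates rather than ad hoc rules.

```python
import re

# Hypothetical free-text source description.
description = "BALL BEARING SKF 6204-2RS BORE 20MM WIDTH 14 MM"

# Simple illustrative patterns: a dimensional value followed by its UOM,
# plus a part-number-like token. Real extraction is template-driven.
patterns = {
    "BORE DIAMETER": r"BORE\s*(\d+(?:\.\d+)?)\s*(MM|CM|IN)",
    "WIDTH":         r"WIDTH\s*(\d+(?:\.\d+)?)\s*(MM|CM|IN)",
    "PART NUMBER":   r"\b(\d{4}-[A-Z0-9]+)\b",
}

extracted = {}
for attribute, pattern in patterns.items():
    match = re.search(pattern, description.upper())
    if match:
        # Keep the value and, where present, its unit of measure.
        extracted[attribute] = " ".join(g for g in match.groups() if g)

print(extracted)
# -> {'BORE DIAMETER': '20 MM', 'WIDTH': '14 MM', 'PART NUMBER': '6204-2RS'}
```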

Enrichment

  • Data enrichment is performed with the help of genuine reference data extracted from the source data by means of external research, generating additional data from technical sources such as manufacturer's catalogs (PDF library) or the Internet (manufacturer's website, catalog cuts, etc.), as well as cross-verification and cross-validation of the data captured during Advanced Structuring. Further cleansing occurs through validation, cross-referencing, and harmonization.

Cataloging Rules

ICS follows the cataloging rules below:

  • Do not remove or delete any data provided by the client unless the data is duplicated. Duplicated in this context refers to the scenario where a word, concept, value, attribute, etc. is duplicated within a single description or text provided for an item.
  • Records are never deleted by ICS, but will be flagged as potential duplicates. It is the client's responsibility to verify and confirm whether items flagged as potential duplicates are indeed duplicates before removing them from the item master set.
  • Do not add extra values to client data unless researched from a source with integrity and authority. If ICS adds values to a client's master data item, ICS provides the source and authenticity for the added data.
  • If descriptions are incomplete, incorrect, or contain conflicting information, query the client before assigning class or values. ICS does not assign a class if the source description or information provided by the client is unclear. ICS seeks additional information or a decision from the client; record(s) with pending queries are kept on hold until the query is resolved.
  • Electronic Data Verification (EDV) is the process whereby the source data received from the client is processed into the cataloguing system via the eOTD dictionary, where the correct item name and approved attribute template are linked, and the data for the material item is populated into the template. Descriptions are then generated according to certain rules. There are different levels of cataloguing.