Tech Problem Formulation: Requires knowledge of analytics, big data analytics, and automation techniques and methods; Business understanding; Precedents and use cases; Business requirements and insights. To identify possible options to address business problems within one's discipline through relevant analytical methodologies. Demonstrate understanding of use cases and desired outcomes.
Understanding Business Context: Requires knowledge of industry and environmental factors; Common business vernacular; Business practices across two or more domains (for example, product, finance, marketing, sales, technology, business systems, and human resources), with in-depth knowledge of related practices; Directly relevant business metrics and business areas. To support the development of business cases and recommendations. Drive delivery of project activities and tasks assigned by others. Support process updates and changes. Support, under guidance, the resolution of business issues.
Data Quality Management: Requires knowledge of data quality management techniques and standards; Business metadata definitions and content data definitions; Data profiling tools, data cleansing tools, data integration tools, and issues and event management tools; Understanding of user's data consumption, data needs, and business implications; Data modeling, storage, integration, and warehousing; Data quality framework and metrics; User access best practices; Enterprise data architecture, modeling and design, storage, integration, and warehousing; Enterprise data quality framework and metrics; Enterprise data strategy; Enterprise data quality strategy; Enterprise strategy to address regulatory and ethical requirements and policies around data privacy, security, storage, retention, and documentation.
To promote data quality awareness. Profile, analyze, and assess data quality. Test and validate data quality requirements under the supervision of others. Execute operational data quality management procedures under the supervision of others. Conduct data cleansing activities to remove data quality defects, improve data quality, and eliminate unused data under the supervision of others. Grant user access to data. Learn company and regulatory policies on data. Learn data governance processes, practices, policies, and guidelines.
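The profiling and cleansing duties above can be sketched in a few lines. This is an illustrative example only: the record set, the `email` field, and the `profile` and `cleanse` helpers are invented for the sketch and are not part of any specific data quality toolchain.

```python
# Minimal sketch of data profiling and cleansing on a toy record set.
# Dataset and field names are illustrative, not from any real system.

def profile(records, field):
    """Count nulls and distinct non-null values for one field."""
    values = [r.get(field) for r in records]
    nulls = sum(1 for v in values if v in (None, ""))
    distinct = len({v for v in values if v not in (None, "")})
    return {"nulls": nulls, "distinct": distinct}

def cleanse(records, field):
    """Drop records whose field is missing; trim whitespace otherwise."""
    cleaned = []
    for r in records:
        v = r.get(field)
        if v in (None, ""):
            continue  # data quality defect: remove the record
        cleaned.append(dict(r, **{field: v.strip()}))
    return cleaned

records = [
    {"id": 1, "email": " a@example.com "},
    {"id": 2, "email": ""},
    {"id": 3, "email": "b@example.com"},
]
print(profile(records, "email"))   # {'nulls': 1, 'distinct': 2}
print(len(cleanse(records, "email")))  # 2
```

In practice the profiling step would feed the quality metrics mentioned above, and the cleansing rules would come from the data quality framework rather than being hard-coded.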
Data Transformation and Integration: Requires knowledge of internal and external data sources, including how they are collected, where and how they are stored, and their interrelationships, both within and external to the organization; Techniques such as ETL batch processing, streaming ingestion, scrapers, APIs, and crawlers; Data warehousing services for structured and semi-structured data, and MPP databases such as Snowflake, Microsoft Azure, Presto, or Google BigQuery; Pre-processing techniques such as transformation, integration, normalization, and feature extraction, to identify and apply appropriate methods; Modeling techniques such as decision trees, random forests, and advanced regression methods such as LASSO; Cloud and big data environments such as EDO2 systems. To identify and understand suitable extraction software. Review data from a quality perspective based on the guidelines given. Support data processing.
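As an illustration of the pre-processing techniques named above, here is a minimal min-max normalization pass inside a toy extract-transform-load flow. The column values and the `min_max_normalize` helper are assumptions made for the example.

```python
# Illustrative pre-processing step: min-max normalization of one numeric
# column during a toy extract-transform-load pass.

def min_max_normalize(values):
    """Scale values to the [0, 1] range; constant columns map to 0.0."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# Extract (here: an in-memory source), transform, then "load" (print).
extracted = [12.0, 18.0, 30.0]
transformed = min_max_normalize(extracted)
print(transformed)  # [0.0, 0.3333..., 1.0]
```

A real pipeline would pull the extract from one of the sources listed above and write the transformed column to the warehouse, but the transform step itself has this shape.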
Code Development and Testing: Requires knowledge of coding languages such as SQL, Java, C++, Python, and others; Testing methods such as static analysis, dynamic analysis, software composition analysis, manual penetration testing, and others; Business and domain understanding. To write code to develop the required solution and application features by using the recommended programming language and leveraging business, technical, and data requirements. Test the code using the recommended testing approach.
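The write-then-test loop described above can be sketched with a small feature and its tests. The requirement (deduplicate a list while preserving order) and the `dedupe` function are invented for the illustration.

```python
# Sketch of the development-and-testing loop: a small feature written to a
# stated requirement, then exercised with assertion-based tests.

def dedupe(items):
    """Remove duplicates, keeping the first occurrence of each item."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

# Tests against the stated requirement.
assert dedupe([3, 1, 3, 2, 1]) == [3, 1, 2]
assert dedupe([]) == []
print("all tests passed")
```

In a real codebase these checks would live in a test suite (for example, under a unit-testing framework) rather than inline, but the discipline — code written to a requirement and validated by tests — is the same.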
Data Source Identification: Requires knowledge of the functional business domain and scenarios; Categories of data and where they are held; Business data requirements; Database technologies and distributed datastores (for example, SQL, NoSQL); Data quality; Existing business systems and processes, including the key drivers and measures of success. To support the understanding of the priority order of requirements and service level agreements. Help identify the most suitable source of data that is fit for purpose. Perform initial data quality checks on extracted data.
Data Governance: Requires knowledge of data processes and practices; Data value chains (identification, ingestion, processing, storage, analysis, and utilization); Data modeling, storage, integration, and warehousing; Data quality framework and metrics; Regulatory and ethical requirements around data privacy, security, storage, retention, and documentation; Business implications of data usage; Data strategy; Enterprise regulatory and ethical policies and strategies.
Data Modeling: Requires knowledge of cloud data strategy, data warehouses, data lakes, and enterprise big data platforms; Data modeling techniques and tools (for example, dimensional design and scalability, Entity Relationship diagrams, Erwin); Query languages (SQL/NoSQL); Data flows through the different systems; Tools supporting automated data loads; Artificial Intelligence-enabled metadata management tools and techniques. To analyze complex data elements, systems, data flows, dependencies, and relationships to contribute to conceptual, physical, and logical data models. Develop logical and physical data models, including data warehouse and data mart designs. Define relational tables, primary and foreign keys, and stored procedures to create a data model structure. Evaluate existing data models and physical databases for variances and discrepancies. Develop efficient data flows. Analyze data-related system integration challenges and propose appropriate solutions. Create training documentation and train end users on data modeling. Oversee the tasks of less experienced programmers and provide system troubleshooting support.
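To make the "relational tables, primary and foreign keys" duty concrete, here is a minimal physical model built in SQLite (Python standard library). The dimensional table/column names (`dim_product`, `fact_sale`) are invented for the example and do not describe any particular warehouse.

```python
# Illustrative physical model: a dimension table and a fact table joined
# by a primary key / foreign key relationship, built in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per-connection
conn.execute("""
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE fact_sale (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES dim_product(product_id),
        quantity   INTEGER NOT NULL
    )
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'widget')")
conn.execute("INSERT INTO fact_sale VALUES (10, 1, 3)")

# The foreign key now rejects orphan facts.
try:
    conn.execute("INSERT INTO fact_sale VALUES (11, 999, 1)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

In a production warehouse the same conceptual/logical/physical progression applies, with the physical layer targeting the platform's own DDL rather than SQLite.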
Master Data and Metadata Management: Requires knowledge of users' data consumption, data needs, and business implications; Data modeling, storage, integration, warehousing, and relational databases; Data value chains (for example, identification, ingestion, processing, storage, analysis, and utilization); Master data, golden records, data hierarchies, and connections to transactional data; Business, technical, process, and operational metadata architecture, standards, definitions, and repositories; Regulatory and ethical requirements and policies around data privacy, security, storage, retention, and documentation; Master data management applications, database management systems, data and process modeling tools, relational database tools, data profiling tools, and data integration tools.
To understand business domain master data integration and metadata requirements. Implement master data management solutions and manage the metadata environment. Manage and maintain metadata standards and data rules. Manage and maintain "golden" records. Lead changes and revisions to master data, metadata, and data hierarchies and affiliations within given guidelines. Implement integration of new data sources and metadata. Replicate and distribute master data. Distribute and deliver metadata. Query, report on, and analyze metadata. Educate others on master data and metadata management processes, practices, policies, and guidelines.
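The "golden record" maintenance described above amounts to a survivorship rule for merging source records. The sketch below uses one common rule — most recently updated non-empty value wins — but the rule, the field names, and the `golden_record` helper are all assumptions for illustration; real MDM platforms make survivorship configurable.

```python
# Toy survivorship rule for building a "golden" master record: for each
# field, keep the most recently updated non-empty value across sources.

def golden_record(records):
    """records: list of dicts, each carrying an 'updated' sort key."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value not in (None, ""):
                merged[field] = value  # later records overwrite earlier ones
    return merged

sources = [
    {"name": "A. Smith", "phone": "", "updated": 1},
    {"name": "Alice Smith", "phone": "555-0100", "updated": 2},
]
print(golden_record(sources))  # {'name': 'Alice Smith', 'phone': '555-0100'}
```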
Demonstrates up-to-date expertise and applies this to the development, execution, and improvement of action plans by providing expert advice and guidance to others in the application of information and best practices; supporting and aligning efforts to meet customer and business needs; and building commitment for perspectives and rationales.
Provides and supports the implementation of business solutions by building relationships and partnerships with key stakeholders; identifying business needs; determining and carrying out necessary processes and practices; monitoring progress and results; recognizing and capitalizing on improvement opportunities; and adapting to competing demands, organizational changes, and new responsibilities.
Models compliance with company policies and procedures and supports company mission, values, and standards of ethics and integrity by incorporating these into the development and implementation of business plans; using the Open Door Policy; and demonstrating and assisting others with how to apply these in executing business processes and practices.
Option 1: Bachelor's degree in Computer Science or related field and 2 years' experience in data modeling, data analytics, data warehousing, software engineering, or a related field. Option 2: 4 years' experience in data modeling, data analytics, data warehousing, software engineering, or a related field. Option 3: Master's degree in Computer Science or related field.
Preferred Qualifications: Data modeling; Master's degree in Computer Engineering, Computer Science, or Information Systems; Relevant industry experience (for example, retail, supply chain, eCommerce, healthcare).
508 SW 8TH ST, BENTONVILLE, AR 72712, United States of America