The Kimberly-Clark Information Technology Solutions (ITS) function is looking for a Data Architect.
The Data Architect will play a pivotal role in operationalizing Kimberly-Clark's most urgent data and analytics initiatives for its digital business. The bulk of the Data Architect's work will be building, managing and optimizing data pipelines, then moving those pipelines effectively into production for key data and analytics consumers.
The Data Architect also needs to ensure compliance with data governance and data security requirements while creating, improving and operationalizing these integrated and reusable data pipelines. This will enable faster data access, integrated data reuse and vastly improved time-to-solution for K-C's data and analytics initiatives. The Data Architect will be measured on their ability to integrate analytics and/or data science results with K-C's business processes.
This role requires creative and collaborative work with both IT and the business, evangelizing effective data management practices and promoting a better understanding of data and analytics. The Data Architect will work with key business stakeholders, IT experts and subject-matter experts to plan and deliver optimal analytics and data science solutions. Additionally, the Data Architect will be expected to collaborate with data scientists, data analysts and other data consumers, optimizing the models and algorithms they develop for data quality, security and governance and putting them into production, potentially leading to large productivity gains.
The role will report to the Head of the Analytics Center of Excellence within the IT Engineering and Applications organization and will have no direct reports.
This role requires interfacing with multiple lines of business as well as Data Scientists, Platform Engineers, Data Stewards, Functional and Solution Engineers, Enterprise Architects, and Security and Cloud Solutions teams.
Duties and Responsibilities:
- Demonstrate proficiency in multiple DevOps-related tools and technologies.
- Build Data Pipelines: Manage the creation, maintenance, training and optimization of data pipelines as workloads move from development to production for specific use cases. Architecting, creating and maintaining data pipelines will be the primary responsibility of the Data Architect.
- Drive Automation through effective metadata management: The data architect will be responsible for using innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. The data architect will also need to assist with renovating the data management infrastructure to drive automation in data integration and management.
- Learning and applying modern data preparation, integration and AI-enabled metadata management tools and techniques.
- Tracking data consumption patterns.
- Performing intelligent sampling and caching.
- Monitoring schema changes.
- Recommending — or sometimes even automating — existing and future integration flows.
- Collaborate across departments: The Data Architect will need strong collaboration skills to work with varied stakeholders across the organization. In particular, the Data Architect will work closely with data science teams and business (data) analysts to refine their data requirements and data consumption needs for various data and analytics initiatives.
- Educate and train: The Data Architect should be curious and knowledgeable about new data initiatives and how to address them. This includes applying their data and/or domain understanding to new data requirements, and proposing appropriate (and innovative) data ingestion, preparation, integration and operationalization techniques to optimally address those requirements. The Data Architect will also train counterparts in these data pipelining and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases.
- Participate in ensuring compliance and governance during data use: It will be the responsibility of the data architect to ensure that the data users and consumers use the data provisioned to them responsibly through data governance and compliance initiatives. Data architects should work with data stewards and participate in vetting and promoting content created in the business and by data scientists to the curated data catalog for governed reuse.
- Become a data and analytics evangelist: The senior data architect will be considered a blend of data and analytics “evangelist,” “data guru” and “fixer.” This role will promote the available data and analytics capabilities and expertise to business unit leaders and educate them in leveraging these capabilities in achieving their business goals.
Skills and Experience:
- B.A. or B.S. in Information Technology, Data Science, or related field.
- The ideal candidate will combine IT, data governance and analytics skills with a technical or computer science degree.
- At least 8 years of IT experience and 4 years or more of work experience in data management disciplines including data integration, modeling, optimization and data quality.
- Strong experience with advanced analytics tools and object-oriented/functional scripting languages such as [R, Python, Java, C++, Scala, others].
- Strong ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management.
- Strong experience with popular database technologies, including [SQL, SAP HANA] for relational databases, and certifications on [Azure HDInsight, Cosmos DB, Blob Storage] for nonrelational data stores.
- Strong experience in working with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies. These should include [ETL/ELT, data replication/CDC, message-oriented data movement, API design and access] and upcoming data ingestion and integration technologies such as [stream data integration, CEP and data virtualization].
- Strong experience in working with and optimizing existing ETL processes, data integration and data preparation flows, and helping to move them into production.
- Strong experience with streaming and message queuing technologies [such as Azure Service Bus and Kafka].
- Basic experience working with popular data discovery, analytics and BI software tools like [Tableau, Power BI and others] for semantic-layer-based data discovery.
- Strong experience in working with data science teams in refining and optimizing data science and machine learning models and algorithms.
- Demonstrated success in working with large, heterogeneous datasets to extract business value using popular data preparation tools.
- Demonstrated ability to work across multiple deployment environments including [cloud, on-premises and hybrid], multiple operating systems, and container and orchestration technologies such as [Docker, Kubernetes].
Interpersonal Skills and Characteristics:
- Strong leadership, partnership and communication skills
- Ability to coordinate with all levels of the firm to design and deliver technical solutions to business problems
- Ability to influence without authority
- Prioritization and time management