The purpose of the job is to design and develop ETL solutions for the Global Liquidity Reporting System (GLRS). The candidate will be responsible for understanding requirements, designing, coding, and testing alongside the development team. This involves working closely with the business and SMEs to prioritize business requests, manage the Batch Processing development work slate, provide effort estimates, ensure timely delivery on committed items, and project-manage all aspects of software development according to the Software Development Lifecycle (SDLC).
The Global Liquidity Reporting System (GLRS) application provides a standardized and consolidated liquidity reporting solution to global treasury users. The GLRS application consumes data from GENESIS and ADS/SDRs to produce various liquidity reports for the FRB and PRA agencies. Treasury users use GLRS reports to analyze the firm's liquidity status, both to make internal funding decisions and to report to external agencies.
The key responsibilities for the candidate are:
Design and develop highly efficient ETL solutions based on business requirements and aggressive delivery timelines. Experience with Hadoop/Spark/Sqoop/Hive (or willingness to learn, backed by a solid understanding of distributed systems) is essential.
Understand business and functional requirements provided by Business Analysts and convert them into technical design documents that deliver on the requirements.
Ensure best practices are followed.
Prepare detailed test plans and ensure proper testing of each module developed.
Prepare handover documents; manage SIT with oversight of UAT and production implementation.
Identify and proactively resolve issues that could affect system performance, reliability, and usability.
Ensure process compliance and manage the expectations of leadership.
Explore existing application systems and determine areas of complexity and potential risks to successful implementation.
Build and provision Big Data clusters for SIT/UAT environments; design and test end-to-end big data pipelines.
Extract and analyse sample data from operational systems and ingest it into the big data Hadoop platform to validate user requirements and create high-level design documents.
Be willing to work flexible hours.
6 to 8 years of experience in Software development
Must have 3 years of experience in architecture, design, and development of big data platforms: Hadoop, Apache Spark, Hive, Hue, Query IT, Avro, Parquet, Cloudera, Datameer, Arcadia.
5 years of experience in the big data ecosystem: Hadoop, Spark and Storm, Elasticsearch, Apache NiFi, HBase, Cassandra, and MongoDB.
Or must have 5 years of experience in Ab Initio ETL and data management solutions, including the ability to understand the internal workings of a variety of Ab Initio components.
Good hands-on experience in the Java and Scala programming languages.
Expertise in building Big Data applications using Hadoop and Spark.
Good knowledge of the Pentaho reporting tool and its integration with Hadoop ecosystem components.
Strong experience in NoSQL data store design and implementation.
Admin knowledge is a plus.
Good experience in UNIX shell scripting and automation processes.
Strong experience in big data job scheduling with Oozie.
Strong design and execution bent of mind.
Candidate should possess a strong work ethic, good interpersonal, communication skills, and a high energy level.
Candidate should be an analytical thinker and a quick learner, capable of organizing and structuring information effectively.
Ability to prioritize and manage schedules under tight, fixed deadlines.
Excellent written and verbal communication skills.
Ability to build relationships at all levels.
Ability to work independently with vendors in resolving issues and developing solutions.
Strong interpersonal skills
Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
Strong work organization and prioritization capabilities.
Takes ownership and accountability for assigned work.
Ability to manage multiple activities.
Focused and determined in getting the job done right.
Ability to identify and manage key risks and issues.
Shows drive, integrity, sound judgment, adaptability, creativity, self-awareness and an ability to multitask and prioritize.
Good change management discipline.