|Job ID||Job Title||Job Responsibilities|
|EAD01||ETL Application Developer||• Perform requirement analysis and impact analysis for business requirements, translate requirements into technical specification documents, and prepare Level of Effort documents to ensure that the planned end-to-end activities are accomplished per client needs.
• Develop and deploy big data code to ingest external data from source systems into the Enterprise Data Warehouse using Informatica BDM and StreamSets. Standardize the ingested data in the Source Standardizing zone for detailed processing using big data technologies including Informatica BDM, Spark (Scala), and PySpark scripts.
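The standardization step described above can be sketched in plain Python. This is a minimal illustration only; in practice the role performs this work in Informatica BDM or PySpark, and all field names below are hypothetical:

```python
# Minimal sketch of source-data standardization for a landing zone
# (hypothetical field names; a real pipeline would use Informatica BDM
# or PySpark DataFrames rather than plain dicts).

def standardize_record(raw: dict) -> dict:
    """Normalize one ingested record for the Source Standardizing zone."""
    return {
        # Normalize column names: trim, lowercase, underscore-separate
        key.strip().lower().replace(" ", "_"):
        # Trim whitespace on string values; leave other types untouched
        value.strip() if isinstance(value, str) else value
        for key, value in raw.items()
    }

records = [{" Customer Name ": "  Acme Corp ", "Order Total": 125.0}]
standardized = [standardize_record(r) for r in records]
# → [{'customer_name': 'Acme Corp', 'order_total': 125.0}]
```

The same trim-and-rename logic maps directly onto PySpark column expressions when applied at scale.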
• Design developments and enhancements of source-system processes for data loading into the EDW. Maintain and enhance existing feeds using technologies such as Informatica BDM, Teradata, ETL/Informatica, and Control-M. Diagnose software issues, solve deep multidimensional problems, and improve the performance of the Enterprise Data Warehouse.
• Perform production deployments. Identify long-running production jobs and fine-tune them to improve job performance.
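The job-identification step above might look like the following sketch. The job names and threshold are hypothetical; in practice the runtimes would come from Control-M logs or the Spark history server rather than a hard-coded dict:

```python
# Flag production jobs whose runtime exceeds a threshold (hypothetical
# data; real monitoring would pull these figures from Control-M logs
# or the Spark history server).

def long_running_jobs(runtimes_min: dict, threshold_min: float) -> list:
    """Return names of jobs exceeding the runtime threshold, slowest first."""
    slow = {job: t for job, t in runtimes_min.items() if t > threshold_min}
    return sorted(slow, key=slow.get, reverse=True)

nightly = {"ingest_orders": 42.0, "standardize_customers": 190.5, "load_edw": 310.0}
print(long_running_jobs(nightly, threshold_min=120))
# → ['load_edw', 'standardize_customers']
```

Jobs flagged this way become candidates for tuning (partitioning, indexing, or Spark configuration changes).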
• Conduct code and design review sessions to ensure quality. Debug errors from job abends related to EDW data ingestion and analytical processing.
• Will work in Research Triangle Park, NC and/or various client sites throughout the U.S. Must be willing to travel and/or relocate.
• Minimum requirements: Must have a Bachelor’s Degree in Computer Science, Computer Engineering, or a related field and one year of experience working with ETL, Control-M, and Spark. Must be willing to travel and/or relocate.