COMPANY DESCRIPTION: CACI serves the U.S. government by accelerating the transformation of health-related services through the application of new healthcare strategies and innovative technologies. We provide healthcare subject matter expertise, software development, systems integration, and other IT-related services to the Department of Health and Human Services (HHS), the Department of Veterans Affairs, and the Department of Defense Military Health System. Our HHS customers include the Centers for Medicare & Medicaid Services, the Centers for Disease Control and Prevention, the National Institutes of Health, and the Food and Drug Administration. We provide comprehensive support throughout Federal Healthcare, including benefits and payer services, public health, healthcare delivery systems, and medical logistics programs. CACI teams bring an in-depth, first-hand understanding of payers, providers, and patients to every project, delivering efficient, customer-centric solutions and services. CACI employs a diverse range of talent to create an environment that fuels innovation and fosters continuous improvement and success. Join CACI, where you will have the opportunity to make an immediate impact by providing information solutions and services in support of critical national missions. A member of the Fortune 1000 Largest Companies and the Russell 2000 Index, CACI provides dynamic careers for approximately 15,000 employees working in over 120 offices worldwide.

POSITION SUMMARY: The Hadoop Engineer - Level 2 plays a lead role in analyzing various data sources and developing and enhancing data streams for a large Enterprise Data Warehouse. The Hadoop Engineer is responsible for installing and configuring Big Data tools and developing ETL processes on the Hadoop platform. The successful candidate must have 5-7 years of experience in a large Data Warehouse environment, preferably with the Cloudera Hadoop distribution.
RESPONSIBILITIES:
- Enhances the traditional data warehouse environment with Hadoop and other next-generation Big Data tools.
- Provides expertise on database design for large, complex database systems using a variety of database technologies.
- Installs and configures Big Data servers, tools, and databases.
- Develops ETL requirements for extracting, transforming, and loading data into the Data Warehouse.
- Creates Interface Control Documents (ICDs) for new data streams.
- Creates ETL functional specifications that document source-to-target data mappings.
- Coordinates and collaborates with end users and business analysts to identify, develop, and validate ETL requirements.

EDUCATION & EXPERIENCE:
- Requires a bachelor's degree or equivalent.
- Requires 5-7 years of experience in a large Data Warehouse environment using Hadoop, HBase, Hive, Impala, Spark, Pig, Sqoop, Flume, and/or MapReduce.
- Sound knowledge of relational databases (RDBMS), SQL, and NoSQL databases.
- Exposure to a Teradata Data Warehouse environment.
- Data modeling and database design experience.
- Experience providing IT application development and systems implementation services to federal customers.
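For candidates unfamiliar with the term, the "source-to-target data mappings" documented in ETL functional specifications can be sketched in miniature. The example below is purely illustrative: the table and column names are invented, and it uses Python's built-in SQLite in place of the Hadoop-stack tools (Sqoop, Hive, Spark) an actual CACI warehouse pipeline would use.

```python
import sqlite3

# Hypothetical ETL sketch: extract from a staging table, apply a documented
# source-to-target mapping, and load into a warehouse target table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a staging table standing in for a raw source feed.
cur.execute("CREATE TABLE stg_claims (claim_id TEXT, amount_cents INTEGER, state TEXT)")
cur.executemany(
    "INSERT INTO stg_claims VALUES (?, ?, ?)",
    [("C1", 12500, "va"), ("C2", 990, "md"), ("C3", 40000, "va")],
)

# Transform + Load: apply the mapping (cents -> dollars, state codes
# upper-cased) while inserting into the target table.
cur.execute("CREATE TABLE dw_claims (claim_id TEXT, amount_usd REAL, state TEXT)")
cur.execute(
    "INSERT INTO dw_claims "
    "SELECT claim_id, amount_cents / 100.0, UPPER(state) FROM stg_claims"
)

rows = cur.execute(
    "SELECT claim_id, amount_usd, state FROM dw_claims ORDER BY claim_id"
).fetchall()
print(rows)  # each source row lands transformed in the target table
```

At Hadoop scale the same mapping would typically be expressed as a Hive or Spark SQL statement over HDFS-backed tables, but the specification document it implements looks the same: one transformation rule per target column.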