Job #2338: Chameleon Technologies is seeking a Data Engineer III for our Corporate Data and Analytics team. In this role, you will work on fast, streaming data to enable analytics and data science, allowing for near-instantaneous personalization opportunities and highly informed decisions. You will work with Tier 1 data to create, modify, and test the code, forms, and scripts that construct the data sets driving our analytic capabilities.

Responsibilities:
  • Solve business and data engineering problems using data centric programming and scripting skills to create data models and pipelines
  • Consult with the business to understand the needs, pace, and direction of our business partners; translate these needs into requirements and specifications; and maintain contact with customers through project completion
  • Lead all phases of data solutions: architecture, conceptualization, design, development, testing, and production support
  • Architect, design, and build best-in-class production processes that ensure the security, efficiency, and availability of analytic tools and data
  • Collaborate with data scientists and analysts to further understand business problems
  • Program and manage APIs for data exchange
  • Lead and conduct unit and system testing to ensure design is still relevant and implementation is producing a useful, maintainable, reliable product
  • Mentor and train other team members by introducing them to new technologies, methods and learning resources
  • Act as a resource to technical community as well as business partners


Qualifications:

  • Bachelor’s degree in computer science, computer engineering, or a similar field
  • 5 years’ relevant experience in data integration, design, and management
  • Experience with the following as they apply to our sources/targets (Teradata, Netezza, Azure, Amazon Redshift):

    o SQL

    o ETL tools

    o Scripting languages

  • 3 years’ experience in data integration, design, and/or management using NoSQL/Hadoop (preferred)
  • Experience providing data integration services within healthcare organizations
  • Strong understanding of change data capture (CDC) methodology and working experience with a data replication tool (Attunity Replicate, Oracle GoldenGate, etc.)
  • Significant industry knowledge of healthcare specific regulatory requirements for data management and knowledge of health insurance concepts and terms
  • Ability to lead and participate in analytic projects
  • Advanced analytical and problem-solving skills, plus data processing and programming skills across SQL-based and Hadoop-based technologies
  • Strong understanding of web services
  • Excellent problem definition, problem-solving, and technical writing skills
  • Strong written and verbal communication skills and the ability to work with management at various levels
