Data Engineering Intern

Responsibilities:
  • Develop extract, transform, and load (ETL) logic to automate data collection and manage data processes/pipelines, including data quality and monitoring
  • Contribute to the development of cloud-based data frameworks
  • Write and review technical documents, including requirements and design documents for existing and future data systems, as well as data standards and policies
  • Architect data pipelines
  • Collaborate with analysts, support/system engineers, and business stakeholders to ensure our data infrastructure meets constantly evolving requirements
Requirements:
  • BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience
  • High proficiency in Java, with good knowledge of its ecosystem and a solid understanding of object-oriented programming
  • Hands-on experience with Dataflow, BigQuery, Cloud SQL, BigTable, and Datastore
  • Experience with data processing software (such as Hadoop, Spark, Pig, and Hive) and with data processing algorithms (such as MapReduce and Flume)
  • Experience writing software in one or more languages such as Java, C++, Python, Go, and/or JavaScript
  • Experience managing internal or client-facing projects to completion, troubleshooting clients' technical issues, and working with engineering teams, sales, services, and customers
  • Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments
  • Experience in technical consulting
  • Experience architecting and developing software or internet-scale, production-grade data solutions in the cloud
  • Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, MongoDB, SparkML, TensorFlow)