
Junior Developer - Data Infrastructure Engineering

About Akuna:
Akuna Capital is a young and booming trading firm with a strong focus on collaboration, cutting-edge technology, data-driven solutions and automation. We specialize in providing liquidity as an options market maker – meaning we are committed to providing competitive quotes at which we are willing to both buy and sell. To do this successfully, we design and implement our own low-latency technologies, trading strategies and mathematical models.

Our Founding Partners, including Akuna's CEO Andrew Killion, first conceptualized Akuna in their hometown of Sydney. They opened the firm’s first office in 2011 in the heart of the derivatives industry and the options capital of the world – Chicago. Today, Akuna is proud to operate from additional offices in Sydney, Shanghai, and Boston.

What you’ll do as a Junior Developer on the Data Infrastructure Team at Akuna:
Akuna Capital is seeking Junior Developers to join our Data Infrastructure Team in our Chicago office. At Akuna, we believe that our data provides a key competitive advantage and is a critical part of the success of our business. Our Data Infrastructure team is composed of world-class talent and has been entrusted with the responsibility of building and maintaining our data pipelines. Our pipelines start with gathering data from disparate sources across the globe and end with the generation of complex datasets used by our Trading, Quantitative and Support staff. Along the way, we build tools to access, validate and monitor the data in efficient and intuitive ways. At each step of our pipeline we choose the best tools and technologies available. In this role you will:
  • Work closely with Developers, Quants, Traders and Support personnel to create and maintain datasets that play a key role in our business
  • Design and build Scala/Python applications that provide performant access to our data
  • Monitor and validate our data pipelines by building monitoring and analysis tools
  • Gain exposure to the financial markets and trading through the development and use of our datasets
  • Evaluate and select open source or proprietary tools required to meet our data requirements

Qualities that make great candidates:
  • Bachelor's, Master's or PhD in a technical field (Computer Science/Engineering, Math, Physics or equivalent), completed by the start of employment
  • Graduating by August 2022
  • Demonstrated experience developing software on Linux using Java and/or Scala
  • Familiarity with Docker, Kubernetes, Kafka, Flink/Spark/Kafka-Streams, or cloud technologies is a plus
  • Ability to communicate complex technical topics in a clear and concise way
  • Passionate, pragmatic problem solver able to independently pursue solutions to complex problems
  • Understanding of the core concepts of distributed computing, database design, and data storage
  • Legal authorization to work in the U.S. is required on the first day of employment, including for F-1 students using OPT or the STEM OPT extension
 
Please note: If you have applied to multiple roles, you will be asked to complete multiple coding challenges and interviews.