Citi is looking for a Developer to help design and develop an enterprise-level data and analytics platform. The developer will be part of a team building a scalable platform used by our clients for discovery, access, and analytical processing of data. The platform will support batch and real-time analytics and expose its capabilities through a set of APIs. It will also provide sandboxing capabilities so clients can further process the data.
The candidate will play a central role in the development of the scalable data platform used for discovery, access, and analytics on cross-business data.
Team members will join a global team with a presence in the US, Warsaw, and other Citi locations.
The role requires good analytical skills to filter, prioritize, and validate potentially complex material from multiple sources.
Citi also has strict coding and engineering standards, from proper unit testing to continuous integration; the candidate should be familiar with the relevant tools.
You will have the chance to learn the whole business flow of an Investment Bank.
Key Responsibilities:
– Develop and maintain financial software applications.
– Take responsibility for the timeline and quality of the delivered solution.
– Be responsible for the design and development of key components in the big-data analytics platform.
– Build good, healthy relationships with clients and other teams.
– Follow Citi's engineering standards across all phases of software development.
· 3+ years of programming experience in Java, with a good understanding of core Java
· Excellent communication and interpersonal skills
· Proficiency in both spoken and written English
· Banking and securities domain knowledge would be an added advantage

Skills Required:
· Programming experience in one or more application or systems languages (Python, Java, Scala, etc.)
· A significant background in functional programming, preferably in Scala or Python
· Distributed systems design experience, including an understanding of distributed systems concepts and principles
· The Hadoop ecosystem of tools (Spark, Hive, Impala, MapReduce, etc.)
· Experience extending and implementing core functionality and libraries in data processing platforms (Hive/Pig UDFs, Spark / Spark SQL, etc.)
· A strong understanding of different storage architectures and their appropriate application
· Database performance concepts such as indices, segmentation, projections, and partitions
· A commitment to writing understandable, maintainable, and reusable software
· Willingness to learn new languages and methodologies
· Experience working with business partners and engineers to gather, understand, and bridge definitions and requirements
· An innate desire to deliver and a strong sense of accountability for your work

Qualifications:
· Bachelor's degree in science, computers, information technology, or engineering
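To give candidates a feel for the functional-programming and data-partitioning skills listed above, here is a minimal, self-contained Python sketch. The record layout, field names, and partition count are hypothetical, chosen only for illustration; a production platform would of course use a distributed engine such as Spark rather than in-memory dictionaries.

```python
from functools import reduce
from collections import defaultdict

# Hypothetical trade records of the kind a cross-business data platform might hold.
trades = [
    {"desk": "rates", "notional": 1_000_000},
    {"desk": "fx", "notional": 250_000},
    {"desk": "rates", "notional": 500_000},
]

def partition_by(records, key, num_partitions):
    """Assign each record to a partition by hashing its key field,
    mirroring how a distributed store spreads rows across nodes."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[hash(rec[key]) % num_partitions].append(rec)
    return dict(partitions)

def total_notional(records):
    """Functional-style aggregation: map each record to its notional,
    then reduce by summing."""
    return reduce(lambda acc, n: acc + n, map(lambda r: r["notional"], records), 0)

parts = partition_by(trades, "desk", num_partitions=4)
print(total_notional(trades))  # 1750000
```

Note that records sharing a key always hash to the same partition, which is what makes per-key operations (joins, group-bys) local to one node in a real distributed system.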