Big Data Developer
Coherent Solutions, UAB

About Client

Our client is a company pioneering the future of cross-platform media measurement, arming organizations with the insights they need to make decisions with confidence. Central to this aim are its people, who work together to simplify the complex on behalf of clients and partners.

It is a trusted partner for planning, transacting, and evaluating media across platforms. With a data footprint that combines digital, linear TV, over-the-top, and theatrical viewership intelligence with advanced audience insights, its platform allows media buyers and sellers to quantify multiscreen behavior and make business decisions with confidence.

About Project

We collect and process more than 60 billion events each day, keep tens of petabytes online, and each month our processes read nearly an exabyte. We use this capability to provide our clients with deep insights that no other company can match. Data Engineering is the team responsible for managing this vast dataset.
Essential Duties and Responsibilities:

· We build data processing pipelines that handle 100+ terabyte datasets.
· We automate as much as we can so that we can stay focused on writing code.
· We troubleshoot and quickly resolve issues.
· We work with Analysts and Data Scientists to design and implement new methodologies.

As a Big Data Developer, you will:

· Design and maintain big data pipelines, web, and engineering solutions capable of running efficiently and robustly at petabyte scale.
· Work within an agile team to develop new ETL processes and data warehouse (DW) designs.
· Recommend and implement creative solutions for performance improvement.
· Increase scalability and maintainability to support rapid usage growth.
· Collaborate openly with stakeholders and clients to continuously improve the data usage experience.

We expect that you:

· have a Bachelor’s degree in computer science or a related field.
· have 2-4 years of experience in software development, including design, coding, and testing.
· have 2+ years of experience developing with Java and/or Scala.
· have 2+ years of experience with Hadoop and the Apache Spark framework.
· have experience with AWS cloud services such as EC2, ECS, EMR, Redshift, RDS, and Lambda.
· can demonstrate strong SQL skills.
· have knowledge of Python and workflow management tools such as Airflow (an advantage).

It would be great if:

· You treat performance as a feature and not an afterthought.
· You have experience writing analytical queries that run on MPP databases.
· You are comfortable reading query execution plans.
· You can describe multiple MapReduce join strategies and their tradeoffs.
· You have experience scaling Machine Learning algorithms in a distributed environment.
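Since the posting asks candidates to describe MapReduce join strategies and their tradeoffs, here is a minimal, self-contained Python sketch of the two classic approaches. The table names and data are hypothetical, and real jobs would use the Spark or Hadoop APIs rather than plain Python:

```python
from collections import defaultdict

users = [(1, "alice"), (2, "bob"), (3, "carol")]                  # small table
events = [(1, "click"), (2, "view"), (1, "view"), (4, "click")]   # large table

def broadcast_join(small, large):
    """Map-side (broadcast) join: ship the small table to every worker
    and probe it with a hash lookup -- the large table is never shuffled."""
    lookup = dict(small)  # assumes join keys in `small` are unique
    return [(k, lookup[k], v) for k, v in large if k in lookup]

def reduce_side_join(left, right):
    """Reduce-side join: both inputs are shuffled so that rows sharing a
    key land in the same bucket, then each bucket is combined -- works
    even when both tables are large, but pays the full shuffle cost."""
    buckets = defaultdict(lambda: ([], []))
    for k, v in left:
        buckets[k][0].append(v)
    for k, v in right:
        buckets[k][1].append(v)
    return [(k, lv, rv)
            for k, (ls, rs) in sorted(buckets.items())
            for lv in ls for rv in rs]
```

The tradeoff in one line: a broadcast join avoids shuffling the large table but only works while the small side fits in each worker's memory, whereas a reduce-side join handles two large tables at the cost of shuffling both.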

We offer

You will be rewarded with a salary starting from 3000 EUR gross per month; the final offer will depend on your experience and the responsibilities aligned with the role. Our employee benefits are oriented toward the wellbeing and security of our employees. The success of our company is its people, so it is of the utmost importance that they feel good. To support that, we additionally offer private health insurance, life and accident investment insurance, additional leave days, the opportunity to work from anywhere, workations, free lunches, language courses, company events, team-building activities, and much more!
