Our client is an international company that builds integrated programs for a healthy lifestyle and body weight control. The company has been operating for more than 50 years and has offices in 30 countries. Offline group meetings combined with modern technologies help users follow individually designed, convenient programs.
DataArt's specialists will work on creating data lake-based solutions using GCP, S3, GCS, BigQuery, Kinesis, Looker, Apache Beam, Spark, Docker, and Kubernetes.
We are hiring the specialist not just for this project, but for DataArt as a whole. When the project is over, or if after some time you are no longer satisfied with it, you can discuss transitioning to another project with the managers.
Responsibilities:
Working with Product Owners to understand business requirements and use cases.
Designing an architecture (logical and physical) for technical and business solutions.
Solving issues that arise while designing and implementing the architecture.
Assessing the relevance of tools for the use case, comparing tools/providers and making recommendations.
Reviewing database schemas, data models, and data architecture.
Prototyping solutions for defined use cases.
Consulting colleagues and business partners on technical questions.
Required Skills and Experience:
Excellent knowledge of Java, Scala, and Python.
Experience in architecture design for large, scalable solutions in data management, data processing, data architecture, or analytics.
Understanding of the best practices of Data Lake and analytical architecture design.
Hands-on experience with Spark, Apache Beam, Airflow, Docker, Kubernetes.
Development experience in the Amazon Web Services environment (Kinesis, Kinesis Firehose, Lambda, S3, IAM, Athena) plus Looker.
Development experience in the Google Cloud Platform environment (Dataflow, Dataproc, Data Studio, Storage Transfer Service, Cloud Storage, IAM).
Practical experience in large environments.