Don't hesitate to send your CV! Candidates who don't fully meet every requirement also apply for this role.
Salary
1,800 EUR/month. The mid-level role starts at 1,800 EUR and the senior role at 2,600 EUR. Salaries vary based on relevant experience and skills. A 13th-salary bonus is paid based on individual and company performance.
About your role
For nearly 15 years, we’ve partnered with the biggest tech brands to get their ideas to market faster. Now we’re building the foundation for our next phase of growth: we have just completed an acquisition with GoHealth, a fast-growing healthtech company and a leader in technical solutions for health insurance mediation. We have a robust digital platform that covers the entire customer life cycle and helps millions of customers access affordable health insurance. We work with the latest enterprise technology and tools, prize collaboration in all that we do, and push the boundaries with every new project.
GoHealth is looking for Data Engineers to join our team. We produce a lot of data every day, and we need you to help us build systems that process that data and provide every part of our business with the data it needs to get the job done. Technology plays a key role in our growth, and we are constantly modernizing our stack in a thoughtful, business-driven way. If you have relevant experience with big-data frameworks, data streaming, and batch processing, and you feel excited about working with lots of data, this is a perfect fit for you!
Your day as a Data Engineer:
Let's start your day coding a little bit; after all, that's our passion. You need to automate a pipeline using batch and streaming services to get our data where it needs to go. After a few hours of total focus on your new project, you'll take a short break to eat some chocolate and play video games with your team. Hurry, though, because the next knowledge-sharing session is about to start, and oh, you're the speaker! After spreading the knowledge, a quick stop for a perfect lunch with your new friends. You can't wait to go back and code a bit more. By the way, it's the first time you've worked with something like this, and Stack Overflow couldn't answer your questions, so you discuss it with your mentor. After getting unblocked and coding a bit more, you take a break and stop in the meeting room for our daily stand-up with Slovak and American folks to share updates; we use Kanban as part of our process. But oh no, you just realized a few new tickets landed on your plate, so off to a meeting to discuss them. At the end of the day, you grab a beer with our American colleagues, who are here as part of our exchange program. You love this start-up, family-like energy.
● You work with stream-processing and batch-processing frameworks and tools to create data workflows
● You write complex SQL queries and work with NoSQL databases
● You perform unit testing, system integration testing and assist with user acceptance testing
● You create and maintain detailed documentation of the technical design, operational support and maintenance procedures for all data pipeline tasks
● You ensure data quality and compliance with development, architecture, reporting, and regulatory standards throughout the entire data pipeline
● You collaborate with the rest of the Engineering Team, subject matter experts and department leaders to understand, analyze, build and deliver the data they need to power the business
● Bachelor’s Degree in Computer Science or equivalent experience required
● 3+ years of experience in software design and development
● Hands-on experience in an object-oriented language, preferably Java or Python
● Experience with data stream processing
● Knowledge and experience working with a variety of data stores and formats (relational, non-relational, flat files, CSV, Excel)
● Written and spoken communication skills in English (B2 level)
● Experience consuming data from web services using SOAP and REST, and working with HTML, XML, and JSON
● Experience with a streaming platform such as Apache Kafka or AWS Kinesis
● Experience with a batch data processing and workflow orchestrator, such as Airflow, Luigi, Pinball, Chronos, or similar
● Experience with different kinds of databases: SQL (MySQL, SQL Server), NoSQL (MongoDB, DynamoDB, HBase), columnar (Redshift, Parquet), and in-memory (Redis)
● Experience with Amazon Web Services
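To give a flavor of the stream-and-batch work described above, here is a minimal, hypothetical sketch in Python: it parses a simulated stream of JSON records, applies a simple data-quality check, and groups the valid records into batches for downstream loading. This uses only the standard library for illustration; the actual stack (Kafka, Airflow, AWS) from the listing is not shown, and all names here are made up.

```python
import json
from typing import Iterable, Iterator

def parse_events(lines: Iterable[str]) -> Iterator[dict]:
    """Parse a stream of JSON lines, skipping malformed records."""
    for line in lines:
        try:
            event = json.loads(line)
        except json.JSONDecodeError:
            continue  # in a real pipeline, route to a dead-letter queue instead
        if "user_id" in event:  # minimal data-quality check (hypothetical rule)
            yield event

def batch(events: Iterable[dict], size: int) -> Iterator[list]:
    """Group events into fixed-size batches for downstream loading."""
    bucket = []
    for event in events:
        bucket.append(event)
        if len(bucket) == size:
            yield bucket
            bucket = []
    if bucket:  # flush the final, possibly partial batch
        yield bucket

# Simulated stream: two valid records, one malformed, one failing the check
stream = [
    '{"user_id": 1, "action": "signup"}',
    'not json',
    '{"action": "click"}',
    '{"user_id": 2, "action": "quote"}',
]
batches = list(batch(parse_events(stream), size=2))
print(batches)  # the two valid events, grouped into one batch
```

In a production workflow the `lines` iterable would typically come from a streaming platform consumer and the batches would be written to a warehouse, but the shape of the problem is the same.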
If you want more than “just a job”, and you are ready to play an important role in an upwardly mobile, innovative product development company, this is your opportunity to leverage and grow your unique strengths, talents, and skills.