Data Engineer
F-Secure
Nov. 30, 2025
Helsinki
At WithSecure™, we protect businesses all over the world. Our SaaS solutions safeguard against modern cyber threats, and our innovative Co-security approach reflects our belief that true protection requires collaboration and shared expertise. No one can solve every cyber security problem alone. Our vision is to become Europe’s flagship in cyber security. Every day, our talented teams work to prevent cyber extortion, secure critical infrastructure, and stop the misuse of sensitive data. At WithSecure, it’s our people who make us exceptional – a diverse community that values passion, purpose, and a commitment to workplace well-being. If you’re ready to make an impact with a company that’s transforming cyber security, we’d love to hear from you.
Role Overview:
We are looking for a forward-thinking data engineering expert to design and maintain robust, scalable data platforms that power AI-driven solutions. This role requires a blend of technical depth, business understanding, and a consultative approach to enable enterprise-wide data democratization and AI adoption.
Key Responsibilities:
- Architect and implement modern data platforms leveraging Databricks and AWS for large-scale data processing and analytics.
- Design and optimize data pipelines for ingestion, transformation, and storage of structured and unstructured data (a minimal pipeline sketch follows this list).
- Enable AI-native solutions by ensuring data readiness for Generative AI and advanced analytics use cases.
- Collaborate with cross-functional teams to align data engineering practices with business objectives and AI strategy.
- Implement governance, lineage, and security controls to ensure compliance and trust in data assets.
- Drive performance optimization and cost efficiency across cloud-based data environments.
- Act as a trusted advisor, guiding stakeholders on data architecture best practices and emerging trends.
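To give a concrete feel for the pipeline work described above, here is a minimal PySpark sketch of a bronze-to-silver step in a medallion-style Delta Lake pipeline on Databricks. The S3 path, Delta locations, and column names are illustrative assumptions, not a description of our production environment.

```python
# Minimal sketch of a bronze -> silver step in a medallion-style Delta Lake
# pipeline on Databricks. Paths, table layout, and columns are hypothetical;
# a real pipeline would add schema enforcement, checkpointing, and governance.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Bronze: land raw JSON events as-is in a Delta location.
raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical S3 path
raw.write.format("delta").mode("append").save("/delta/bronze/events")

# Silver: clean and deduplicate, keeping one row per event_id.
bronze = spark.read.format("delta").load("/delta/bronze/events")
silver = (
    bronze
    .filter(F.col("event_id").isNotNull())
    .dropDuplicates(["event_id"])
    .withColumn("ingested_at", F.current_timestamp())
)
silver.write.format("delta").mode("overwrite").save("/delta/silver/events")
```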
Required Skills & Experience:
- Proven experience in building and managing data platforms at enterprise scale.
- Expertise in Databricks (Delta Lake, Medallion Architecture) and AWS services (S3, Glue, Lambda, EMR).
- Strong proficiency in SQL, Python, and distributed data processing frameworks (Spark).
- Familiarity with data governance, lineage tools, and security best practices.
- Experience enabling AI/ML workflows, including Generative AI data preparation (see the sketch after this list).
- Business acumen: ability to translate business needs into data solutions.
- Consultative mindset: skilled in stakeholder engagement and solution co-creation.
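As an illustration of the Generative AI data preparation mentioned above, the following sketch splits documents stored in a Delta table into overlapping text chunks ready for embedding. The Delta paths, column names, and chunk sizes are hypothetical and only show the general shape of such a job.

```python
# Minimal sketch of Generative AI data preparation: splitting documents from a
# Delta table into overlapping text chunks suitable for embedding. Paths,
# columns, and chunk parameters are hypothetical.
from pyspark.sql import SparkSession, functions as F, types as T

spark = SparkSession.builder.getOrCreate()

def chunk_text(text, size=500, overlap=50):
    """Split text into overlapping character windows."""
    if not text:
        return []
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

chunk_udf = F.udf(chunk_text, T.ArrayType(T.StringType()))

docs = spark.read.format("delta").load("/delta/silver/documents")  # hypothetical path
chunks = (
    docs
    .withColumn("chunk", F.explode(chunk_udf(F.col("body"))))
    .select("doc_id", "chunk")
)
chunks.write.format("delta").mode("overwrite").save("/delta/gold/doc_chunks")
```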
