Databricks Migration – FREELANCE / VAT-registered contractor (P.IVA) (Full remote)
For one of our major global clients in the digital solutions development sector, we are looking for:
Databricks Migration – FREELANCE
Full remote – occasional travel to Florence (Italy) may be required for major brainstorming sessions
Start Date: ASAP
⏳ Until December 2026
Full time
English speaker / native / bilingual
Main responsibilities
The candidate will be part of a Data team and will be responsible for the Databricks migration.
Technical Skills:
Databricks:
Deep hands-on experience and technical fluency with Databricks, including:
• setting up and configuring Databricks workspaces
• creating metastores, schemas, and catalogs with Unity Catalog
• creating RBAC: configuring scalable roles, groups, and policies
• building pipelines and workflows using PySpark, Delta Lake, and Unity Catalog (beyond notebooks)
• choosing the right clusters, configurations, and runtimes for the right workloads
• performance-tuning complex pipelines and optimizing workflows
Advanced SQL & Data Modeling:
Expert-level SQL skills (optimization, indexing), mastery of data modeling techniques (dimensional, star schema, snowflake), and the ability to translate them into modern big data SQL / Databricks SQL.
Cloud & Modern Data Stack:
Proficiency in working with AWS, Airflow, and dbt.
Database Diversity:
Proven experience in handling and working efficiently across various RDBMS platforms (PostgreSQL, SQL Server, Oracle, MySQL).
Non-Technical Skills:
Effective Communicator:
Possesses the ability to translate complex technical concepts into clear business value for non-technical stakeholders and proactively aligns with all involved parties.
Strategic Thinker:
Does not just execute; self-leads projects from ideation to actionable solution, balancing technical debt with rapid delivery.
Problem Solver:
Thrives in fast-paced, ambiguous environments; proactively finds solutions, proposes options, and aligns with teams; pays meticulous attention to detail.
SECONDARY:
DevOps & Engineering:
Strong command of Docker/Kubernetes and well versed in implementing and releasing production artefacts with CI/CD best practices (preferably with GitLab).
Governance:
Experience with data versioning, schema evolution, and distributed metadata management.