Certain states and localities require employers to post a reasonable estimate of salary range. A reasonable estimate of the current base pay range for this position is $133,800.00 to $165,000.00 annually. Actual salary will be based on a variety of factors, including shift, location, experience, skill set, performance, licensure and certification, and business needs. The range for this position in other geographic locations may differ. Certain positions may also be eligible for variable incentive compensation, such as bonuses or commissions, that is not included in the base pay.
The well-being of WWT employees is essential. So, when it comes to our benefits package, WWT has one of the best. We offer the following benefits to all full-time employees:
- Health and Wellbeing: Health, Dental, and Vision Care, Onsite Health Centers, Employee Assistance Program, Wellness program
- Financial Benefits: Competitive pay, Profit Sharing, 401k Plan with Company Matching, Life and Disability Insurance, Tuition Reimbursement
- Paid Time Off: PTO and Sick Leave (starting at 20 days per year) & Holidays (10 per year), Parental Leave, Military Leave, Bereavement
- Additional Perks: Nursing Mothers Benefits, Voluntary Legal, Pet Insurance, Employee Discount Program
We strive to create an environment where all employees are empowered to succeed based on their skills, performance, and dedication. Our goal is to cultivate a culture of belonging that encourages innovation, collaboration, and respect for all team members, ensuring that WWT remains a great place to work for All!
If you have any questions or concerns about this posting, please email taposting@wwt.com.
#LI-AM4
#LI-REMOTE
Why WWT?
At World Wide Technology, we work together to make a new world happen. Our important work benefits our clients and partners as much as it does our people and communities across the globe. WWT is dedicated to achieving its mission of creating a profitable growth company that is also a Great Place to Work for All. We achieve this through our world-class culture, generous benefits and by delivering cutting-edge technology solutions for our clients.
Founded in 1990, WWT is a global technology solutions provider leading the AI and Digital Revolution. WWT combines the power of strategy, execution and partnership to accelerate digital transformational outcomes for organizations around the globe. Through its Advanced Technology Center, a collaborative ecosystem of the world’s most advanced hardware and software solutions, WWT helps clients and partners conceptualize, test and validate innovative technology solutions for the best business outcomes and then deploys them at scale through its global warehousing, distribution and integration capabilities.
With over 14,000 employees across WWT and Softchoice and more than 60 locations around the world, WWT’s culture, built on a set of core values and established leadership philosophies, has been recognized 14 years in a row by Fortune and Great Place to Work® for its unique blend of determination, innovation and creating a great place to work for all.
Want to work with highly motivated individuals on high-performance teams? Join WWT today!
We’re seeking a hands-on Data Architect to lead the design, modernization, and governance of our analytics platform on Microsoft Fabric. You will define the target architecture across OneLake, Lakehouse/Data Warehouse, Direct Lake, Power BI, and Data Engineering experiences, while orchestrating migrations and integrations from Oracle, Snowflake, and other source platforms. This role blends deep technical architecture with practical delivery: you will partner with data engineers, BI developers, and business stakeholders to deliver trusted, performant, and governed data products.
Key Responsibilities
Architecture & Strategy
- Define end-to-end Fabric data engineering architecture (OneLake, Lakehouse, Warehouse, Delta tables, medallion layers) aligned to business domains, enterprise architecture, platform architecture, and data product strategy (a minimal medallion-layer sketch follows this list).
- Establish dimensional and semantic models for Power BI leveraging Direct Lake, Composite Models, and shared datasets.
- Create standards for data modeling, partitioning, indexing, and performance optimization across Fabric pipelines, notebooks, and warehouses.
- Develop reference architectures for batch, micro-batch, and streaming ingestion; choose the right pattern (Dataflows Gen2, Pipelines, Notebooks, Spark Structured Streaming).
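For illustration only, here is a minimal sketch of the medallion-layer promotion referenced above: a PySpark notebook step that reads a raw bronze Delta table, applies cleansing rules, and writes a curated silver table. The table and column names (bronze_orders, silver_orders, order_id, and so on) are hypothetical, not from this posting.

```python
# Hypothetical bronze -> silver promotion in a Fabric Spark notebook.
# All table and column names here are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # supplied by the Fabric runtime

# Read the raw (bronze) Delta table landed by an ingestion pipeline.
bronze = spark.read.table("bronze_orders")

# Cleanse: deduplicate on the business key, normalize types, drop bad rows.
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_amount") > 0)
)

# Write the curated (silver) layer, partitioned for downstream query pruning.
(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("silver_orders")
)
```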
Data Integration & Migration (Oracle & Snowflake)
- Lead migration paths from Oracle (e.g., PL/SQL-based systems) and Snowflake to Fabric Lakehouse/Warehouse; define incremental loads, CDC, and cutover strategies.
- Design robust ingestion using Snowpipe/Snowflake Tasks & Streams, Oracle CDC (e.g., GoldenGate), or landing via ADF/Fabric Pipelines to Delta Lake.
- Rationalize Snowflake objects (schemas/tables/stages) and Oracle PL/SQL logic into Spark/SQL transformations, reusable notebook patterns, and Dataflows Gen2 where appropriate.
- Implement secure, governed data sharing and zero-copy migration patterns, minimizing downtime and cost.
- Build reliable, real-time data pipelines using Kafka, covering event streaming architecture, streaming ingestion with Fabric and Spark, Kafka Connect and schema management, and the design of low-latency processing with Kafka Streams or Spark (see the streaming sketch after this list).
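As a rough sketch of that streaming ingestion pattern, assuming the Spark Kafka connector is available on the cluster: a Structured Streaming job that reads JSON events from a Kafka topic and appends them to a bronze Delta table. The broker address, topic, payload schema, and checkpoint path are all placeholder assumptions.

```python
# Illustrative Spark Structured Streaming job: Kafka topic -> Delta table.
# Broker, topic, schema, and paths below are placeholders, not real endpoints.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.getOrCreate()

# Assumed shape of the JSON payload carried on the topic.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "order-events")               # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers the value as bytes; parse the JSON payload into columns.
parsed = raw.select(
    F.from_json(F.col("value").cast("string"), event_schema).alias("e")
).select("e.*")

# Append to a bronze Delta table; the checkpoint makes the sink restartable.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "Files/checkpoints/order_events")  # placeholder
    .outputMode("append")
    .toTable("bronze_order_events")
)
query.awaitTermination()
```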
Governance, Security, & Compliance
- Operationalize the data catalog, lineage, classifications, and policies for Fabric and connected sources.
- Define RBAC, workspace and item-level security, row-level and object-level security for BI and warehouse artifacts.
- Establish data quality rules, observability (logging/metrics), SLAs, and error handling across pipelines and streaming jobs (a minimal quality-check sketch follows this list).
- Partner with InfoSec for encryption, key management, and compliance (e.g., HIPAA/PCI/SOX depending on industry).
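A minimal sketch of the quality-rule pattern referenced above: a PySpark check that logs row-level metrics and fails the job so the pipeline's error handling and alerting can take over. The table name, key column, and rules are hypothetical.

```python
# Minimal data quality gate for a pipeline step; names are placeholders.
import logging

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq_checks")

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("silver_orders")  # placeholder table

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
duplicates = total - df.dropDuplicates(["order_id"]).count()

# Emit metrics so observability tooling can track them per run.
log.info("rows=%d null_keys=%d duplicates=%d", total, null_keys, duplicates)

# Fail fast so the orchestrator's error handling and alerting fire.
if null_keys > 0 or duplicates > 0:
    raise ValueError(f"DQ failed: {null_keys} null keys, {duplicates} duplicates")
```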
Performance, Reliability & Cost Management
- Guide Direct Lake vs. Import vs. DirectQuery storage-mode decisions; tune warehouse compute, caching, and data layout.
- Optimize Spark jobs (partitioning, broadcast joins, caching), Delta Lake maintenance (Z-order, vacuum), and Power BI model performance (see the maintenance sketch after this list).
- Implement cost visibility and FinOps practices across Fabric capacities, Snowflake virtual warehouses, and Oracle licensing impacts.
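For the Delta maintenance items above, a sketch of the routine jobs involved, assuming a placeholder table name and a common filter column:

```python
# Illustrative Delta table maintenance; silver_orders and customer_id are
# placeholder names, and the retention window is the conservative default.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows on a frequent filter column so
# queries that filter on customer_id skip more files.
spark.sql("OPTIMIZE silver_orders ZORDER BY (customer_id)")

# Remove data files no longer referenced by the table and older than the
# retention window (168 hours = 7 days).
spark.sql("VACUUM silver_orders RETAIN 168 HOURS")
```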
Delivery & Leadership
- Translate business requirements into data product backlogs, architecture epics, and release plans.
- Provide hands-on guidance to data engineers and BI developers; perform code and design reviews.
- Evangelize best practices and Fabric-native approaches; create the architectural runway for future features.
- Collaborate with product owners, analytics leads, and enterprise architecture to ensure alignment and reuse.
Required Qualifications
- 10+ years in data engineering/architecture; 3+ years leading modern cloud data platforms.
- Deep expertise in Microsoft Fabric, including:
- OneLake, Lakehouse (Delta), Warehouse, Data Engineering (Spark notebooks), Data Factory (Pipelines), Dataflows Gen2, Power BI (Direct Lake, Composite Models), and Git integration.
- Strong Oracle experience: PL/SQL, schema design, performance tuning, partitioning, Oracle CDC tooling (e.g., GoldenGate), and migration to cloud data lakes/warehouses.
- Strong Snowflake experience: virtual warehouses, Time Travel, zero-copy cloning, Streams & Tasks, Snowpipe, role-based access control, data sharing, and performance tuning.
- Expertise in Delta Lake, Spark SQL/PySpark, SQL (analytical functions), data modeling (dimensional/star, data vault, semantic modeling).
- Hands-on experience with Azure services: ADLS/OneLake, Key Vault, Azure AD, Event Hubs/Kafka, Functions/Logic Apps (nice to have).
- Proven track record with data governance, catalog/lineage, security policies, and compliance.
- Strong communication skills with the ability to lead design sessions and influence senior stakeholders.
Preferred Qualifications
- Experience with policy authoring and lineage across Fabric, Snowflake, and Oracle sources.
- Familiarity with Databricks or Synapse (for comparison/interop) and migration trade-offs to Fabric.
- Experience implementing streaming architectures (e.g., IoT/real-time analytics).
- Background in domain-driven design and data product operating models.
- Certifications (nice to have):
- Microsoft Certified: Fabric Analytics Engineer Associate
- Azure Data Engineer Associate (DP-203)
- Snowflake SnowPro Core
- Oracle Database certifications
- TOGAF certification
Core Competencies
- Architectural Thinking: Systems design, trade-off analysis, future-proofing.
- Engineering Excellence: Code quality, testing, CI/CD, reproducible pipelines.
- Data Governance & Security: Policy-driven controls and auditable lineage.
- Business Acumen: Translate business questions to measurable data solutions.
- Collaboration & Leadership: Guide teams, mentor engineers, and drive consensus.
Tools & Technologies
- Microsoft Fabric: OneLake, Lakehouse/Warehouse, Data Engineering (Spark), Pipelines, Dataflows Gen2, Power BI.
- Oracle: PL/SQL, partitioning, performance tuning, GoldenGate/CDC, ODI.
- Snowflake: Warehouses, Tasks/Streams, Snowpipe, RBAC, data sharing, performance tuning.
- Programming: SQL, PySpark, Python, Groovy.
- DevOps: Git, CI/CD, Azure DevOps environment promotion, testing frameworks.
- Governance: Lineage, catalog, classifications, policies.
- Observability: Logging/monitoring, job metrics, data quality tooling.
Success Metrics (First 12 Months)
- Target Architecture Delivered: Fabric reference architecture documented and adopted.
- Migration Milestones: Priority Oracle/Snowflake workloads landed in OneLake/Delta with validated parity and SLAs.
- Performance Gains: Measurable improvement in model refresh times, query latency, and pipeline reliability.
- Governance Operationalized: Lineage coverage, policy enforcement, and RLS/OLS in production.
- Cost Optimization: Capacity right-sizing and workload scheduling with quarterly savings targets.