Data Architect
We are seeking an experienced Data Architect to lead the design and implementation of our fintech data infrastructure. In this role, you will define and build scalable data architectures spanning transactional (OLTP) systems and analytical (OLAP) platforms. You will focus on data modeling, data warehousing, and distributed data systems to ensure our financial data is structured for high performance and reliability. The ideal candidate has a proven track record in real-time data streaming and large-scale databases, ensuring data quality, security, and compliance in a fast-paced financial technology environment.
Responsibilities
- Architect and Implement Data Solutions: Design and implement comprehensive data architecture solutions to support fintech applications and analytics. Develop and maintain enterprise data models (conceptual, logical, physical) that ensure consistency and data quality across systems.
- Data Warehouse & Lake Management: Build and oversee robust data warehouses and data lakes to consolidate disparate data sources. Optimize data storage and retrieval strategies (e.g. schema design, indexing) for high-volume analytics and reporting workloads. Ensure data lakehouse architectures (e.g. Delta Lake) are effectively utilized for both structured and unstructured data.
- Distributed Systems Design: Develop a scalable distributed data architecture that separates transactional processing and analytical workloads, ensuring each is tuned for its purpose (operational databases vs. analytics databases). Leverage cloud-native data services to handle growth and maintain low-latency performance for mission-critical fintech operations.
- Real-Time Data Streaming: Enable and enhance real-time data processing capabilities. Design streaming data pipelines using platforms like Apache Kafka, Apache Flink, Spark Streaming, or NATS to ingest and process financial transactions and events in real time. Ensure near-instant data availability for time-sensitive applications.
- Data Integration & ETL Pipelines: Oversee end-to-end data integration (ETL/ELT) processes, ensuring seamless data flow between internal systems and external partners. Implement data transformation pipelines that handle both batch and streaming data to feed downstream applications and analytics efficiently.
- Data Governance and Quality: Establish and enforce data governance standards and best practices. Implement measures for data quality, data lineage, and master data management to ensure the accuracy, consistency, and security of sensitive financial data. Ensure compliance with relevant regulations and standards for data in the financial industry (e.g., ensuring PCI compliance for payment data).
- Performance Tuning and Optimization: Monitor and optimize database performance across various data stores (SQL and NoSQL). Identify and resolve bottlenecks in query performance, indexing, and storage to handle high volumes, velocities, and varieties of data. Refactor existing database schemas and queries to improve throughput and scalability as the platform grows.
- Technical Leadership: Provide technical leadership and guidance to data engineering teams and other developers. Collaborate with cross-functional teams (engineering, analytics, product) to translate business requirements into technical data solutions. Lead architecture reviews, set data standards and reference architectures, and mentor junior engineers on best practices in data architecture.
- Innovation and Best Practices: Stay current with emerging data technologies and industry trends. Evaluate new tools and frameworks (e.g. evolving big data platforms or streaming technologies) for potential adoption. Continuously improve the data architecture by incorporating best-of-breed solutions to enhance scalability, reliability, and maintainability of the fintech data platform.
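To illustrate the kind of streaming responsibility described above, here is a minimal sketch of windowed aggregation over a transaction event stream. Pure Python stands in for a Kafka/Flink pipeline, and the event fields and account names are hypothetical:

```python
from collections import defaultdict

def aggregate_by_window(events, window_seconds=60):
    """Group transaction events into fixed time windows and sum amounts
    per account -- a toy stand-in for a Kafka/Flink windowed aggregation."""
    windows = defaultdict(lambda: defaultdict(float))
    for event in events:
        # Bucket each event by the start of its time window.
        window_start = event["ts"] - (event["ts"] % window_seconds)
        windows[window_start][event["account"]] += event["amount"]
    return {w: dict(per_account) for w, per_account in windows.items()}

# Hypothetical transaction events (timestamps in seconds).
stream = [
    {"ts": 10, "account": "A", "amount": 100.0},
    {"ts": 25, "account": "A", "amount": 50.0},
    {"ts": 70, "account": "B", "amount": 200.0},
]
print(aggregate_by_window(stream))  # {0: {'A': 150.0}, 60: {'B': 200.0}}
```

In a production pipeline the same grouping logic would run inside a stream processor with event-time watermarks rather than in-memory dictionaries.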
Required Skills
- Extensive Experience: 8+ years of experience in data architecture, data engineering, or related roles, with a strong history of designing complex data systems. (Financial services or fintech industry experience is highly valued.)
- Data Modeling Expertise: Deep expertise in data modeling and database design. Ability to create and maintain ER diagrams, relational schemas, dimensional models (star/snowflake schemas), and other modeling artifacts that accurately represent business data. Experience establishing data standards for how data is defined and organized across the enterprise.
- Data Warehousing & Lakehouse: Strong knowledge of data warehousing concepts and architecture (OLAP systems, MPP databases). Hands-on experience building or using modern data warehouse platforms such as Snowflake, Google BigQuery, Amazon Redshift, or Delta Lake for large-scale analytics. Familiarity with designing data lakes and lakehouse architectures to manage raw and processed data.
- Distributed Systems (OLTP/OLAP): Solid understanding of distributed system design for both transactional (OLTP) and analytical (OLAP) workloads. Proven ability to architect solutions that handle high concurrent transactions and large analytical queries in parallel. This includes knowledge of concepts like sharding, replication, and eventual consistency in data stores.
- Relational & NoSQL Databases: Proficiency in working with both relational databases and NoSQL databases. Strong SQL skills and experience with databases such as PostgreSQL, MySQL, Oracle, as well as NoSQL stores like MongoDB and Cassandra. Capable of designing schema and indexing strategies tailored to each type of data store for optimal performance and reliability.
- Streaming & Big Data Technologies: Hands-on experience with real-time streaming and big data processing frameworks, such as Apache Kafka, Apache Flink, Apache Spark (including Spark Streaming), or NATS messaging. Ability to design data pipelines that handle both batch and streaming data, and integrate streaming data with enterprise data platforms.
- ETL/ELT & Data Integration: Strong background in data integration techniques and tools. Experience designing and optimizing ETL/ELT processes to efficiently move and transform large data sets between operational systems, data warehouses, and data lakes. Familiarity with data integration frameworks or tools (e.g. Apache NiFi, AWS Glue, Talend) is a plus.
- Data Governance & Security: In-depth understanding of data governance principles and practices, including data quality management, metadata management, and regulatory compliance requirements. Knowledge of how to secure sensitive data (encryption, access control) and ensure compliance with financial data regulations.
- Cloud Platforms: Experience deploying and managing data infrastructure in cloud environments. Familiarity with cloud data services on AWS, GCP, or Azure (such as AWS RDS/Redshift, Google BigQuery, Azure Synapse, etc.) and their ecosystem for storage and analytics. Ability to leverage cloud scalability and tools for data processing (e.g. AWS Lambda, Databricks on Azure).
- Programming and Scripting: Competency in programming or scripting for data tasks, such as Python, Java/Scala, or SQL scripting. Ability to automate data processing steps and write custom data transformations. (Experience with distributed computing paradigms like MapReduce or Spark is a plus.)
- Analytical Mindset: Excellent problem-solving skills with the ability to analyze complex data issues and performance problems. Comfortable troubleshooting across an entire data pipeline, from ingestion to storage to query, to ensure reliability and efficiency.
- Education: Bachelor's degree in Computer Science, Information Systems, or a related field (Master's degree preferred). Strong computer science fundamentals (data structures, algorithms, systems design) as applied to data-intensive applications.
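As context for the dimensional-modeling skills listed above, here is a minimal star-schema sketch using SQLite in Python. The table and column names are illustrative, not a prescribed model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per merchant (an illustrative star-schema dimension).
cur.execute("CREATE TABLE dim_merchant (merchant_id INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per transaction, keyed to the dimension.
cur.execute("""CREATE TABLE fact_txn (
    txn_id INTEGER PRIMARY KEY,
    merchant_id INTEGER REFERENCES dim_merchant(merchant_id),
    amount REAL)""")

cur.executemany("INSERT INTO dim_merchant VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO fact_txn VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)])

# Typical analytical query: aggregate the fact table grouped by a dimension.
rows = cur.execute("""
    SELECT m.name, SUM(f.amount)
    FROM fact_txn f JOIN dim_merchant m USING (merchant_id)
    GROUP BY m.name ORDER BY m.name
""").fetchall()
print(rows)  # [('Acme', 150.0), ('Globex', 75.0)]
```

The same fact/dimension split underlies star and snowflake schemas in warehouse platforms such as Snowflake or BigQuery, where the fact table is typically partitioned and clustered for scan efficiency.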
Preferred Qualifications
- FinTech / Financial Domain Expertise: Experience in the financial services or fintech industry with a strong understanding of financial data and processes. Familiarity with banking/payments data, trading systems, or regulatory reporting is a plus.
- Advanced Degree or Certifications: Master's degree in a relevant field (e.g., Data Science, Computer Science) is a plus. Professional certifications in data architecture or cloud technologies (for example, AWS Certified Data Analytics, Google Professional Data Engineer, or Azure Data Engineer certification) are advantageous.
- Master Data Management & Governance: Experience with master data management (MDM) solutions and data governance tools. Knowledge of industry-standard data models or frameworks for finance (e.g., IBM Banking Data Warehouse, FIBO) can be beneficial in accelerating data design.
- Big Data Ecosystem: Familiarity with big data ecosystems and tools not already mentioned. Experience working with Hadoop clusters, Apache Hive, or Spark-based analytics in a large-scale environment is a plus. Exposure to data lakehouse platforms like Databricks or cloud analytics stacks beyond core warehouses will be valued.
- DevOps and CI/CD: Understanding of modern software development practices as they relate to data. Experience with Agile methodologies and using CI/CD pipelines for deploying data pipeline code or database changes. Experience containerizing data services or using orchestration (Docker, Kubernetes) for deploying data infrastructure is a plus.
- Leadership and Collaboration: Demonstrated ability to lead data architecture initiatives and work collaboratively in a cross-functional team. Experience mentoring data engineers or leading a team of data professionals in delivering on a data strategy is highly regarded. Excellent communication skills to effectively convey complex data concepts to both technical teams and business stakeholders.
Job Type: Full-time
Pay: Rs700,000.00 - Rs1,400,000.00 per month
Experience:
- data architecture: 8 years (Preferred)
Information:
- Company: Traderware
- Position: Data Architect
- Location: Karachi
- Country: PK
Post Date: 2025-05-09 | Expired Date: 2025-06-08