Hello and welcome to my portfolio website! My name is Mehdi TAZI. I'm an Entrepreneur, Chief Technical Officer & Data Architect from Fez, Morocco, living in Paris, France.
I hold an Engineering degree in Software Engineering and a Master 2 degree in Distributed Information Systems. I am also an academic (a PhD student); my research focuses on the definition of a 'sensor virtualization and composition approach for the Internet of Things'. In addition, I am an author and reviewer of articles and books, mainly in the fields of technology and data architecture, sharing my expertise and insights with a wider audience.
Currently, I work as a Big Data/Cloud Architect. My role involves designing and evolving a Data Platform that incorporates concepts such as Data Mesh and Lakehouse, predominantly on AWS in a multi-cloud context, with a strong focus on governance, data quality, and UX. My daily tasks include setting up technical, functional, and organizational solutions, enhancing the data platform tooling in line with the company strategy, setting up guidelines, making the platform observable and well governed, onboarding new teams onto the platform, defining or validating their architectures, and assisting in the implementation of their use cases with a focus on cost and performance optimization. My role also involves managing projects, mainly using agile methodologies.
I've been passionate about computer science since the age of 14, and being a software architect is more than just a job for me. In addition, I find satisfaction in building and investing in businesses related to software, real estate, and stocks.
Aside from business and computer science, I enjoy learning new things, mainly in economics, science, and philosophy. During my time off, I like traveling and discovering new cultures.
I have been fortunate to build successful businesses and investments, which have given me valuable experience and insight into the world of entrepreneurship. I'm always on the lookout for new and innovative opportunities.
Throughout my career, I have been dedicated to delivering high-quality work and have successfully completed over 100 IT projects to date, showcasing my skills and abilities.
I also have a dedication to simplifying complex ideas. I've authored a book and reviewed others in the fields of software engineering and data architecture, while also sharing insights on Medium and synthesized publications on LinkedIn to make these fields more accessible.
I have a passion for exploring new cultures and have been fortunate to visit 44 countries so far, broadening my perspectives and gaining valuable experiences along the way.
In this section, I'll summarize my education, professional experience, and skills.
[Timeline entries: 2017 - 2023, 2022 - 2025, 2018 - 2022, 2017 - 2018, 2016 -, 2010 - 2011, 2006 - 2010, 2000 - 2045]
- Data Platform Design & Architecture: Consulting on and designing Data Platforms, Lakehouses, Data Lakes, and Modern Data Stack solutions. This includes organization, architecture, modeling, governance, security, monitoring, and data observability. Development and maintenance of the IT architectural strategy and enterprise vision, continually evaluating tech trends and business needs.
- Integration & High Availability: Designing and integrating existing systems, including BI, analytical and data science systems, with big data platforms. Ensuring high availability through BRP (Business Resumption Planning) and BCP (Business Continuity Planning), and designing highly available, reactive, real-time, secure, and governed solutions that maintain system integrity and continuity.
- Data Layer Design & Engineering: Designing data platform storage layers & zones. Designing ingestion, exchange, logging/archiving, modeling, and exposition features. I specialize in distributed/parallel scalable Big Data/NoSQL architectures, coding and consulting on architectural designs. Data Engineering expertise includes Java/Scala/Python with tools such as Airflow, Spark, dbt, AWS (EMR, Glue, Athena, Lambda), and Databricks; a minimal illustrative pipeline sketch follows this list.
- Automation, DevOps & Industrialization: Leading the industrialization and automation of infrastructures and projects using DevOps practices, Infrastructure as Code, and MLOps strategies. This includes implementing comprehensive engineering stacks and ensuring efficient CI/CD pipelines and IaC for rapid and reliable deployment cycles.
- Requirements, Monitoring & Observability: Collecting business requirements and setting up both functional and technical solutions. This involves implementing monitoring, audits, and data observability mechanisms to ensure system reliability, security, and governance while providing actionable insights through data metrics and analytics.
- Technical Writing & Presentations: Engaging in pre-sales activities and writing detailed technical specifications. Serving as a technical keynote speaker and trainer, delivering presentations that communicate complex ideas clearly and effectively to diverse audiences, ranging from technical teams to executive leadership.
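To give a concrete flavour of the data engineering work described above, here is a minimal, illustrative PySpark sketch of a batch ingestion job: raw files from a landing zone are lightly standardised and written to a curated Delta Lake table on S3. The bucket names, paths, and columns (amount, country) are hypothetical, and the snippet assumes a Spark environment with the delta-spark extension and S3 connectors already configured; it is a sketch of the pattern, not one of my production jobs.

```python
# Illustrative sketch only: landing-zone CSV files -> curated Delta Lake table.
# All bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("landing-to-curated-ingestion")
    # Delta Lake support, assuming the delta-spark package is on the classpath
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read raw files from the (hypothetical) landing zone
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-landing-zone/sales/2024/")
)

# Light standardisation: typed columns plus ingestion metadata for auditability
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("ingestion_ts", F.current_timestamp())
)

# Write to the curated zone as a partitioned Delta table
(
    curated.write
    .format("delta")
    .mode("append")
    .partitionBy("country")
    .save("s3://example-curated-zone/sales/")
)
```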
- Team Support: Accompanying teams as they set up and onboard their use cases.
- Leadership & Planning: Coach, lead, estimate, plan, organize, and monitor projects.
- Team Coordination: Facilitate meetings, motivate teams, and foster relations between and within teams.
- Communication: Communicate the goals and vision, and write reports for top management.
- Release & Backlog Management: Prepare releases and iterations, and update user stories and the backlog with the product owner.
- Product Breakdown: Split the product into use cases to prepare releases, then split use cases into tasks to prepare iterations.
- Resource Management: Establish resource availability and production capacity tables.
- Indicators: Generate indicators such as burn-up/burn-down charts, team velocity, task monitoring, code quality, and scope.
- Risk Management: Identify and manage project risks.
- Platforms: AWS, Azure, Databricks, Dataiku, Cloudera, Confluent, Aiven, etc.
- Storage & Formats: NoSQL: Cassandra, MongoDB, DynamoDB, CosmosDB, HBase, Redis, Postgres, RDS. File & object storage: HDFS, S3, ADLS Gen2, GCS. Formats: Delta Lake, Iceberg, Parquet, YAML, Avro, CSV, XML, JSON, etc.
- Processing & Streaming: Kafka, Kinesis, Event Hubs, Spark, Hadoop, AWS Glue, EMR, DMS, Lambda, EKS, dbt, Talend, etc.
- Consumption & Exposition: Athena, Impala, Trino/Presto, GraphQL/REST, Power BI, Tableau, QuickSight, SageMaker.
- Automation & Scheduling: Docker, Airflow, Ansible, CloudFormation, Azure DevOps, Jenkins, Terraform.
- Environments: On-premises, Cloud, Multi-Cloud, Hybrid architectures.
- Languages: Python, Scala, Java, SQL, CQL, JS.
Mehdi is a talented, reliable, and extremely knowledgeable data professional. His guidance and advice were instrumental in building and maintaining our big data platform at AXA. It is also extremely pleasant to work with Mehdi.
Mehdi participated in the design and implementation of our big data solution, for which I was responsible. He helped us define and quickly deploy a solution that effectively met our needs. He is an architect with several successful production deployments behind him, and I can only recommend his profile, which is rare today on big data subjects.
As part of Engie Global Markets' trading surveillance team, we started a new big data project with many new challenges. Mehdi helped us tremendously, not only on the technical side as the big data expert and cloud architect, but also on the human side and in project management. He is always attentive to the client and knows how to coordinate the different DevOps teams. A real success.