Job Description

Giza Systems

Data Architect - Design Office

Job id: 566900

19 Feb 2025

Job Location

Saudi Arabia

Experience

10 to 20 years

Qualification Level

Graduate

Job Function

IT - Software

Skillset

Big data technologies, data management solutions, and data virtualization

Jobseeker Nationality

Jobseekers from any country

The main purpose of the Solution Architect position is to design software solutions, provide technical leadership to software delivery teams, and be accountable for the customer's technical acceptance of the delivered solution.

- Participates in vendor assessment and selection.

- Participates with the presales team in the proposed solution design during the bidding phase.

- Prepares the “team” Scope of Work (SoW) proposal write-up for bidding in software projects.

- Prepares “team” professional services sizing, assumptions, and pre-requisites for bidding in software projects.

- Participates in customer demonstrations and presentations to discuss our software solutions and convince the customer of their value.

- Attends requirements-gathering workshops and prepares business requirements documents.

- Prepares the project's strategy documents (e.g., configuration management strategy, migration strategy, go-live strategy, etc.).

- Participates in and reviews the testing strategy document.

- Prepares high-level design documents, including end-to-end solution architecture and integration scenarios, with the help of the technical architects.

Proven experience as an architect and engineering lead in the Data & Analytics stream

In-depth understanding of data structure principles and data platforms

Problem-solving attitude and solution mindset, with implementation expertise

Working experience with modern data platforms involving big data technologies, data management solutions, and data virtualization

Well-versed in end-to-end data management philosophies and governance processes

Pre-sales experience and involvement in RFP/RFI/RFQ processes

Creative problem-solver with strong communication skills

Excellent understanding of traditional and distributed computing paradigms

Excellent knowledge of data warehouse / data lake technology and business intelligence concepts

Good knowledge of relational, NoSQL, and big data databases, with the ability to write complex SQL queries

Technical Skills:

Data integration – ETL tools such as Talend and Informatica; ingestion mechanisms such as Flume and Kafka

Data modelling – dimensional and transactional modelling using RDBMS, NoSQL, and big data technologies. Experience in Snowflake modelling would be an advantage

Data visualization – tools such as Tableau, Power BI, and Kibana

Master data management (MDM) – concepts and expertise in tools such as Informatica and Talend MDM

Big data – Hadoop ecosystem; distributions such as Cloudera / Hortonworks; Pig and Hive

Data processing frameworks – Spark and Spark Streaming

Hands-on experience with multiple databases such as PostgreSQL, Snowflake, Oracle, MS SQL Server, and NoSQL stores (HBase / Cassandra, MongoDB) is required

Knowledge of various data modelling techniques, with hands-on experience in data modelling tools such as ERwin, TOAD, PowerDesigner, etc.

Experience with the cloud data ecosystem – AWS, Azure, or GCP

Strong analytical and problem-solving capability

Good understanding of the data ecosystem, including both current and future data trends

Personal Skills

- Preferred to be TOGAF certified.


Education:

Bachelor's degree in Computer Science, Software Engineering, or a related field.
