Data Architect

Devoteam
9-12 years
Not Specified

Job Description


You will be responsible for designing and optimizing big data and data warehouse architecture, as well as optimizing data flows and pipelines for cross-functional teams. You are a technical guru when it comes to selecting the right tools for data ingestion, processing, and storage. Security, performance, scalability, availability, accessibility, and maintainability are your top priorities when designing data solutions. You have deep, broad, hands-on experience with technologies across the Hadoop ecosystem, NoSQL, RDBMSs, ingestion, and processing.
Responsibilities:

  • Provide thought leadership and drive architecture and the design of big data and data warehousing solutions.

  • Clearly articulate the pros and cons of various technologies and platforms.

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

  • Design the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.

  • Document use cases, solutions, and recommendations.

  • Work with various stakeholders, from program and project managers to solution and enterprise architects, on the design, planning, and governance of data projects.

  • Design strategies to keep data separated and secure across national boundaries, spanning multiple data centers and regions.

  • Identify ways to improve data reliability, efficiency, and quality.

  • Design a strategy to ensure data availability, accessibility, scalability, and maintainability.

  • Perform detailed analysis of business problems and technical environments, and apply this analysis when designing solutions.

  • Explore new opportunities introduced by emerging technologies and initiatives.

  • Benchmark implemented solutions against alternatives from other vendors.

  • Build team capabilities, guide the technical team, and help members progress in their careers.

QUALIFICATIONS

  • 9-12 years of experience in data warehousing and big data projects.

  • Deep and broad experience with the Hadoop ecosystem, including HDFS, MapReduce, Hive, HBase, Impala, Kudu, Solr, etc.

  • Hands-on experience with multiple NoSQL databases such as Cassandra, MongoDB, and Neo4j, as well as Elasticsearch and the ELK Stack.

  • Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc.

  • Experience with real-time messaging platforms like Kafka, Kinesis, etc.

  • Advanced working knowledge of SQL and query authoring, experience with relational databases, and working familiarity with a variety of databases, including distributed relational databases like SingleStore and Vitess.

  • Experience building and optimizing big data pipelines.

  • Strong analytical skills for working with unstructured datasets.

  • Experience with object stores like MinIO and Ceph.

  • Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.

  • Proven track record of building highly available, always-on data platforms.

  • Proficiency in Linux shell scripting.

  • Programming languages: Python, Java, Scala, etc.

  • Fluency in English and Arabic.
