Job vacancies issued by Al-Shahama Company for the Recruitment of Jordanian Talent

1/

Job Purpose (Summary Of The Role)
The Big Data Architect performs various functions, primarily providing the foundational framework for Big Data analytics.
They are responsible for engaging stakeholders to understand their objectives for Big Data and for using the information gathered to plan the computing framework: appropriate hardware and software, data sources and formats, analytical tools, data storage decisions, and results consumption.
Requirement Specifications
Total years of experience: 12
Relevant years of experience: 8
Knowledge & Technical Skills Required: The primary tasks, duties, and responsibilities that make up the Big Data Architect role are listed below:
• Provide top-quality solution design and execution
• Provide support in defining the scope and sizing of work
• Align the organization’s big data solutions with client initiatives as requested
• Engage with clients to understand strategic requirements
• Responsible for translating business requirements into technology solutions
• Work with domain experts to put together a delivery plan and stay on track
• Utilize Big Data technologies to design, develop, and evolve scalable and fault-tolerant distributed components
• Organize all meetings with customers and ensure prompt resolution of gaps and roadblocks
• Stay current on the latest technology to ensure maximum ROI.
• Responsible for the design and execution of abstractions and integration patterns (APIs) to solve complex distributed computing problems (a minimal sketch follows this list).
• Experience in Requirements Engineering, Solution Architecture, Design, Development and Deployment
• A broad set of technical skills and knowledge across hardware, software, systems, and solutions development, spanning more than one technical domain
• Strong technical team leadership, mentorship and collaboration.
• Experience in service architecture and development, with a focus on high performance and scalability.
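To make the abstraction and integration-pattern bullet above concrete, here is a minimal sketch of the idea: pipeline logic written against a small sink interface so that concrete back ends can be swapped without touching the pipeline. It is an illustration only, not part of the posting; the names Sink, InMemorySink, JsonLinesSink, and run_pipeline are hypothetical.

# Minimal sketch: pipeline code depends only on a small Sink abstraction,
# so storage back ends can be swapped without changing the transform logic.
from abc import ABC, abstractmethod
import json


class Sink(ABC):
    """Abstraction that hides where processed records end up."""

    @abstractmethod
    def write(self, records: list[dict]) -> None:
        ...


class InMemorySink(Sink):
    """Keeps records in memory; useful for tests."""

    def __init__(self) -> None:
        self.records: list[dict] = []

    def write(self, records: list[dict]) -> None:
        self.records.extend(records)


class JsonLinesSink(Sink):
    """Appends records to a JSON-lines file; stands in for a real store."""

    def __init__(self, path: str) -> None:
        self.path = path

    def write(self, records: list[dict]) -> None:
        with open(self.path, "a", encoding="utf-8") as fh:
            for record in records:
                fh.write(json.dumps(record) + "\n")


def run_pipeline(raw_rows: list[dict], sink: Sink) -> None:
    """Tiny transform step that only depends on the Sink abstraction."""
    cleaned = [row for row in raw_rows if row.get("value") is not None]
    sink.write(cleaned)


if __name__ == "__main__":
    sink = InMemorySink()
    run_pipeline([{"id": 1, "value": 10}, {"id": 2, "value": None}], sink)
    print(sink.records)  # -> [{'id': 1, 'value': 10}]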

Certification specifications if any
Educational Qualifications: Computer Science

Email: [email protected]

————————————————————————————

2/

Job Purpose (Summary Of The Role)
We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.
Requirement Specifications
Total years of experience: 8
Relevant years of experience: 6
Knowledge & Technical Skills Required:
• Graduate degree in Computer Science, Information Systems, or an equivalent quantitative field, and 6+ years of experience.
• Experience working with and extracting value from large, disconnected and/or unstructured datasets.
• Experience selecting and integrating the Big Data tools and frameworks required to provide requested capabilities.
• Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency and workload management.
• Strong interpersonal skills and ability to project manage and work with cross-functional teams.
• Advanced working SQL knowledge and experience with relational databases, including query authoring (SQL) and working familiarity with a variety of databases.
• Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Experience with integration of data from multiple data sources
• Experience with the following tools and technologies:
• Cloudera, Hadoop, Spark Core, Spark SQL, Scala, Hive, Sqoop, Kafka
• Data ingestion: Apache NiFi
• Big Data querying tools: Pig, Hive, HBase, Impala, YARN, and HDFS
• Zookeeper, Ranger, Atlas, Kudu
• Relational SQL and NoSQL databases
• Data pipeline/workflow management tools such as Azkaban and Airflow
• Stream-processing systems such as Storm and Spark Streaming (a minimal ingestion sketch follows this list)
• API integration with big data platforms
• Object-oriented/functional scripting languages such as Python, Java, C++, etc.
• Experience with ETL processes
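As a minimal sketch of the ingestion and stream-processing skills listed above, the following PySpark Structured Streaming job reads JSON events from Kafka, applies a simple transformation, and writes Parquet files to HDFS. It assumes a Spark deployment with the spark-sql-kafka connector available; the broker address, topic name, schema, and paths are hypothetical and not part of the posting.

# Minimal PySpark Structured Streaming sketch: ingest JSON events from Kafka,
# apply a simple transformation, and persist them to HDFS as Parquet.
# Broker, topic, schema, and paths below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StringType, LongType

spark = SparkSession.builder.appName("kafka-to-hdfs").getOrCreate()

schema = (StructType()
          .add("user_id", StringType())
          .add("event_type", StringType())
          .add("ts", LongType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
       .option("subscribe", "events")                      # hypothetical topic
       .load())

events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*")
             .filter(col("event_type").isNotNull()))

query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/events")            # hypothetical output path
         .option("checkpointLocation", "hdfs:///chk/events")
         .start())

query.awaitTermination()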
Certification specifications if any
Educational Qualifications: Computer Science

Email: [email protected]

——————————————————————————

3/

Job Purpose (Summary Of The Role)
This role is responsible for leading go-to-market analytics efforts and must deliver key projects and dashboards that drive effectiveness. It requires balancing and influencing multiple competing priorities and individuals.
Requirement Specifications
Total years of experience: 8
Relevant years of experience: 4
Knowledge & Technical Skills Required:
• 4–6 years of database design experience in Teradata.
• SQL and Database experience with a major RDBMS (for example Teradata)
• Hands on experience in SQL, PL/SQL routines development
• Telecom domain knowledge
• Good communication skills (both written and oral); communicates equally well with business people and technical team members.
• Create a high-level common semantic layer architecture based on customer expectations, review it with client-side SMEs, and baseline the high-level model.
• Own the end-to-end semantic modelling work
• Analyze business requirements to determine the data elements needed in the Semantic Layer
• Create the necessary access-layer objects for facts and dimensions to meet the business requirements
• Should have good knowledge of Star Schema Design.
• Must have good knowledge of OLAP queries, aggregates, and the use of analytical functions in SQL queries (a minimal example follows this list)
• Good understanding of at least one BI tool, such as MicroStrategy
• Interact with business SMEs to understand the required data elements and the business rules governing them, and confirm definitions
• Create, manage and modify conceptual, logical and physical data models for Semantic Layers. Present data models to business and technical audiences for review and validation
• Apply views, indexes and tables as required to meet the defined performance service levels (SLAs) for application access of the data warehouse
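As a minimal illustration of the star-schema and analytical-function bullets above, the following self-contained snippet builds a tiny fact table with two dimensions and runs an OLAP-style aggregate together with a window function. SQLite stands in for Teradata here purely for illustration, and all table and column names are hypothetical.

# Minimal star-schema illustration: one fact table joined to two dimension
# tables, aggregated by dimension attributes, then ranked with an analytical
# (window) function. SQLite 3.25+ is needed for window-function support.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: fact_sales keyed to dim_product and dim_region (hypothetical names).
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE dim_region  (region_id  INTEGER PRIMARY KEY, region_name  TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, region_id INTEGER, amount REAL);

INSERT INTO dim_product VALUES (1, 'Broadband'), (2, 'Mobile');
INSERT INTO dim_region  VALUES (1, 'North'), (2, 'South');
INSERT INTO fact_sales  VALUES (1, 1, 100.0), (1, 2, 250.0), (2, 1, 80.0), (2, 2, 120.0);
""")

# Aggregate by dimension attributes, then rank regions within each product
# using an analytical function over the aggregated totals.
query = """
SELECT p.product_name,
       r.region_name,
       SUM(f.amount) AS total_sales,
       RANK() OVER (PARTITION BY p.product_name
                    ORDER BY SUM(f.amount) DESC) AS region_rank
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_region  r ON r.region_id  = f.region_id
GROUP BY p.product_name, r.region_name;
"""

for row in cur.execute(query):
    print(row)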

Certification specifications if any
Educational Qualifications: Bachelor’s Degree

Email: [email protected]

———————————————————————————–
