
Company Info
Plano, TX, United States


Digital Data Architect


Location: Phoenix, AZ, United States


AWS - Amazon Web Services, Big Data Architect, Hadoop Admin, Java Developer, Machine Learning, MapReduce, NoSQL Database, Pentaho, Platform Engineer, Predictive Analytics

Job Description:

  • Experience with Big Data technologies, assisting clients in building software solutions that are distributed, highly scalable, and span multiple data centers
  • Hands-on experience architecting Big Data applications using Hadoop technologies such as Spark, MapReduce, YARN, HDFS, Hive, Impala, Pig, Sqoop, Oozie, HBase, Elasticsearch, and Cassandra
  • Experience working with Business Intelligence teams, Data Integration developers, Data Scientists, Analysts, and DBAs to deliver a well-architected and scalable Big Data & Analytics ecosystem
  • Experience working with NoSQL databases and search engines such as MongoDB, Cassandra, and Elasticsearch
  • Experience using Neo4j or an understanding of graph databases
  • Strong experience with event stream processing technologies such as Spark Streaming, Storm, Akka, and Kafka
  • Experience with at least one programming language (Java, Scala, Python)
  • Extensive experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR)
  • Proven track record of architecting distributed solutions that handle very high data volumes (petabytes)
  • Experience with Artificial Intelligence and Machine Learning technologies (H2O, TensorFlow, IBM Watson, etc.)

Job Requirements:

  • A willingness to be flexible in responding to issues as they occur, and the ability to identify product/deployment improvements to mitigate future occurrences
  • Self-motivated, with strong communication and team-building skills
  • Work iteratively in a team with continuous collaboration
  • Your people responsibilities: as a leader, you will foster teamwork, lead by example, and participate in organization-wide people initiatives
  • Ability to travel in accordance with client and other job requirements
  • Excellent written and oral communication skills; writing, publishing, and conference-level presentation skills are a plus

We are looking for candidates with the following:

  • BE/BTech in Computer Science or MCA, with sound industry experience (10+ years) in various roles
  • A minimum of 5 years' experience in technology or a similar role
  • 2+ years' experience deploying data models in a production setting
  • Experience in Java, Python, R, or Scala (production-level coding)
  • Capability to architect highly scalable distributed data pipelines using open-source tools and big data technologies such as Hadoop, HBase, Spark, Storm, ELK, etc.
