Big Data Solutions Architect
The Big Data Solutions Architect will play a critical role in defining the features of Big Data capabilities such as Java, Hadoop, and NoSQL databases, while also leveraging traditional data warehouse technologies where applicable.
Primary Responsibilities:
• Architect solutions for key business initiatives ensuring alignment with future state analytics architecture vision
• Work closely with the project teams as outlined in the SDLC engagement model to provide guidance in implementing solutions at various stages of projects
• Engage constructively with project teams to support project objectives through the application of sound architectural principles
• Develop the proposed solution architecture and validate that it supports the stated and implied business requirements of the project
• Review technical team deliverables for compliance with architecture standards and guidelines
• Adopt innovative architectural approaches to leverage in-house data integration capabilities consistent with architectural goals of the enterprise
• Create architectural designs for different stakeholders that provide a conceptual definition of the information processing needs for the delivery project
• Provide technical and architectural subject matter expertise on the Hortonworks Data Platform (data management: Hadoop; data access: HBase, Hive/Pig, Spark, Storm; integration: Kafka, Flume)
• Execute engagements as hands on Technical Architect and provide technical expertise for Big Data programs
• Provide right-fit solutions for a variety of data and analytics requirements using Big Data technologies in the Insurance/Healthcare space
• Provide expertise in defining Big Data Technical Architecture and design using Hadoop, NoSQL and Visualization tools/platforms
• Work at the programming/code level and provide hands-on expertise in the technical features of Big Data tools/platforms
• Lead a team of designers/developers and guide them throughout the system implementation life cycle
• Engage client Architects, Business SMEs and other stakeholders during Architecture, Design and implementation phases
Preferred Qualifications:
• Experience with Hadoop, NoSQL, and stream-processing Big Data tools
• Experience with real-time processing technologies such as Spark, Kafka, and Storm
• Experience with ETL
• Experience with Big Data Architecture
• Experience implementing one or more Big Data platforms from Cloudera, Hortonworks, or IBM BigInsights, and NoSQL databases such as MongoDB
• Experience with Data Marts/Data Warehouses
• Experience with metadata tools and techniques
• Experience with cloud-based Hadoop stacks (Azure HDInsight, AWS, etc.) and .NET technologies