Role Description: Develop analytics-based solutions that produce quantitative and qualitative business insights. Work with partners as necessary to integrate systems and data quickly and effectively, regardless of technical challenges or business environments.
Must Have Skills: Hadoop
Good to Have Skills: Spark Programming, AWS Architect
Job Requirements:
Role: Data Architect
Skills: Hadoop, AWS, Spark
Must Have:
1. Experience building highly scalable, highly available Big Data applications with Hadoop, Hive, Spark, and Presto
2. API Gateway and building APIs with Lambda; AWS Firehose and Kinesis Analytics; AWS Glue and managed ETL
3. Coding experience in PySpark; creating monitoring through CloudWatch; experience building CI/CD pipelines with Git and associated tools
4. Experience building highly scalable, highly available, cloud-native applications on AWS
Good to Have:
1. Working knowledge of CloudFormation to create Infrastructure as Code; working knowledge of managing AWS account cost optimization, security, and access control
2. Working knowledge of setting up and managing Linux servers on Amazon EC2, along with EBS, ELB, SSL, Security Groups, RDS, and IAM; knowledge of VPC, Subnets, and Amazon CloudFront
3. Certifications: AWS Developer certification preferred; AWS Solutions Architect certification