Amazon call center Job work from Home | Amazon Job work from Home | Amazon India work from Home
Company Name: Amazon
Salary Package: As per market standards
Work Experience: None required
Job Location: Delhi
Qualification Required: 10th
Language: English/Hindi
Responsibilities: The chosen candidate will be responsible for the following:
- Design, build, test, and maintain reusable data pipelines for data ingestion and processing (see the sketch after this list)
- Design, build, test, and maintain data management systems
- Make data accessible to organizational stakeholders
- Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle all kinds of technical issues with data
- Create new data validation processes and implement analytical tools
- Assemble huge, complicated datasets that adhere to business needs
- Identify and implement internal procedural enhancements
- Create the necessary infrastructure for ETL jobs from a wide range of data sources
- Collect data requirements and maintain metadata about data
- Data security and governance with modern-day security controls
- Data storage with technologies like Hadoop, NoSQL, Amazon S3, etc.
- Data processing with newer tools that help manage data from disparate sources
- Find hidden patterns in data chunks and create models
- Integrate data management processes into the organization's current structure
- Help with third-party integration and develop a robust infrastructure to conduct research and identify automation tasks
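The ingestion and data-validation duties above can be illustrated with a short sketch. The following minimal Python example shows one way a reusable ingest-validate-write step might look; the file names, column names, and validation rule are hypothetical placeholders for illustration, not part of the role description.

```python
import csv
import json
from pathlib import Path


def validate_row(row):
    """Basic validation: required fields present and order_id numeric.
    (Column names here are hypothetical placeholders.)"""
    return row.get("order_id", "").isdigit() and bool(row.get("customer_id"))


def run_pipeline(source_csv: Path, target_jsonl: Path) -> int:
    """Ingest a CSV extract, drop invalid rows, and write JSON Lines
    ready for a downstream load step (e.g. a warehouse staging area)."""
    loaded = 0
    with source_csv.open(newline="") as src, target_jsonl.open("w") as dst:
        for row in csv.DictReader(src):
            if validate_row(row):
                dst.write(json.dumps(row) + "\n")
                loaded += 1
    return loaded


if __name__ == "__main__":
    count = run_pipeline(Path("orders_extract.csv"), Path("orders_clean.jsonl"))
    print(f"Loaded {count} valid rows")
```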
Required Technical and Professional Expertise
Experience:
- Strong implementation experience in building and troubleshooting data pipelines in the Apache/Cloudera/Hortonworks Big Data ecosystem (on-prem / cloud)
- Understanding of infrastructure, including hosting, container-based deployments, and storage architectures
- Architectural and troubleshooting skills on commonly seen technical stacks
- Knowledge of performance optimization on the Hadoop platform
- Deep knowledge of core activities such as cluster setup, upgrades, and operations
- Scripting (Python, Scala, PowerShell, Bash, JSON), automation, setup, installation and configuration, and troubleshooting and fixing issues across platforms (see the sketch below)
- Intermediate or strong knowledge of CI/CD tools like Git, Jenkins, Ansible, and Splunk
- Hands-on experience with GCP/AWS/Azure cloud platforms
- Strong knowledge of and experience with relational and NoSQL databases
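The scripting and troubleshooting expertise listed above often takes the form of small operational checks. The sketch below is a hypothetical Python freshness check on a pipeline's output directory; the directory path and age threshold are assumptions for illustration only, not details from the posting.

```python
import sys
import time
from pathlib import Path

# Hypothetical output location and freshness threshold, for illustration only.
OUTPUT_DIR = Path("/data/pipelines/orders")
MAX_AGE_SECONDS = 6 * 3600  # alert if the newest output file is older than 6 hours


def latest_output_age(output_dir: Path) -> float:
    """Return the age in seconds of the most recently modified file,
    or raise FileNotFoundError if the directory contains no files."""
    files = [p for p in output_dir.rglob("*") if p.is_file()]
    if not files:
        raise FileNotFoundError(f"no output files under {output_dir}")
    newest = max(p.stat().st_mtime for p in files)
    return time.time() - newest


if __name__ == "__main__":
    try:
        age = latest_output_age(OUTPUT_DIR)
    except FileNotFoundError as exc:
        print(f"ALERT: {exc}")
        sys.exit(2)
    if age > MAX_AGE_SECONDS:
        print(f"ALERT: latest output is {age / 3600:.1f} hours old")
        sys.exit(1)
    print("OK: pipeline output is fresh")
```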
Preferred Technical and Professional Experience
Education and Certifications (a combination of the following):
- College degree in Computer Science / Data Science
- Google Professional Data Engineer Certification
- Azure Data Engineer Certification (Associate exams DP-200/DP-201/DP-203)
- AWS Certified Developer - Associate Certification
- AWS Certified Data Analytics - Specialty Certification (DAS-C01)
Professional Attributes:
- Self-starter who thrives in an ever-changing, fast-paced business environment
- Adaptable to rapid changes in technology
- Familiar with Agile and Design Thinking methodologies
- Familiar with DevSecOps and CI/CD pipelines
- Able to collaborate across different cultures and different technical domains
- Strong communicator with excellent spoken and written communication skills
Required Education
Bachelor’s Degree
Preferred Education
Master’s Degree
- IT Software: Software Products & Services
- IT-Hardware/Networking, Telecom
- Software Engineer
- Any Graduate
- Full Time