Data Engineer jobs at MNC companies:
1. HP, Bengaluru, Karnataka, India:
- Meeting with managers to determine the company’s Big Data needs.
- Developing Hadoop systems.
- Loading disparate data sets and conducting pre-processing services using Spark, Hive or Pig.
- Finalizing the scope of the system and delivering Big Data solutions.
- Managing the communications between the internal system and the vendor.
- Collaborating with the software research and development teams.
- Building cloud platforms for the development of company applications.
- Maintaining production systems.
- Training staff on data management.
- Bachelor’s degree in computer engineering or computer science.
- Previous experience as a big data engineer.
- In-depth knowledge of Hadoop, Spark, and similar frameworks.
- Knowledge of scripting languages is preferred.
- Knowledge of NoSQL and RDBMS databases including Redis and MongoDB.
- Familiarity with Mesos, AWS, and Docker tools.
- Excellent project management skills.
- Good communication skills.
- Ability to solve complex data and software issues.
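The "loading disparate data sets and conducting pre-processing" responsibility above can be sketched in miniature. This is a hedged, pure-Python stand-in for what a Spark, Hive, or Pig job would do at scale; the sample inputs and the `load_disparate`/`preprocess` helpers are illustrative, not part of any listed role.

```python
import csv
import io
import json

# Hypothetical sample inputs standing in for two disparate sources
# (in Spark this would be spark.read.csv / spark.read.json at scale).
csv_rows = "id,name,signup\n1,Ada ,2021-01-05\n2,Grace,2021-02-11\n"
json_rows = '[{"id": 3, "name": "alan", "signup": "2021-03-02"}]'

def load_disparate(csv_text, json_text):
    """Load two differently formatted sources into one list of dicts."""
    records = list(csv.DictReader(io.StringIO(csv_text)))
    records += json.loads(json_text)
    return records

def preprocess(records):
    """Pre-processing: trim whitespace, normalise names, cast ids to int."""
    cleaned = []
    for r in records:
        cleaned.append({
            "id": int(r["id"]),
            "name": str(r["name"]).strip().title(),
            "signup": r["signup"],
        })
    return cleaned

rows = preprocess(load_disparate(csv_rows, json_rows))
print(rows[0]["name"], len(rows))  # → Ada 3
```

The same shape (read heterogeneous sources, then apply a cleaning transform) is what the distributed frameworks parallelise across a cluster.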
2. IBM:
- In this role, you'll work in our IBM Client Innovation Center (CIC), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. These centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
- You will be involved in the design of data solutions using Hadoop-based technologies, along with Azure HDInsight for a Cloudera-based data lake, using Scala programming.
- Ingest data from files, streams, and databases; process the data with Hadoop, Scala, SQL databases, Spark, ML, and IoT technologies
- Develop programs in Scala and Python as part of data cleaning and processing
- Design and develop distributed, high-volume, high-velocity, multi-threaded event processing systems
- Develop efficient software code leveraging Python and Big Data technologies for the various use cases built on the platform
- Provide high operational excellence, guaranteeing high availability and platform stability
- Implement scalable solutions to meet ever-increasing data volumes, using big data and cloud technologies such as PySpark and Kafka on any cloud computing platform
- If you thrive in a dynamic, collaborative workplace, IBM provides an environment where you will be challenged and inspired every single day. And if you relish the freedom to bring creative, thoughtful solutions to the table, there's no limit to what you can accomplish here.
- 7+ years of experience in Big Data technologies, with in-depth experience in modern data platform components such as Hadoop, Hive, Pig, Spark, Python, and Scala
- Proficient in any of the programming languages – Python, Scala or Java
- Experience with distributed version control systems such as Git
- Demonstrated experience in modern API platform design, including how modern UIs are built consuming services/APIs
- Solid experience in all phases of Software Development Lifecycle - plan, design, develop, test, release, maintain and support, decommission
- Familiarity with development tools: experience with the IntelliJ, Eclipse, or VS Code IDEs and the Maven build tool
- Experience on Azure cloud including Data Factory, Databricks, Data Lake Storage is highly preferred.
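The "multi-threaded event processing" bullet above can be illustrated with a minimal sketch. This is a hedged, single-process toy using Python's standard library, not IBM's stack; the event names and the uppercase transform are invented for illustration, and a real system would distribute this over Spark or Kafka consumers.

```python
import queue
import threading

events = queue.Queue()          # shared work queue of incoming events
results = []
results_lock = threading.Lock() # protects the shared results list

def process(worker_id):
    """Worker loop: pull events until a None sentinel arrives."""
    while True:
        item = events.get()
        if item is None:        # sentinel: shut this worker down
            events.task_done()
            break
        with results_lock:      # the "business logic" is just .upper()
            results.append(item.upper())
        events.task_done()

workers = [threading.Thread(target=process, args=(i,)) for i in range(4)]
for w in workers:
    w.start()

for payload in ["click", "view", "purchase"]:
    events.put(payload)
for _ in workers:               # one sentinel per worker
    events.put(None)
for w in workers:
    w.join()

print(sorted(results))  # → ['CLICK', 'PURCHASE', 'VIEW']
```

The queue-plus-sentinel pattern is the core of most event pipelines: producers enqueue, workers dequeue concurrently, and shared state is guarded by a lock.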
3. Uplers:
Profile: Data Engineer
Experience: 3+ Years
Location: Permanent Remote
What is Uplers Talent Network?
Uplers Talent Network is a place where top talent meets the right opportunities. It is a platform for candidates looking to work with global companies on a contractual basis, giving top Indian talent access to global career exposure.
With us, you'll get the support, guidance, and opportunities that you need to take your career to the next level. So, if you're ready to embark on the journey of your next challenge, we're ready to be your engine!
Contractual Position
A contractual position usually requires you to sign and agree to the terms of a contract before you begin working. This structure can offer a variety of commitments that allow you to refine established skills and create new ones.
Uplers Talent Network brings contractual positions with benefits like:
✔ Higher pay than industry standards
✔ Full-time position
✔ Ability to gain different skills in a short period
✔ Control over your career
Perks of joining Uplers Talent Network:
- Talent Success Coach: Get connected with a dedicated coach to guide you before, during, and after your assignments with our clients.
- Payout: Get paid in global currencies and earn more than industry standards.
- Opportunity: Work with international companies and get global exposure with exciting projects.
- Mobility: Work from the comfort of your living room couch or even a breezy beach.
How to become a part of our Talent Network?
- Take the first step, register on our portal
- Clear the decks and fill out the application form
- Gear up, clear the 3-stage assessment process
- And yes! Become a part of Uplers Certified Talent Network
Requirements:
- 3+ years of experience in a Data Engineer role
- Experience with big data tools: Hadoop, Spark, Kafka, etc
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc
- Experience with object-oriented/functional scripting languages: Python, R, Java, C++, Scala, Go, etc
- Extensive knowledge of data engineering tools, technologies, and approaches
- Hands-on experience with ETL tools
- Knowledge of the design and operation of robust distributed systems
- BI tools knowledge
- Experience with data warehousing: SQL/NoSQL, Amazon Redshift, Panoply, Oracle, Talend, Informatica, Apache Hive, etc
- Bachelor's degree in data engineering, big data analytics, computer engineering, or a related field
- Knowledge of multiple data technologies and concepts
- Set up and leverage models and entities in a data lake environment
- Build data pipelines on the data lake platform, including sourcing information and writing logic for ingestion, enrichment, and business reporting
- Integrate data pipelines with analytical and reporting tools, leveraging AtScale/Dremio and Tableau
- Participate in design discussions and code reviews
- Engage with users to understand business requirements and address any reported issues
- Advanced knowledge of SQL and query optimization concepts (TSQL and/or PL/SQL)
- Well versed in data analysis and data modeling
- Advanced knowledge of application, data, and infrastructure architecture disciplines
- Unix shell scripting and/or Windows scripting (PowerShell, Perl, Batch scripts)
- Strong experience with relational enterprise databases (Oracle and/or SQL Server)
- Applying process automation design principles and patterns
- Creating/maintaining ETL processes
- Good understanding of Change management process (ServiceNow)
- Knowledge of industry-wide technology trends and best practices
- Passionate about building an innovative culture
- Ability to work in large, collaborative teams to achieve organizational goals
- B.S. or M.S. in computer science, information systems, math, business, or engineering
- Proficiency in one or more modern programming languages (Java, Python, Spark)
- Experience in migrating data workflows on-premises to public cloud
- AWS knowledge/certification (huge plus)
- Experience with scheduling tools like Autosys or Control-M
- Familiarity with API development
- BI Analytical experience (Tableau, Alteryx)
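The workflow-management tools named in the requirements (Azkaban, Luigi, Airflow) all share one core idea: tasks declare their upstream dependencies and only run once those complete. A hedged, dependency-free sketch of that idea, with made-up task names, resolved via a simple topological sort:

```python
# Each task maps to the list of tasks it depends on (its "upstreams"),
# the way an Airflow DAG wires operators together.
deps = {
    "extract": [],
    "clean":   ["extract"],
    "load":    ["clean"],
    "report":  ["load", "clean"],
}

def topo_order(graph):
    """Return tasks in an order that respects every dependency."""
    order, seen = [], set()

    def visit(task):
        if task in seen:
            return
        seen.add(task)
        for upstream in graph[task]:
            visit(upstream)      # run dependencies first
        order.append(task)

    for task in graph:
        visit(task)
    return order

run_order = topo_order(deps)
print(run_order)  # dependencies always precede dependents
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top of exactly this ordering guarantee.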