
AWS Data Engineer

Data ERP Sys LLC

This is a Full-time position in Chicago, IL posted February 22, 2021.

Position: AWS Data Engineer
Location: Chicago, IL
Rate: Market

Responsibilities:
- Building, optimizing, and maintaining data pipelines and data architecture on Amazon Web Services
- Formulating a next-generation analytics environment: a self-service, centralized platform for all data-centric activities
- Ingesting data (structured, unstructured, and semi-structured) in a controlled and transparent process
- Designing and developing data APIs for reusability, for reporting, and for moving from batch to real-time data integrations
- Designing and developing data processing/transformation frameworks leveraging open-source tools
- Working with distributed and scalable Big Data storage, processing, and computation
- Building, testing, and implementing advanced analytic business products, including pilots and proofs of concept
- Applying knowledge of current industry trends and techniques to formulate solutions within the context of assigned projects and/or enhancements
- Ensuring customer satisfaction through professional communication, follow-up, and responsiveness to issues
- Building effective relationships with stakeholders

Qualifications: Knowledge and Experience
- 5+ years of professional experience building resilient, scalable, and performant data platform solutions using AWS

- Expert-level understanding of and hands-on experience with data lake fundamentals and building efficient data pipelines
- Work with AWS-native technologies such as EC2, Athena, Glue, Data Pipeline, Lake Formation, Data Lake, Glue Data Catalog, RDS, S3, Lambda, Kinesis, Redshift, API Gateway, CloudWatch, CloudTrail, AWS Config, and the AWS SDK
- Strong experience building a data lake using Spark
- Experience with Agile methodologies
- Experience with relational SQL and NoSQL databases, including Postgres
- Experience with functional, object-oriented, and scripting languages: Python, Java, Scala
- Experience developing and working with RESTful APIs for data transfer
- Experience with git, git-flow, and CI/CD
- Experience with data formats and serialization: JSON, XML, pickle, Avro, Parquet, or Protocol Buffers
- Experience in machine learning pipeline development: hyperparameter tuning, model validation, model serving, and model performance monitoring
- Experience automating AWS service deployment using Terraform scripts
- Experience with reporting tools such as Cognos and Tableau is an added advantage
- Good knowledge of self-service tools such as ThoughtSpot is an added advantage

Tech Stack:
- Hadoop and Informatica tools: PowerCenter, MDM, Big Data Manager, Spark, Kafka, Hive, HBase
- Databases: SQL, NoSQL, Postgres
- AWS cloud services: EC2, EMR, RDS, Redshift, Snowflake
- Stream-processing systems: Spark Streaming, Kafka Streams, Flink
- Reporting tools: Cognos 10+, Tableau 2019
- Self-service tools: ThoughtSpot
– provided by Dice
