- Leads in developing, supporting, and implementing data solutions for multiple applications to meet business objectives and user requirements. Leverages technical knowledge and industry experience to design, build, and maintain technology solutions. Leads data ...
- ...understanding of SQL and relational databases (e.g., Oracle, SQL Server, PostgreSQL). Experience working with big data frameworks (e.g., Hadoop, Spark). Familiarity with data warehousing concepts and cloud platforms (AWS, Azure, or GCP). Excellent problem-solving and... [Full time]
- ...with data warehousing technologies like Snowflake, Redshift, or BigQuery. Familiarity with distributed computing frameworks (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure, GCP). Experience with version control systems (e.g., Git) and agile software development...
$150k - $165k
- ...knowledge of cloud platforms (AWS, Azure, GCP) and data warehousing best practices. Proficiency in big data technologies (Spark, Hadoop) and streaming frameworks. Familiarity with data governance, security, and compliance standards. Experience with audience segmentation... [Local area, Flexible hours]
- ...Security & Compliance: implementing security and data protection measures. Preferred skills: knowledge of the Elastic Stack, the Hadoop ecosystem, and data analytics tools like Tableau. Exposure to AI/ML libraries (e.g., TensorFlow, PyTorch, HuggingFace)... [Full time]
- ...Banking domain experience is mandatory. 8-10 years of experience in data engineering and data analysis. Hands-on experience with the Hadoop stack of technologies (Hadoop, PySpark, HBase, Hive, Pig, Sqoop, Scala, Flume, HDFS, MapReduce). Hands-on experience with Python &... [Full time, Work experience placement, Work at office, Relocation]
- ...experience with Google Cloud technologies. Basics of Python. Should have a minimum of 3+ years of hands-on experience with Teradata & Hadoop. Minimum of 2+ years of experience in the banking domain. Strong technical skills in Ab Initio, UNIX shell scripting, SQL (Teradata... [Full time]
- ...Snowflake data platform. Develop ETL processes to support data integration and transformation. Work with tools such as Apache Spark, Hadoop, and Kafka to manage large-scale data operations. Implement robust data warehousing strategies to support business intelligence... [Long term contract, Permanent employment, Contract work, Temporary work]
- Overview: An ETL Developer with hands-on experience in Ab Initio, Google Cloud, Teradata & Hadoop, and banking domain knowledge is sought. The role requires strong data integration, data warehousing, and metadata management skills. Qualifications: Should have minimum... [Full time]
- ...aspects of ETL projects from requirements through implementation. Strong technical skills in Ab Initio, UNIX shell scripting, SQL (Teradata, Hadoop), and other scheduling tools. Hands-on experience with Cloud Storage and cloud-native ETL processing tools. Extensively worked on... [Full time]
$79k - $120k
- ...Independently deliver moderate to more complex analyses and reports. Work with large datasets, using standard tools such as Python, Hadoop, R, SQL, SAS, and Google Cloud, to solve business problems. Analyze and resolve anomalies discovered when using quantitative tools... [Temporary work, Work experience placement, Work at office, Immediate start, Remote work, Work from home, Home office, Flexible hours; $98.1k - $196.2k]
- ...SumoLogic, etc. Relational database design and queries (Postgres, MySQL, or similar). Big data solutions such as Elasticsearch, Hadoop, Apache Spark. Expertise in one of the major cloud providers: AWS, GCP, or Azure. Understanding of cloud security concepts and... [Contract work, Relocation; $63.8k - $205.8k]
- ...with AI-native and cloud platforms (e.g., OpenAI, Anthropic, AWS, Azure, Google Cloud). Knowledge of big data technologies (e.g., Hadoop, Spark). Knowledge of agent-based modeling, reinforcement learning, and autonomous systems is highly desirable. Familiarity... [Live in, Work at office, Local area; $77.97k - $171.06k]
- ...closely with multiple teams and business partners to collect requirements and provide optimal solutions. Proven experience with Hadoop cluster components and services (HDFS, YARN, ZooKeeper, Ambari/Cloudera Manager, Sentry/Ranger, Kerberos, etc.). Ability to... [Work experience placement, Remote work; $99k - $232k]
- ...normalization, OLAP, OLTP, Vault data model, graphs, star & snowflake schemas). Work experience in big data engineering (Hadoop, Spark, Scala, Kafka) and ETL/ELT pipeline development (tools: IICS/AWS Glue/SAP BODS/Matillion/DBT/Ab Initio/SSIS/SnapLogic)... [Full time, Work experience placement, H1b; $77k - $202k]
- ...OLTP, Vault data model, graphs, star & snowflake schemas). Applying knowledge and relevant work experience in big data engineering (Hadoop, Spark, Scala, Kafka) and ETL pipeline development tools (IICS/AWS Glue/Matillion/Ab Initio/SSIS/SnapLogic); preferably in P&... [Full time, Work experience placement, H1b; $107.6k - $198.4k]
- ...understanding of database platforms, including relational and non-relational data engines (MS SQL, MySQL, PostgreSQL, MongoDB, MariaDB, Hadoop, Snowflake, Hive, Spark, BigQuery, Redshift, data warehouses, and similar). 5+ years of experience with data processing (ETL)... [Visa sponsorship; $144k - $329.1k]
- ...advanced analytics needs. Oversee cloud-based data management platforms and technologies such as Databricks, Snowflake, Azure, AWS, and Hadoop for ingestion, storage, and processing. Drive the design and implementation of analytical models, data pipelines, and... [Summer holiday, Work at office, Flexible hours; $125.5k - $230.2k]
- ...advanced analytics needs. Oversee cloud-based data management platforms and technologies such as Databricks, Snowflake, Azure, AWS, and Hadoop for ingestion, storage, and processing. Drive the design and implementation of analytical models, data pipelines, and... [Summer holiday, Work at office, Flexible hours; $107.03k - $250.45k]
- ...environment, but candidates with suitable experience in other industries will be considered. Knowledge of big data technologies (e.g., Hadoop, Spark). Familiarity with relational database concepts and SDLC concepts. Demonstrated critical thinking and the ability to bring... [Remote job, Work experience placement; $106.9k - $176.5k]
- ...analytics needs. Oversee cloud-based data management platforms and technologies such as Databricks, Snowflake, Azure, AWS, and Hadoop for ingestion, storage, and processing. Drive the design and implementation of analytical models, data pipelines, and... [Summer holiday, Work at office, Flexible hours; $77.97k - $155.51k]
- ...Experience with data science techniques and languages like Python and R is an added advantage. 3+ years with Microsoft Azure, AWS, or Hadoop. 3+ years with predictive modeling of healthcare quality data. 3+ years in analysis related to HEDIS rate tracking, medical... [Remote job, Work experience placement, Work at office; $110.5k - $171.93k]
- ...of our employees feel respected, valued, and have an opportunity to contribute to the company's success. As a Data Engineer Senior (Hadoop/Big Data) within PNC's Technology organization on the C&IB Data Management team, you will be based in Pittsburgh, PA, Cleveland, OH... [Full time, Temporary work, Work experience placement; $65k - $165.6k]
- ...requirements and deliver data solutions. Leveraging cloud platforms (e.g., AWS, Azure, Google Cloud) and big data frameworks (e.g., Hadoop, Spark, Kafka) to build robust data architectures. Utilizing DevOps practices and CI/CD tools to automate and streamline data... [Full time, Temporary work, Part time, Work experience placement, Work at office, Remote work; $55k - $114.2k]
- Overview: Data Analyst Senior, Data and Automation (SQL, Cloudera, Hadoop). Based in Strongsville, OH. At PNC, our people are our greatest differentiator and competitive advantage in the markets we serve. We are united in delivering the best experience for our customers... [Full time; $63.25k - $158.7k]
- ...based in a location within the PNC footprint. Job requirements: Strong foundation in Python programming and SQL. Proficient in the Hadoop ecosystem (e.g., HDFS, Hive, Spark). Proficient in the Neo4j ecosystem (e.g., Browser, Cypher, Bloom). Familiarity with version... [Full time, Temporary work, Part time, Work experience placement; $124k - $280k]
- ...schemas). Demonstrated project management experience organizing and leading teams, including big data engineering (Hadoop, Spark, Scala, Kafka) and ETL pipeline development (tools: IICS/AWS Glue/SAP BODS/SSIS/SnapLogic); preferably in P&C insurance data... [Full time, H1b; $102k - $158.7k]
- ...STEM OPT for this position. Needed skills: Extensive experience with big data technologies and distributed computing tools (Hadoop, Spark, Hive, Kafka, etc.). Strong proficiency in relevant programming languages such as Python, PySpark, Spark SQL. Proficient... [Full time, Temporary work, Part time, Work experience placement, Work at office, Remote work, Flexible hours; $65k - $194.35k]
- ...development of detailed analytics that provide actionable insights into key business performance metrics. Demonstrated command of Hadoop, Spark, Python, Kafka, and complex orchestration. Communication and leadership experience over a group of data engineers. Work independently... [Full time, Temporary work, Part time, Work experience placement; $55k - $179.4k]
- ...interfaces such as Angular, APIs, and security. Perform vulnerability remediation to mitigate risk. Proven experience with databases (Oracle, Hadoop, MongoDB, Neo4j). Skills: software architect experience; UI development and interfaces, database, security, and overall design... [Full time, Temporary work, Part time, Work experience placement]