- ...off & paid holidays, Paid Parental Leave (maternity & paternity), Educational Assistance Program, Dress for your day. Did we spark your interest? Build your future with us and apply! HR Contact: Katherine Therese Hennessy. BUILDING INCLUSIVE WORKSPACES At... (Suggested: Work experience placement, Work at office, Local area, Flexible hours, Shift work)
- ...years of experience building data pipelines in cloud environments; 4+ years of experience with Big Data technologies (e.g., Spark, Hadoop) and cloud architecture; 3+ years of experience with reporting and analytics tools (e.g., Tableau, Power BI); hands... (Suggested: Long term contract)
- ...To Expect: As a Tesla Construction Technician, you will play a crucial role in new construction and building improvements at the Sparks Factory and surrounding facilities. You will be responsible for safety and general labor duties, including completing and documenting safety... (Suggested: Hourly pay, Full time, Temporary work, Flexible hours)
- ...opening for an L2 Data Center Technician; please go through the details and kindly send me your updated resume. Location: Onsite, Sparks, NV. Type of Hire: Contract. Mode of interview: WebEx / Teams. What are the top 3 skills required for this role?... (Suggested: Contract work, Remote work, Relocation)
- ...Location: Sparks, NV. Date: 10/01/2025. Wage: $60k-$90k/year. Safety Professional: Apollo Mechanical Contractors has an opening for a Safety Professional based out of Reno, Nevada. This position will coordinate and administer Site Safety Programs for compliance with... (Suggested: For contractors, Work experience placement, Local area, Shift work)
- ...and SQL; hands-on experience with AWS services (S3, Glue, Redshift, Lambda, EMR); experience with big data tools like Apache Spark, Hadoop, or Kafka; knowledge of Machine Learning concepts and model lifecycle; experience with ETL/ELT frameworks and data... (Suggested: Remote work)
- ...equivalent): AWS or Azure Data Platform Services, Postgres/Oracle/DB2, Collibra, Databricks, Delta Lake, Python, Snowflake, ETL tools (Spark, etc.), CI/CD pipelines supporting a Data Lakehouse. Expertise in real-time and batch data ingestion architectures (Kafka/Event... (Suggested: Hourly pay, Contract work)
- ...Proficient SQL; relational (PostgreSQL, MySQL) and NoSQL (MongoDB) experience. Cloud & Big Data: AWS/Azure/Google Cloud Platform, Spark, Hadoop, scalable storage (S3, Blob, HDFS). Ready to apply? Take the next step in your data engineering career with this exciting... (Suggested: Contract work)
- ..., improving database performance, and developing quality audits. Proficient with the Azure stack, including Synapse Analytics, Spark/Python, Azure SQL, Azure Data Factory (ADF), and Kusto, among others. Knowledgeable in ETL tools, particularly Azure Data Services,... (Suggested: Contract work)
$75 - $80 per hour
...Qualifications: Knowledge of MLOps practices and ML pipelines; experience with data platforms such as Snowflake, Databricks, or Spark; familiarity with AI frameworks like TensorFlow or PyTorch; cloud certifications such as AWS Certified Solutions Architect... (Suggested: Full time, Contract work, Temporary work, Work experience placement, Immediate start, Worldwide, Flexible hours)
- ...Cleansing, deduplication, parsing, and merging of high-volume datasets; parsing EBCDIC/COBOL-formatted VSAM files using the Spark-Cobol Library; connecting to Db2 databases using JDBC drivers for ingestion. For applications and inquiries, contact: hirings... (Suggested: Remote work)
- ...Must be local to Reston, no relocation; on-site 3 days a week. Top 5 Technical Skills: Python (big data pipeline), AWS, Hadoop/Spark/Hive, EMR, Terraform. Job Description: Strong Python development to build a big-data pipeline for data processing and analysis. Need strong... (Suggested: Contract work, Work experience placement, Local area, 3 days per week)
- ...transformation, and data reliability. Real-time systems: Build streaming solutions using Kafka or Azure Event Hubs; use Spark Structured Streaming for high-volume data processing. API and integration: Lead integrations using Spring Boot, Dell Boomi,...
- ...resume and contact details. Core Technical Skills: SQL Server, SQL Server Integration Services, Azure Synapse, Spark, Microsoft Fabric. Required Skills & Experience: Bachelor's or Master's degree in computer science, information... (Suggested: Contract work, Work at office, Remote work)
- ...platform engineer to design and build a containerized API layer that abstracts and governs interactions with engines such as Apache Spark through a well-defined API contract. This role focuses on building platform capabilities, not simply consuming existing data tools, enabling... (Suggested: Contract work)
- ...Mobile, etc.). Preferred qualifications: Experience with streaming APIs like Kafka; understanding of Big Data/Data Lake technologies (Spark, Hadoop, Databricks, etc.); understanding of design patterns and clean coding; understanding of technical aspects of Analytic... (Suggested: Local area)
- ...Petabyte-level data systems. Experience with cloud-native data tools and architectures (e.g., Redshift, Glue, Airflow, Apache Spark). Proficient in automated testing frameworks (PyTest, Playwright, or Jest) and testing best practices. Experience developing... (Suggested: Long term contract, Remote work)
- ...Google Cloud Platform); familiarity with REST APIs and microservices architecture; experience with data processing tools (Spark, Pandas, etc.). Preferred Qualifications: Experience working in the data security / cybersecurity domain; knowledge of MLOps... (Suggested: Contract work)
- ...7+ years of experience in Software Engineering; 4+ years building big data pipelines; 4+ years of experience with Apache Spark (PySpark / Spark SQL), Hive and Iceberg tables, and SQL / SQL Server or other RDBMS. Strong programming experience in Python, PySpark... (Suggested: Contract work, Remote work)
- ...Responsibilities: Expertise in big data processing, Core Java, and Apache Spark, particularly within the finance domain. Should have strong experience working with financial instruments, market risk, and large-scale distributed computing systems. Develop...
- ...related discipline. Minimum of 5+ years of relevant industry experience. Minimum 5 years of experience developing in the Hadoop ecosystem (Spark, PySpark, MapReduce, Hive, Impala). Minimum 5 years of experience with common application frameworks (JEE, Spring Boot, Struts,... (Suggested: Remote work)
- ...generation, and intelligent automation. - Work with large structured and unstructured datasets using tools like Pandas, NumPy, and Spark. - Implement models in production environments using Python, TensorFlow, PyTorch, or scikit-learn. - Conduct exploratory...
- ...develop scalable data transformation pipelines using DBT Cloud; architect and implement Databricks-based data solutions (Delta Lake, Spark); build and optimize data models (star/snowflake schemas) for analytics; develop ETL/ELT pipelines using the modern data stack...
- ...specifically AWS. Preferred qualifications: Experience with MLOps tools (e.g., MLflow, Kubeflow). Knowledge of big data technologies (Spark, Hadoop, Databricks). Background in NLP, computer vision, or other advanced AI techniques. Relevant certifications (Coursera, edX,...
- ...of experience designing, implementing, and maintaining Python, Microservice, Spring Boot, Spring Cloud, RESTful services, Kafka, Swagger, Spark, MongoDB, Flink, Ab Initio, Docker, and Kubernetes technologies. Strategize and design data architecture for specific business problems... (Suggested: Permanent employment, Full time, Flexible hours)
- ...Preferred: Experience working with data pipelines integrated with machine learning models. Exposure to big data technologies such as Spark, Kafka, Flink, or similar. Familiarity with supervised machine learning, labelled datasets, and feedback loop systems. Experience...
- ...tracking, model packaging, and deployment. Advanced experience with PySpark and distributed data processing. Experience with AWS EMR for Spark cluster management and large-scale data transformations. Solid understanding of MLOps concepts: CI/CD for ML, feature stores,...
- ...scikit-learn, NLTK, Azure ML (optional), Amazon Web Services EC2. Experience with scalable data engineering frameworks such as Apache Spark and orchestration frameworks such as Airflow, and/or experience with semantic search. Expert knowledge in conducting data analysis... (Suggested: Contract work, Work experience placement)
- ...Years of Exp.: 8+ years. Skills: 8+ years of AI/ML; ML algorithms; big data; SQL; testing; tuning; Python, Java, or R; ML frameworks (Spark, MATLAB, Databricks, TensorFlow, or scikit-learn); chatbots; NLP; image/data classification; sentiment analysis; regression analysis... (Suggested: Local area, Remote work)