76 Big Data Job Listings in Mexico
Senior (Big) Data Engineer
Posted 6 days ago
Job Description
Key responsibilities
+ Architect, design, and optimize scalable big data solutions for batch and real-time processing.
+ Develop and maintain ETL/ELT pipelines to ingest, transform, and synchronize data from diverse sources.
+ Integrate data from cloud applications, on-prem systems, APIs, and streaming workspaces into centralized data repositories.
+ Implement and manage **data lake** and **data warehouse** solutions on cloud infrastructure.
+ Ensure **data consistency, quality, and compliance** with governance and security standards.
+ Collaborate with data architects, data engineers, and business stakeholders to align integration solutions with organizational needs.
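The ETL/ELT responsibilities above can be sketched in miniature. The following is an illustrative, in-memory sketch only, not part of the posting: the source names (`crm`, `erp`) and fields (`customer_id`, `email`) are hypothetical stand-ins for real systems.

```python
# Minimal in-memory ETL sketch: extract from two sources, apply a quality
# gate and normalization, then upsert by key. Source names (crm, erp) and
# fields (customer_id, email) are hypothetical.

def extract(*sources):
    """Concatenate raw records from several source systems."""
    for source in sources:
        yield from source

def transform(records):
    """Drop records missing a primary key; normalize email addresses."""
    for rec in records:
        if rec.get("customer_id") is None:
            continue  # quality gate: reject rows without a key
        yield {**rec, "email": rec.get("email", "").strip().lower()}

def load(records):
    """Upsert into the target store: the last record for a key wins."""
    store = {}
    for rec in records:
        store[rec["customer_id"]] = rec
    return store

crm = [{"customer_id": 1, "email": " Ana@Example.com "}]
erp = [
    {"customer_id": 1, "email": "ana@example.com"},
    {"customer_id": None, "email": "orphan@example.com"},
    {"customer_id": 2, "email": "Luis@Example.com"},
]

warehouse = load(transform(extract(crm, erp)))
```

Production pipelines add incremental loading, schema enforcement, and retries, but the extract/transform/load seams stay the same.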
Core qualifications
+ Proficiency in **Python, Java, or Scala** for big data processing.
+ **Big Data Frameworks:** Strong expertise in **Apache Spark**, Hadoop, Hive, Flink, or Kafka.
+ Hands-on experience with data modeling, data lakes (**Delta Lake**, Iceberg, Hudi), and data warehouses (**Snowflake**, Redshift, BigQuery).
+ **ETL/ELT Development:** Expertise with tools like Informatica, Talend, SSIS, Apache NiFi, dbt, or custom Python-based frameworks.
+ **APIs & Integration:** Strong hands-on experience with REST, SOAP, and GraphQL APIs, and with integration platforms (MuleSoft, Dell Boomi, SnapLogic).
+ **Data Pipelines:** Proficiency in batch and real-time integration (Kafka, AWS Kinesis, Azure Event Hubs, GCP Pub/Sub).
+ **Databases:** Deep knowledge of SQL (Oracle, PostgreSQL, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) systems.
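The real-time side of the pipelines bullet comes down to a commit-after-process consumer loop. The toy broker below is purely illustrative; Kafka, Kinesis, and Pub/Sub each have different APIs, but they share the offset/acknowledgement idea sketched here.

```python
# Toy append-only log and consumer illustrating at-least-once processing.
# Real brokers (Kafka, Kinesis, Pub/Sub) have different APIs; the
# commit-after-process offset pattern shown here is the shared idea.

class Topic:
    """An append-only event log standing in for a broker partition."""
    def __init__(self):
        self.log = []

    def produce(self, event):
        self.log.append(event)

def consume(topic, handler, committed_offset=0):
    """Process events past the committed offset; advance only on success."""
    offset = committed_offset
    while offset < len(topic.log):
        handler(topic.log[offset])
        offset += 1  # "commit" moves forward only after processing
    return offset

topic = Topic()
for value in [3, 5, 8]:
    topic.produce(value)

seen = []
next_offset = consume(topic, seen.append)               # drains 3, 5, 8
topic.produce(13)                                       # a late event arrives
next_offset = consume(topic, seen.append, next_offset)  # resumes at 13
```

Because the offset is committed only after the handler succeeds, a crash mid-batch replays unacknowledged events rather than losing them, which is why such pipelines aim for idempotent handlers.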
Preferred experience
+ Expertise with at least one major cloud platform (AWS, Azure, GCP).
+ Experience with data services such as AWS EMR/Glue, GCP Dataflow/Dataproc, or Azure Data Factory.
+ Familiarity with containerization (Docker) and orchestration (Kubernetes).
+ Knowledge of CI/CD pipelines for data engineering.
+ Experience with OCI and Oracle Database (including JSON/REST, sharding) and/or Oracle microservices tooling.
How we'll assess
+ Systems design interview: architect a scalable service; justify data models, caching, and failure handling.
+ Coding exercise: implement and optimize a core algorithm/data‑structure problem; discuss trade‑offs.
+ Code review: evaluate readability, testing, error handling, and security considerations.
+ Practical discussion: walk through a past end‑to‑end project, metrics/SLOs, incidents, and learnings.
Career Level - IC3
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing or by calling in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Senior AI/Big Data Engineer
Posted yesterday
Job Description
Key responsibilities
+ Architect, design, and optimize scalable big data solutions for batch and real-time processing.
+ Develop and maintain ETL/ELT pipelines to ingest, transform, and synchronize data from diverse sources.
+ Integrate data from cloud applications, on-prem systems, APIs, and streaming workspaces into centralized data repositories.
+ Implement and manage **data lake** and **data warehouse** solutions on cloud infrastructure.
+ Ensure **data consistency, quality, and compliance** with governance and security standards.
+ Collaborate with data architects, data engineers, and business stakeholders to align integration solutions with organizational needs.
Core qualifications
+ Proficiency in **Python, Java, or Scala** for big data processing.
+ **Big Data Frameworks:** Strong expertise in **Apache Spark**, Hadoop, Hive, Flink, or Kafka.
+ Hands-on experience with data modeling, data lakes (**Delta Lake**, Iceberg, Hudi), and data warehouses (**Snowflake**, Redshift, BigQuery).
+ **ETL/ELT Development:** Expertise with tools like Informatica, Talend, SSIS, Apache NiFi, dbt, or custom Python-based frameworks.
+ **APIs & Integration:** Strong hands-on experience with REST, SOAP, and GraphQL APIs, and with integration platforms (MuleSoft, Dell Boomi, SnapLogic).
+ **Data Pipelines:** Proficiency in batch and real-time integration (Kafka, AWS Kinesis, Azure Event Hubs, GCP Pub/Sub).
+ **Databases:** Deep knowledge of SQL (Oracle, PostgreSQL, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) systems.
Preferred experience
+ Expertise with at least one major cloud platform (AWS, Azure, GCP).
+ Experience with data services such as AWS EMR/Glue, GCP Dataflow/Dataproc, or Azure Data Factory.
+ Familiarity with containerization (Docker) and orchestration (Kubernetes).
+ Knowledge of CI/CD pipelines for data engineering.
+ Experience with OCI and Oracle Database (including JSON/REST, sharding) and/or Oracle microservices tooling.
How we'll assess
+ Systems design interview: architect a scalable service; justify data models, caching, and failure handling.
+ Coding exercise: implement and optimize a core algorithm/data‑structure problem; discuss trade‑offs.
+ Code review: evaluate readability, testing, error handling, and security considerations.
+ Practical discussion: walk through a past end‑to‑end project, metrics/SLOs, incidents, and learnings.
Career Level - IC3
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing or by calling in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Senior AI/Big Data Engineer
Posted yesterday
Job Description
We are seeking a highly skilled **Senior Big Data Engineer** to design, develop, and manage enterprise-grade data integration solutions. The ideal candidate will have extensive experience with ETL/ELT processes, API-driven integrations, and enterprise data platforms.
Key responsibilities
+ Architect, design, and optimize scalable big data solutions for batch and real-time processing.
+ Develop and maintain ETL/ELT pipelines to ingest, transform, and synchronize data from diverse sources.
+ Integrate data from cloud applications, on-prem systems, APIs, and streaming workspaces into centralized data repositories.
+ Implement and manage **data lake** and **data warehouse** solutions on cloud infrastructure.
+ Ensure **data consistency, quality, and compliance** with governance and security standards.
+ Collaborate with data architects, data engineers, and business stakeholders to align integration solutions with organizational needs.
Core qualifications
+ Proficiency in **Python, Java, or Scala** for big data processing.
+ **Big Data Frameworks:** Strong expertise in **Apache Spark**, Hadoop, Hive, Flink, or Kafka.
+ Hands-on experience with data modeling, data lakes (**Delta Lake**, Iceberg, Hudi), and data warehouses (**Snowflake**, Redshift, BigQuery).
+ **ETL/ELT Development:** Expertise with tools like Informatica, Talend, SSIS, Apache NiFi, dbt, or custom Python-based frameworks.
+ **APIs & Integration:** Strong hands-on experience with REST, SOAP, and GraphQL APIs, and with integration platforms (MuleSoft, Dell Boomi, SnapLogic).
+ **Data Pipelines:** Proficiency in batch and real-time integration (Kafka, AWS Kinesis, Azure Event Hubs, GCP Pub/Sub).
+ **Databases:** Deep knowledge of SQL (Oracle, PostgreSQL, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) systems.
Preferred experience
+ Expertise with at least one major cloud platform (AWS, Azure, GCP).
+ Experience with data services such as AWS EMR/Glue, GCP Dataflow/Dataproc, or Azure Data Factory.
+ Familiarity with containerization (Docker) and orchestration (Kubernetes).
+ Knowledge of CI/CD pipelines for data engineering.
+ Experience with OCI and Oracle Database (including JSON/REST, sharding) and/or Oracle microservices tooling.
How we'll assess
+ Systems design interview: architect a scalable service; justify data models, caching, and failure handling.
+ Coding exercise: implement and optimize a core algorithm/data‑structure problem; discuss trade‑offs.
+ Code review: evaluate readability, testing, error handling, and security considerations.
+ Practical discussion: walk through a past end‑to‑end project, metrics/SLOs, incidents, and learnings.
Career Level - IC4
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing or by calling in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Technical Program Manager, Big Data Analytics, Measured Work
Posted 7 days ago
Job Description
Google · Mexico · Remote eligible
**Mid**
Experience driving progress, solving problems, and mentoring more junior team members; deeper expertise and applied knowledge within relevant area.
Please submit your resume in English - we can only consider applications submitted in this language.
Only applications of candidates with Mexican citizenship will be evaluated for this role in compliance with the provisions of Article 7 of the Federal Labor Law.
Note: Google's hybrid workplace includes remote roles.
**Remote location: Mexico.**
**Minimum qualifications:**
+ Bachelor's degree in a technical field, or equivalent practical experience.
+ 2 years of experience in program management.
+ 2 years of experience in big data and analytics.
**Preferred qualifications:**
+ Master's degree in Information Systems, Operations Research, Computer Science, Mathematics, Statistics, Engineering or a related field.
+ 5 years of experience with technical program management in a data-related domain.
+ Experience with SQL and other databases, automation or business intelligence skills (e.g. JavaScript, Python, R).
+ Experience with SAP, SQL, dashboards and advanced data analytics/statistical analysis.
+ Knowledge of Enterprise Resource Planning (ERP) systems such as SAP and SQL programming.
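The SQL-plus-Python automation that the preferred qualifications describe can be as small as a scripted aggregation. A hypothetical sketch using only Python's standard library; the `work_units` table and its metric are invented for illustration, not drawn from the posting.

```python
# Hypothetical sketch of the SQL-plus-Python automation the preferred
# qualifications describe, using only the standard library. The
# work_units table and its metric are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE work_units (site TEXT, hours REAL)")
conn.executemany(
    "INSERT INTO work_units VALUES (?, ?)",
    [("MX1", 4.0), ("MX1", 6.0), ("MX2", 7.0)],
)

# Average measured hours per site, highest first.
rows = conn.execute(
    "SELECT site, AVG(hours) FROM work_units GROUP BY site ORDER BY 2 DESC"
).fetchall()
```

In practice the same query shape would run against a production warehouse and feed a dashboard; the point is that the analysis is codified and repeatable rather than hand-built.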
**About the job**
A problem isn't truly solved until it's solved for all. That's why Googlers build products that help create opportunities for everyone, whether down the street or across the globe. As a Technical Program Manager at Google, you'll use your technical expertise to lead complex, multi-disciplinary projects from start to finish. You'll work with stakeholders to plan requirements, identify risks, manage project schedules, and communicate clearly with cross-functional partners across the company. You're equally comfortable explaining your team's analyses and recommendations to executives as you are discussing the technical tradeoffs in product development with engineers.
As a Technical Program Manager in Measured Work, you will build systems and mechanisms to obtain, process, analyze, and interpret large volumes of data to help solve problems for Google's cloud business operations. You will partner with customers inside and outside the organization to build tools that provide insights into the data center floor.
The AI and Infrastructure team is redefining what's possible. We empower Google customers with breakthrough capabilities and insights by delivering AI and Infrastructure at unparalleled scale, efficiency, reliability and velocity. Our customers include Googlers, Google Cloud customers, and billions of Google users worldwide.
We're the driving force behind Google's groundbreaking innovations, empowering the development of our cutting-edge AI models, delivering unparalleled computing power to global services, and providing the essential platforms that enable developers to build the future. From software to hardware our teams are shaping the future of world-leading hyperscale computing, with key teams working on the development of our TPUs, Vertex AI for Google Cloud, Google Global Networking, Data Center operations, systems research, and much more.
**Responsibilities**
+ Work with stakeholders to define, document and implement solutions based on data.
+ Work cross-functionally with global program managers, data engineers and data scientists to understand, implement and deploy actionable enterprise data management solutions.
+ Drive improvements for both system and process to achieve data integrity, data transparency and efficient user experience.
+ Advocate a data-centric culture that embraces process excellence and data-driven decisions to achieve business objectives.
+ Promote clarity and focus on delivering incremental value to the organization.
Information collected and processed as part of your Google Careers profile, and any job applications you choose to submit, is subject to Google's Applicant and Candidate Privacy Policy (./privacy-policy).
Google is proud to be an equal opportunity and affirmative action employer. We are committed to building a workforce that is representative of the users we serve, creating a culture of belonging, and providing an equal employment opportunity regardless of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, veteran status, marital status, pregnancy or related condition (including breastfeeding), expecting or parents-to-be, criminal histories consistent with legal requirements, or any other basis protected by law. See also Google's EEO Policy, Know your rights: workplace discrimination is illegal, Belonging at Google, and How we hire.
If you have a need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Google is a global company and, in order to facilitate efficient collaboration and communication globally, English proficiency is a requirement for all roles unless stated otherwise in the job posting.
To all recruitment agencies: Google does not accept agency resumes. Please do not forward resumes to our jobs alias, Google employees, or any other organization location. Google is not responsible for any fees related to unsolicited resumes.
¿Este trabajo es un acierto o un fracaso?
 
            
        
                                            
            
                 
            
        
Data Management
Posted 17 days ago
Job Description
Be part of Stefanini!
At Stefanini we are more than 30,000 talented people, connected across 40 countries, doing what we love and co-creating a better future.

Don't miss out: we are looking for a Data Governance Specialist!

Requirements and competencies:

- Data Management
- Advanced SQL and/or Snowflake (required), data modeling, data governance (the DAMA-DMBOK is desirable), data quality, and metadata
- Degree in Computer Science, Data Analytics, Systems Engineering, Mathematics, or a related field
- Analytical thinking focused on problem solving and proposing improvements to information flows
- Effective communication with technical and business areas
- Ability to produce documentation for metadata and business glossaries
- Ability to work with multidisciplinary and agile teams
- Experience in the financial sector
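The data-quality requirement in this role typically reduces to codified, repeatable rules. A sketch using Python's built-in `sqlite3`; the `clientes` table and `rfc` column are invented for illustration, and a real deployment would run equivalent checks in Snowflake or a quality tool.

```python
# Sketch of data-quality rules of the kind a governance specialist might
# codify: completeness and uniqueness checks on a key column. The clientes
# table and rfc column are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clientes (id INTEGER, rfc TEXT)")
conn.executemany(
    "INSERT INTO clientes VALUES (?, ?)",
    [(1, "AAA010101AAA"), (2, None), (3, "AAA010101AAA")],
)

# Completeness: rows whose key attribute is missing.
nulls = conn.execute(
    "SELECT COUNT(*) FROM clientes WHERE rfc IS NULL"
).fetchone()[0]

# Uniqueness: distinct key values that appear more than once.
dupes = conn.execute(
    "SELECT COUNT(*) FROM (SELECT rfc FROM clientes"
    " WHERE rfc IS NOT NULL GROUP BY rfc HAVING COUNT(*) > 1)"
).fetchone()[0]
```

Counts like these are what feed the quality metrics and metadata documentation the role calls for.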
 
Work mode: hybrid (3 days from home, 2 in the office). The first month is fully on-site in Santa Fe.

What will you get from working with us?

- Statutory benefits
- Fixed monthly bonuses for punctuality and attendance, plus a meal allowance
- Grocery vouchers
- Discounts with entertainment companies, gyms, language schools, universities, and more
- Training and professional growth plan
Student, Data Management
Posted today
Job Description
**Why join us?**
At Bombardier, we design, build and maintain the world's peak-performing aircraft for the world's most discerning people and businesses, governments and militaries. We have been successful in setting the highest standards by putting our people at the heart of it all, and defining excellence, together.
Working at Bombardier means operating at the highest level. Every day, you are part of a team that delivers superior experiences and products, pushing the boundaries of what's possible in our industry and beyond. By prioritizing employee growth and development, we empower everyone to reach their full potential on their own terms, because the best work happens when you are free to be yourself and share your unique expertise.
**Our Benefits**
With our employees' well-being top of mind, we offer a comprehensive and competitive Benefits Program, which includes the following:
+ Insurance plans _(Medical, life insurance, and more)_
+ Employee Assistance Program
+ Competitive base salary
**What are your contributions to the team?**
+ Understand and execute the different Data Entry processes for Bombardier employees
+ Process different types of requests to reflect the life cycle of Bombardier employees in HR systems
+ Identify, analyze and track exceptions
+ Analyze and propose solutions detecting possible system errors
**How to thrive in this role? Skills, knowledge & experience:**
+ You have a minimum of 0-1 years of experience
+ You are currently pursuing studies in Administration or a similar field
+ You hold an advanced level of English language, oral and written skills
+ You have working knowledge of Microsoft Office and Windows
+ You are detail-oriented, deadline-driven and work well under pressure with a high degree of precision
+ You can comfortably interact with employees of all levels of experience and seniority
+ You hold a basic level of French language. This is an asset
**Now that you can see yourself in this role, apply and join the Bombardier Team!**
Please note: You don't need _all_ the skills, knowledge, and experience listed to apply for this position! We're not looking for the perfect candidate, we're looking for great talent and passionate individuals.
Bombardier is an equal opportunity employer and encourages persons of any race, religion, ethnicity, gender identity, sexual orientation, age, immigration status, disability, or other applicable legally protected characteristics to apply.
**Job** Student, Data Management
**Primary Location** Aerospace Mexico (B.A.M)
**Organization** Aerospace Mexico (B.A.M)
**Shift**
**Employee Status** Temporary
**Requisition** 10129 Student, Data Management