27 Data Developer Job Openings in Mexico
Enterprise Data Warehouse Sustain
Today
Job Description
- At PepsiCo, we’re redefining what it means to be a consumer products company with a digital-first mindset, and our Global IT team is leading that charge. Our technology teams unlock digital capabilities, enhance cybersecurity safeguards, deliver data-driven insights, and create unmatched consumer and customer experiences.
Our culture is guided by the PepsiCo Way, which is a set of values that define our mission to win with purpose in the marketplace and act with integrity in everything we do. We’re creating smiles with every sip and every bite while advancing a sustainable, socially impactful agenda that promotes our goal of Winning with Purpose.
The Global IT team supports just that with our mission to create more smiles with every CLICK and every LIKE. Join our global, agile team and help us influence and drive PepsiCo’s digital transformation!
**Responsibilities**
**What will you be doing?**
- This role sits within the Enterprise Data and Analytics team's sustain organization and supports provisioning of the Azure PaaS platform and related projects.
- Focus on enterprise data warehouse stability and drive incident reduction for PepsiCo's enterprise data warehouse solutions on Teradata and Azure Synapse.
- Monitor critical ETL load status and ensure timely status communication (a hedged sketch of such a check follows this list).
- Track incidents and requests and provide a daily analytical view to your manager.
- Take ownership of support tasks and complete assigned tasks on schedule.
- Respond to priority incidents, communicate with business users, and collaborate with various IT teams to drive problem closure.
- Investigate issues such as data mismatches between systems and job failures; find the fix and document the solution.
- Create and maintain strong working relationships with business users, EDW support, and other IT teams.
- Use ServiceNow for Service Request, Incident, and Change Request management, and follow PepsiCo ITSM policies.
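The load-status check mentioned above could look roughly like the following sketch. It is only an illustration: the `etl.load_status` table, its columns, the `edw` DSN, and the success convention are hypothetical stand-ins, not PepsiCo's actual monitoring setup.

```python
# Hypothetical daily check of critical ETL loads; the table, columns and DSN
# are assumptions made for this sketch only.
from datetime import datetime

import pyodbc  # assumes an ODBC driver/DSN for Synapse or Teradata is configured

conn = pyodbc.connect("DSN=edw")
cursor = conn.cursor()
cursor.execute(
    "SELECT job_name, status FROM etl.load_status WHERE load_date = ?",
    datetime.today().date(),
)

# Collect any load that has not reported success so it can be escalated.
late_loads = [row.job_name for row in cursor.fetchall() if row.status != "SUCCESS"]

if late_loads:
    print("Loads needing attention:", ", ".join(late_loads))
else:
    print("All critical loads completed successfully.")
```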
**Qualifications**
**Who are we looking for?**
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
**What can you expect from us?**
- Competitive compensation package
- A flexible work environment that promotes a healthy balance between personal and professional life
- A dynamic and inclusive culture
- A supportive team that will foster your professional growth and development
- Opportunity to work on relevant projects worldwide
- Opportunity to give back to the community with our volunteer programs
Bilingual Data Warehouse Consultant
Today
Job Description
**Position: Data Warehouse consultant**
**Location: México**
**Industry - Sector: Financial Services - Banking**
**What you’ll do?**
- Data management
- Work with data warehouse
- Working with high volumes of information
- Data structure management
- Projects with agile methodologies
**What you’ll bring**:
- Data warehouse
- Knowledge of agile methodology
- Data structure management
**Soft skills**:
Work under pressure, quality at work, results-oriented
**What can YOU expect in a career with Capgemini?**
- Working in a team environment, Consultants will focus on the analysis, design and development of technology-based solutions for Capgemini’s clients.
- You will work alongside technical, functional and industry specialists to assist with the development, implementation and integration of innovative system solutions including methods, techniques and tools.
- You will contribute to client satisfaction by providing timely and responsive value-added services and work products.
- Capgemini offers a competitive compensation and benefits package.
- Headquartered in Paris, France, Capgemini has more than 340 thousand professionals worldwide; in Mexico its presence is distributed across 3 sites located in Mexico City, Monterrey and Aguascalientes. A deeply multicultural organization.
- Capgemini has developed its own way of working, the Collaborative Business Experience™, and draws on Rightshore, its worldwide delivery model.
**You will love this job because**
- Capgemini focuses on giving each new hire a YOU-nique experience through our recruitment process and on-boarding program, as well as by helping you to build your own career and professional skills foundation.
- Capgemini provides a collaborative environment that embodies and holds the following stated values close to heart: Honesty, Boldness, Trust, Freedom, Team Spirit, Modesty, and Fun.
- Capgemini cultivates an atmosphere for development that enables YOU to be hands-on, planning for your growth, both horizontally and vertically.
Python Developer - Data Analysis
Yesterday
Job Description
**Responsibilities**:
- Develop and maintain **Python scripts** to process, clean, and analyze large volumes of data (see the sketch after this list).
- Design and build **interactive dashboards** and **automated reports** that support business decision-making.
- Collaborate with cross-functional teams (product, engineering, business) to identify data needs and build efficient analytical solutions.
- Optimize queries and data analysis processes to improve the performance of BI platforms.
- Implement best practices in **data management**, ensuring data quality, integrity, and accessibility.
- Perform data analysis to detect patterns, trends, and potential areas for improvement in business processes.
- Participate in defining **data requirements** and in building technical solutions aligned with business objectives.
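As a rough illustration of the kind of Python data-processing script described in the first responsibility, the following sketch uses pandas; the file name and column names are hypothetical placeholders.

```python
# Minimal pandas cleaning/summary sketch; "sales.csv", "amount" and "region"
# are hypothetical placeholders for real inputs.
import pandas as pd

df = pd.read_csv("sales.csv")

# Remove duplicates and rows without an amount, normalise the region labels,
# then aggregate for a dashboard or automated report.
clean = (
    df.drop_duplicates()
      .dropna(subset=["amount"])
      .assign(region=lambda d: d["region"].str.strip().str.title())
)
summary = clean.groupby("region", as_index=False)["amount"].sum()
print(summary)
```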
**Requirements**:
- **More than 2 years of experience** developing in **Python** for data analysis.
- Experience manipulating and transforming data with **pandas**, **NumPy**, and other Python libraries.
- Advanced knowledge of **SQL** for working with relational databases.
- Familiarity with data visualization tools such as **Power BI**, **Tableau**, or **Looker**.
- Experience building **data models** and optimizing query performance.
- Knowledge of **cloud** data analytics platforms (AWS, GCP, Azure).
- Ability to work in **agile environments** and collaborate with technical and non-technical teams.
- Ability to present **data findings** clearly and effectively to non-technical audiences.
**Desired skills**:
- Experience with **data management** tools such as **Airflow**, **DBT**, or **Dataform** (a minimal Airflow sketch follows this list).
- Knowledge of data integration through **APIs** or external platforms such as **Google Sheets**.
- Familiarity with **Data Governance** and **Data Security** principles.
- Experience working on projects with **agile methodologies** (Scrum, Kanban).
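For the Airflow item above, a minimal DAG sketch might look like the following (assuming Airflow 2.x; the DAG id and the task callables are hypothetical placeholders for real extract/transform logic).

```python
# Minimal Airflow 2.x DAG sketch; dag_id and the callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def transform():
    print("clean the data and load it into the warehouse")

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```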
**We offer**:
- Competitive salary and PSL
- Professional growth opportunities in a dynamic, innovative environment.
- Collaborative work with cross-functional teams.
- Opportunity to directly impact the company's strategic decisions.
If you have experience building analytical solutions with Python and are passionate about working with data to improve business processes, we invite you to apply!
Job type: Full-time
Salary: $32,000.00 - $35,000.00 per month
Benefits:
- Savings fund
- Life insurance
- Grocery vouchers
Schedule:
- 8-hour shift
Work location: On-site
Big Data Engineer
Today
Job Description
- Advanced English
- 5+ years of experience in big data development
- Experience with Python for writing data pipelines and data processing layers
- Extensive experience with SQL, optimized for data transformation and the implementation of business rules
- Demonstrates up-to-date expertise in data engineering and the development of complex data pipelines
- Experience with agile models
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built
- Experience with Java and Python for writing data pipelines and data processing layers
- Experience with Airflow and GitHub
- Experience writing MapReduce jobs
- Demonstrates expertise in writing complex, highly optimized queries over large data sets
- Proven work experience with big data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase
- Highly proficient in SQL
- Experience with cloud technologies (GCP)
- Experience with the relational model; in-memory data stores desirable (Oracle, Cassandra, Druid)
- Provides and supports the implementation and operation of data pipelines and analytical solutions
- Experience in performance tuning of systems that work with large data sets
- Experience with REST API data services - data consumption
- Experience working with distributed object storage in cloud environments
- Work experience with big data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase, Automic, and Aorta
Activities
- Ability to guide data engineering teams.
- Ability to understand end requirements and analyze solutions end to end.
- Demonstrates up-to-date expertise in data engineering and the development of complex data pipelines.
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built (a minimal PySpark sketch follows this list).
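A minimal PySpark sketch of such a pipeline is shown below; the input/output paths, columns, and business rule are hypothetical and only illustrate the general shape of a scalable batch job.

```python
# Minimal PySpark batch pipeline sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_batch").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")

# Apply a simple business rule and aggregate; partitioning the output keeps
# downstream reads scalable.
daily = (
    orders.filter(F.col("status") == "COMPLETED")
          .groupBy("order_date", "store_id")
          .agg(F.sum("amount").alias("total_amount"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_sales")
```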
Soft skills
1. Communication: can explain engineering problems to partners and colleagues, write easy-to-understand documentation, and make effective use of the different communication channels.
2. Teamwork: collaborate with different areas to solve data challenges by running effective cross-functional meetings and giving and receiving feedback.
Senior Big Data Architect
Today
Job Description
Transformative Data Engineer
At Synechron, we believe in the power of digital to drive innovation. Our global consulting firm combines creativity and cutting-edge technology to deliver industry-leading solutions. We have a proven track record of delivering transformative projects for top financial services firms, leveraging our expertise in Artificial Intelligence, Cloud & DevOps, Data, and Software Engineering.
We are seeking an experienced Data Engineer with strong expertise in Databricks to join our data team. The ideal candidate will design, implement, and optimize large-scale data pipelines, ensuring scalability, reliability, and performance.
Key Responsibilities:
- Design and develop scalable ETL/ELT pipelines using Databricks, ensuring seamless integration with Azure Cloud Services (a minimal sketch follows this list).
- Develop complex data processing workflows using PySpark/Spark and SQL, leveraging expertise in big data processing.
- Implement data security, access controls, and governance standards using Unity Catalog, ensuring compliance with organizational and regulatory policies.
- Collaborate closely with cross-functional teams to understand business requirements and deliver tailored data solutions.
- Work with multiple stakeholders to prepare data for dashboard and BI Tools, driving business value through data insights.
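As a rough sketch of the pipeline work described above (assuming a Databricks runtime with Delta Lake and access to ADLS), the following PySpark snippet reads raw JSON from a hypothetical ADLS container, cleans it, and writes a Delta table; the paths, table name, and columns are assumptions, not the actual project.

```python
# Hypothetical Databricks/Delta ETL step; storage paths, table and columns are
# placeholders. SparkSession is provided by the Databricks runtime.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.json("abfss://raw@examplestore.dfs.core.windows.net/events/")

cleaned = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Writing a managed Delta table keeps the data queryable by BI tools and
# governable through Unity Catalog access controls.
cleaned.write.format("delta").mode("append").saveAsTable("analytics.events_clean")
```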
Requirements:
- Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse Architecture, Table Triggers, Delta Live Pipelines, Databricks Runtime etc.)
- Proficiency in Azure Cloud Services, including Azure Blob Storage, ADLS, and relational/non-relational systems.
- Solid understanding of Spark and PySpark for big data processing, including experience in developing real-time data solutions.
- Experience in developing data pipeline solutions using Databricks Asset Bundles and GitLab.
- Azure Data Engineer Associate or Databricks certified Data Engineer Associate certification (optional).
Benefits:
- A highly competitive compensation package.
- A multinational organization with 58 offices in 21 countries, offering opportunities for career growth and professional development.
- Extensive training opportunities focused on skills, substantive knowledge, and personal development.
- Laptop/equipment.
- 12 days of paid annual leave (plus sick leave and national holidays).
- Maternity & Paternity leave plans.
- A comprehensive insurance plan including medical, dental, vision, and long-/short-term disability plans.
- Retirement savings plans.
- A higher education certification policy.
Big Data Solution Designer
Today
Job Description
Job Description:
We are seeking an experienced Data Engineer to join our team.
- Design and implement a solution for processing big data volumes.
- Integrate the solution with the current architecture.
The ideal candidate will have:
- Strong expertise in designing and implementing data models.
- Experience with cloud-based analytical databases.
- Proficiency in SQL to perform complex queries, data transformations, and performance tuning.
- Experience integrating metadata and governance processes into cloud-based data platforms.
- Knowledge of Azure services, including Azure Data Lake Storage, Azure Synapse Analytics, and Azure Databricks.
In this role, you will have the opportunity to work on a large retail company's eCommerce platform modernization project. You will be responsible for designing a solution that supports processing big data volumes and integrates with the current architecture.
- Collaborate with cross-functional teams to ensure successful implementation.
- Develop and maintain high-quality code.
- Work closely with stakeholders to understand requirements and provide solutions.
Big Data Solutions Architect
Today
Job Description
Job Description
">We are seeking a highly skilled Databricks Engineer to join our data team. The ideal candidate will design, implement and optimize large-scale data pipelines, ensuring scalability, reliability and performance.
">Responsibilities:
">- ">
- Design, implement and maintain scalable ETL/ELT pipelines using Databricks. ">
- Leverage PySpark/Spark and SQL to transform and process large datasets. ">
- Integrate data from multiple sources including Azure Blob Storage, ADLS and other relational/non-relational systems. ">
Requirements:
">- ">
- Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse Architecture, Table Triggers, Delta Live Pipelines, Databricks Runtime etc.) ">
- Proficiency in Azure Cloud Services. ">
- Solid understanding of Spark and PySpark for big data processing. ">
- Experience in relational databases. ">
- Knowledge on Databricks Asset Bundles and GitLab. ">
- Familiarity with Databricks Runtimes and advanced configurations. ">
- Knowledge of streaming frameworks like Spark Streaming. ">
- Experience in developing real-time data solutions. ">
- Azure Data Engineer Associate or Databricks certified Data Engineer Associate certification (optional). ">
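For the streaming and real-time items above, a minimal Spark Structured Streaming sketch might look like this (assuming the Kafka connector package is available on the cluster; the broker address, topic, schema, and paths are hypothetical).

```python
# Hypothetical Kafka-to-Delta streaming job; broker, topic, schema and paths
# are placeholders for a real deployment.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("events_stream").getOrCreate()

schema = StructType().add("user_id", StringType()).add("amount", DoubleType())

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Checkpointing gives the stream fault tolerance across restarts.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/events")
    .outputMode("append")
    .start("/delta/events")
)
query.awaitTermination()
```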
Benefits:
">- ">
- Highly competitive compensation and benefits package. ">
- Multinational organization with 58 offices in 21 countries and the possibility to work abroad. ">
- Laptop/equipment. ">
- 12 days of paid annual leave (plus sick leave and national holidays). ">
- Maternity & Paternity leave plans. ">
- Comprehensive insurance plan including medical, dental, vision, and long-/short-term disability. ">
- Retirement savings plans. ">
- Higher education certification policy. ">
- Extensive training opportunities, focused on skills, substantive knowledge, and personal development. ">
- On-demand Udemy for Business for all employees with free access to over 5,000 curated courses. ">
- Coaching opportunities with experienced colleagues from Financial Innovation Labs (FinLabs) and Center of Excellences (CoE) groups. ">
- Cutting-edge projects at leading tier-one banks, financial institutions and insurance firms. ">
- Flat and approachable organization. ">
- Diverse, fun-loving and global work culture. ">
Other Opportunities:
">- ">
- Global workforce of 14,500+ employees. ">
- 58 offices in 21 countries worldwide. ">
- Multi-cultural and dynamic work environment. ">
- Many career growth opportunities. ">
- Regular learning and development opportunities. ">
Jr. Big Data Developer
Today
Job Description
**Knowledge**:
At least 6 months of related experience (social service, professional internship, or similar positions)
- Object-oriented programming
- Knowledge of relational databases (Oracle, Teradata / SQL)
- Knowledge of object-oriented programming languages such as Java / C# / Python
- Knowledge of a functional programming language (Scala)
- Hadoop / Spark
- GitHub
- Control-M
- Advanced Git
- DevOps tools / Jenkins
- Data cleaning and manipulation
- Machine learning models (at minimum logistic regression and convolutional neural networks)
- Knowledge of Linux/UNIX systems at the command-line level (POSIX: awk, grep, etc.)
- Knowledge of RegEx
- Writing skills and clear communication of ideas
**Main responsibilities**:
- Build and develop data processing jobs, with the required dependencies and libraries, that deliver business value through analysis, development, data quality, and performance improvement.
- Develop code with Spark combined with Scala / Python.
- Understanding of data quality.
- Creation of data dictionaries.
- Software development following good practices.
- Intermediate knowledge of the Linux/UNIX shell: POSIX tools (awk, grep, etc.) and RegEx (see the sketch after this list).
- Analytical skills and critical thinking.
- Ability to relate technical information to business information.
- Communication between the technical and functional sides (writing meeting minutes, preparing documentation).
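As a small illustration of the RegEx and data-cleaning skills listed above, the following Python sketch normalises phone-number strings; the sample values and the pattern are hypothetical.

```python
# Minimal regex data-cleaning sketch; sample records and pattern are hypothetical.
import re
from typing import Optional

PHONE_RE = re.compile(r"^\+?\d{10,13}$")

def normalize_phone(value: str) -> Optional[str]:
    digits = re.sub(r"[^\d+]", "", value)  # strip spaces, dashes and parentheses
    return digits if PHONE_RE.match(digits) else None

records = ["+52 55 1234 5678", "55-1234-5678", "not a number"]
print([normalize_phone(r) for r in records])
```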
**Education**
Minimum education: Computer engineering, a bachelor's degree in informatics, mathematics, or a related field.
Job type: Full-time
Salary: Up to $25,000.00 per month
Benefits:
- Vacation days above the statutory minimum
- Grocery vouchers
Schedule:
- 8-hour shift
Experience:
- Java: 1 year (required)
- SQL: 1 year (required)
- HTML: 1 year (required)
Big Data Engineer (Backend Sr.)
Today
Job Description
We are accredited at Level 5 of CMMI (Capability Maturity Model Integration), the quality model established by the Software Engineering Institute (SEI). We are ISO certified under ISO/IEC 27001:2013, and we hold the Socially Responsible Company (ESR) distinction.
WE ARE LOOKING FOR YOU!
Big Data Engineer (Backend Sr)
**Requirements**:
- Bachelor's degree in Systems Engineering, Actuarial Science, Mathematics, or a related field
- Minimum 3 years of experience in big data development
- Demonstrates up-to-date expertise in data engineering and the development of complex data pipelines
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built
- Experience with Java and Python for writing data pipelines and data processing layers
- 3-4 years of business analysis experience focused on the data domain and business intelligence systems
- Highly proficient in SQL
- Experience with cloud technologies (GCP, Azure)
- Work experience with big data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase, Automic, and Aorta
- 3-4 years of recent experience interviewing business users and gathering report and KPI requirements
- Demonstrable experience working with data (analysis, writing queries, joining different data sources to make sense of them) in order to validate requirements
- Proven experience documenting requirements and workflows and maintaining wikis
- Ability to take part in and lead requirements sessions with business users and present final results
- Experience translating requirements into solutions for data and BI engineers, and participating in solution development
- Build analytical models to support business initiatives
- Build, deliver, and analyze key metrics using dashboards and reports
- Experience with streaming data processing
- Experience with metadata management tools such as MITI and monitoring tools such as Ambari
- Experience developing REST API data services (a minimal sketch follows this list)
- Retail experience is a big plus
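As a hedged illustration of the REST API data service item above, here is a minimal Flask endpoint; the metric names and the in-memory dictionary are hypothetical stand-ins for a real query against the data platform.

```python
# Minimal REST data-service sketch with Flask; metrics are hypothetical and
# would normally come from a query against the warehouse.
from flask import Flask, jsonify

app = Flask(__name__)

SAMPLE_METRICS = {"daily_sales": 1234.5, "active_users": 42}

@app.route("/metrics/<name>")
def get_metric(name: str):
    if name not in SAMPLE_METRICS:
        return jsonify({"error": "unknown metric"}), 404
    return jsonify({"metric": name, "value": SAMPLE_METRICS[name]})

if __name__ == "__main__":
    app.run(port=8080)
```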
Location:
- Mexico City (100% home office).
We offer:
- Commercial partner agreements
- Statutory benefits and above
- Discounts on certifications
We are an inclusive company, free of any kind of discrimination, where opportunities go to those who show attitude and talent. We do not request medical certificates of non-pregnancy or Human Immunodeficiency Virus (HIV) status as requirements to join our great work family and access the same growth opportunities.
Technical Program Manager, Big Data Analytics, Measured Work
Yesterday
Job Description
**Please submit your resume in English - we can only consider applications submitted in this language.**
**Minimum qualifications:**
+ Bachelor's degree in a technical field, or equivalent practical experience.
+ 2 years of experience in program management.
+ 2 years of experience in big data and analytics.
**Preferred qualifications:**
+ Master's degree in Information Systems, Operations Research, Computer Science, Mathematics, Statistics, Engineering or a related field.
+ 5 years of experience with technical program management in a data-related domain.
+ Experience with SQL and other databases, automation or business intelligence skills (e.g. JavaScript, Python, R).
+ Experience with SAP, SQL, dashboards and advanced data analytics/statistical analysis.
+ Knowledge of Enterprise Resource Planning (ERP) systems such as SAP and SQL programming.
A problem isn't truly solved until it's solved for all. That's why Googlers build products that help create opportunities for everyone, whether down the street or across the globe. As a Technical Program Manager at Google, you'll use your technical expertise to lead complex, multi-disciplinary projects from start to finish. You'll work with stakeholders to plan requirements, identify risks, manage project schedules, and communicate clearly with cross-functional partners across the company. You're equally comfortable explaining your team's analyses and recommendations to executives as you are discussing the technical tradeoffs in product development with engineers.
As a Technical Program Manager in Measured Work, you will be building systems and mechanisms to obtain, process, analyze and interpret large volumes of data to help solve a problem for Google's cloud business operations. You will be partnering with the customers in and outside of the organization to build tools that provide insights into the data center floor.
Google Research is building the next generation of intelligent systems for all Google products. To achieve this, we're working on projects that utilize the latest computer science techniques developed by skilled software developers and research scientists. Google Research teams collaborate closely with other teams across Google, maintaining the flexibility and versatility required to adapt new projects and foci that meet the demands of the world's fast-paced business needs.
**Responsibilities:**
+ Work with stakeholders to define, document and implement solutions based on data.
+ Work cross-functionally with global program managers, data engineers and data scientists to understand, implement and deploy actionable enterprise data management solutions.
+ Drive improvements for both system and process to achieve data integrity, data transparency and efficient user experience.
+ Advocate a data-centric culture that embraces process excellence and data-driven decisions to achieve business objectives.
+ Promote clarity and focus on delivering incremental value to the organization.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you have a need that requires accommodation, please let us know by completing our Accommodations for Applicants form.