188 Big Data Job Offers in Mexico
Big Data Analytics
Today
Job viewed
Job Description
A major data analytics company is hiring:
An industry-leading company is looking for your talent as a Data Analyst.
**Requirements**:
- Bachelor's degree in Data Science, Data Intelligence Engineering, or a related field (required).
- At least 1 year of experience in the field.
- Proficiency in English.
- Full command of the company's information sources.
- Integrate and systematize the relevant business rules for correct interpretation of the information.
- Integrate software and applications into an automated reporting system.
- Automate reports for specific business needs.
- Develop the data connection models for the systems.
- Administer the area's information systems and databases.
- Monitor the correct operation of the area's applications.
- Extraction, handling, analysis, and interpretation of large volumes of data.
- Database administration and management, programming, and report automation.
- Experience with ETL tools such as Alteryx, Pentaho, or SSIS.
- Experience with web programming languages such as Java, HTML5, or Visual Basic.
- Basic experience with BI platforms such as MicroStrategy, Cognos, or Tableau.
- Experience with job schedulers such as Control-M or cron.
- Basic Unix and server administration.
- Advanced Excel.
- Basic database administration with SQL Server, MySQL, or Oracle.
Competencies:
- Proactive
- Strategic
- Analytical
- Eager to learn and grow
We Offer:
- Salary of MXN $46,800 per month
- Schedule: Monday to Friday (weekends off)
- Statutory benefits
- Above-statutory benefits for permanent staff
- Excellent work environment
- Opportunity for growth
Interested candidates: apply through this site and attach an updated CV.
**Desired education level**:
Bachelor's degree (with diploma)
**Desired experience level**:
Expert
**Departmental function**:
Business management
**Industry**:
Higher Education
Big Data
Job Description
We are looking for:
**Big Data**
**Mode: on-site** _(non-negotiable)_
**Area: Periférico Sur**
**Requirements**:
- 3+ years of experience
- Knowledge of Scala
- Control-M integration
- Knowledge of Cloudera
- Development in Java or Python
- Process automation
- Knowledge of Apache Spark
- Knowledge of SQL databases
**Responsibilities**:
- Create stored procedures and views
- Optimize queries and improve performance
- Validate developments in the production environment
- Maintain and update developed modules
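The first two responsibilities above can be sketched in a few lines. This is a minimal illustration using SQLite (which has no stored procedures, so only a reusable view and index-based query optimization are shown); the `sales` table and its columns are hypothetical, not taken from the posting.

```python
import sqlite3

# In-memory database with a small illustrative table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales (region, amount) VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0), ("north", 50.0)])

# A view encapsulating a reusable aggregation (stands in for a stored view)
conn.execute("""CREATE VIEW region_totals AS
                SELECT region, SUM(amount) AS total
                FROM sales GROUP BY region""")

# Query optimization: an index on the filtered/grouped column lets the
# planner avoid a full table scan on region lookups
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

rows = dict(conn.execute("SELECT region, total FROM region_totals").fetchall())
print(sorted(rows.items()))  # [('north', 170.0), ('south', 80.0)]
```

In a production engine such as SQL Server or Oracle, the aggregation could instead live in a stored procedure, and query plans would be inspected before adding indexes.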
**Statutory benefits, base salary, and professional growth**
Position type: Full-time
Salary: MXN $30,000.00 - $40,000.00 per month
Schedule:
- Monday to Friday
- 8-hour shift
Work location: On-site
Big Data Analyst
Job Description
**Big Data Analyst**
With the following profile:
- Bachelor's degree in Computer Science, Mathematical Engineering, or a related field.
- Knowledge of data transformation and data-driven practices.
- Specialized knowledge of the retail sector.
- Proficiency with Google Analytics, Google Tag Manager, and Google Ads.
- Knowledge of marketplace platforms.
- SQL, R, and Python.
Activities:
- Analyze results, research trends, manage and prioritize the team's tasks, and generate growth insights for the business.
- Monitor budget allocation, paid advertising campaign performance, and costs.
- Conduct competitor research and analyze data (benchmarking).
WE OFFER:
- Base salary of MXN $20,000 - $25,000 gross
- Statutory benefits
WORK SCHEDULE:
Monday to Friday
Work area: Nueva Industrial Vallejo
Big Data Engineer
Job Description
**About us**:
We believe that the future is digital!
We're a global software engineering company born in **Eastern Europe**, creating success stories for over 20 years. Since 1998, Itransition has served over 800 clients, 50 of them for more than 5 years, including IATA, AiBUY, adidas, Wargaming, Philips, and PayPal.
The geography of our projects extends across 40 countries and counting, while our US and European offices house 3,000+ professionals working daily to deliver business value through technology.
Commitment, Excellence, Passion, and Clarity drive us!
**Responsibilities**:
- Design and develop ETL pipelines
- Data integration and cleansing
- Implement stored procedures and functions for data transformations
- Optimize the performance of ETL processes
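The extract-transform-load pattern behind these responsibilities can be sketched as three small functions. This is an illustrative toy, not Itransition's pipeline: the CSV payload, the `payments` table, and all names are made up, and SQLite stands in for a real warehouse.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; note the malformed row with a missing amount
RAW = "user_id,amount\n1, 10.5\n2,\n3,7.25\n"

def extract(text):
    # Extract: parse the raw feed into dict records
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform/cleanse: drop rows with missing amounts, normalize types
    return [(int(r["user_id"]), float(r["amount"]))
            for r in rows if r["amount"].strip()]

def load(records, conn):
    # Load: write the cleaned records into the target table
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.75
```

Real pipelines add incremental loading, error quarantining, and orchestration, but the three-stage shape is the same.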
**Requirements**:
- 5+ years of Data Engineer experience
- Understanding of DWH and ETL development principles and methodologies
- Experience with Big Data technologies such as Hive/Spark
- Understanding of analytical and transactional processing
- Advanced experience with RDBMS or NoSQL databases
- Advanced experience with ETL platforms and tools
- Advanced in SQL querying
- Advanced in data modeling
- Experience with at least one programming language (Python preferred)
- Experience with at least one VCS (Git, SVN, etc.)
- Experience with CI/CD pipelines and tooling (Jenkins, GitLab, etc.)
- English skills sufficient for spoken communication (Intermediate level and above)
**Nice to have**:
- Experience with Databricks
- Experience with Cloud DWH solutions (e.g. Azure Synapse, AWS Redshift, Snowflake)
- Experience with BI visualization tools (e.g. Power BI, SSRS, Tableau or Qlik)
**What we offer**:
- The opportunity to work for a Global Company beyond the continent.
- Long-term contract and benefits
- Christmas and vacation bonuses
- Competitive compensation based on your qualifications and skills
- Projects for clients such as PayPal, Wargaming, Xerox, Philips, adidas, and Toyota
- Career development system with clear skill qualifications
- Flexible working hours aligned to your schedule
- 100% Remote job
- Internal conferences, workshops and meetups for learning and experience sharing
- More to come!
Position type: Full-time
Schedule:
- 8-hour shift
Benefits:
- Salary raises
- Sick days
- Flexible hours
- Referral program
- Work from home
- Additional vacation or paid time off
Big Data Engineer
Job Description
We are currently seeking a Big Data Engineer to join our team in Guadalajara, México (MX-MEX), Mexico (MX).
Job Description for Big Data Engineer with Spark and Scala development experience:
- 10+ years of total experience in Java/Scala.
- 5+ years of hands-on experience in Hadoop programming.
- Hands-on experience with Java, Scala, and Spark.
- Hands-on experience with Kafka, NiFi, AWS, Maven, Stash, and Bamboo.
- Hands-on experience writing MapReduce jobs.
- Good knowledge of Spark architecture.
- Writing high-performance, reliable, and maintainable code.
- Good knowledge of database structures, theories, principles, and practices.
- Good understanding of Hadoop, YARN, and AWS EMR.
- Familiarity with data loading tools like Talend and Sqoop.
- Familiarity with cloud databases like AWS Redshift and Aurora MySQL.
- Familiarity with Apache Zeppelin/EMR Notebook.
- Knowledge of workflow schedulers like Oozie or Apache Airflow.
- Analytical and problem-solving skills applied to the Big Data domain.
- Strong exposure to object-oriented concepts and implementation.
- Proven understanding of Hadoop, HBase, and Hive.
- Good aptitude in multi-threading and concurrency concepts.
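The MapReduce model the requirements above refer to can be shown in miniature: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. This pure-Python word count is only a teaching sketch; real jobs run distributed on Hadoop or Spark.

```python
from itertools import groupby

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: bring identical keys together, as the framework would
    pairs = sorted(pairs)
    return {k: [v for _, v in grp] for k, grp in groupby(pairs, key=lambda p: p[0])}

def reduce_phase(grouped):
    # Reduce: aggregate the value list for each key
    return {word: sum(counts) for word, counts in grouped.items()}

counts = reduce_phase(shuffle(map_phase(["big data big", "data jobs"])))
print(counts["big"], counts["data"], counts["jobs"])  # 2 2 1
```

The same mapper and reducer, ported to Hadoop Streaming or Spark, would scale out because each phase is embarrassingly parallel.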
**About NTT DATA Services**
NTT DATA Services is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team.
Big Data Engineer
Job Description
We are looking for individuals who can design and solve any data problems using different types of databases and technologies supported within our team. We use MPP databases to analyze billions of rows in seconds. We use Spark and Iceberg, batch or streaming to process whatever the data needs are. We also use Trino to connect all different types of data without moving them around.
Besides a competitive compensation package, you’ll be working with a great group of technologists interested in finding the right database to use and the right technology for the job in a culture that encourages innovation. If you’re ready to step up and take on some new technical challenges at a well-respected company, this is a unique opportunity for you.
**Responsibilities**:
- Implement ETL/ELT processes using various tools and programming languages (Scala, Python) against our MPP databases StarRocks, Vertica and Snowflake.
- Work with the Hadoop team and optimize Hive and Iceberg tables.
- Contribute to the existing Data Lake and Data Warehouse initiative using Hive, Spark, Iceberg, and Presto/Trino.
- Analyze business requirements, design and implement required data models.
**Qualifications: (must have)**
- BA/BS in Computer Science or related field.
- 1+ years of experience with MPP databases such as StarRocks, Vertica, Snowflake.
- 3+ years of experience with RDBMS databases such as Oracle, MSSQL or PostgreSQL.
- Programming background with Scala, Python, Java or C/C++.
- Strong in any of the Linux distributions, RHEL, CentOS or Fedora.
- Experience working in both OLAP and OLTP environments.
- Experience working on-prem, not just cloud environments.
**Desired: (nice to have)**
- Experience with Elasticsearch or ELK stack.
- Working knowledge of streaming technologies such as Kafka.
- Working knowledge of orchestration tools such as Oozie and Airflow.
- Experience with Spark: PySpark, Spark SQL, Spark Streaming, etc.
- Experience using ETL tools such as Informatica, Talend and/or Pentaho.
- Understanding of Healthcare data.
- Data Analyst or Business Intelligence would be a plus.
**Benefits**
- Income of MXN $65,000 per month, before tax
- Statutory and above-statutory benefits
- 100% payroll scheme
**Location: Zapopan, near Plaza del Sol**
**Send your resume**:
**Salary**: $50,000.00 - $65,000.00 per month
Work Location: In person
Big Data Architect
Job Description
**The Mission**:
At Caylent, a Big Data Architect works as an integral part of a cross-functional delivery team to design and implement big data solutions on the AWS cloud for our customers. You will design and document the big data and NoSQL solutions, and provide guidance to the engineers performing the hands-on implementation of your design. You will participate in daily standup meetings with your team and bi-weekly agile ceremonies with the customer. Your manager will have a weekly 1:1 with you to help guide you in your career and make the most of your time at Caylent.
**Your Assignment**:
- Work with a team to deliver top-quality data solutions on AWS for customers
- Participate in daily standup meetings and address technical issues
- Design, optimize, and migrate web-scale data processing operations
- Be able to write code whenever needed and possible
- Lead and help engineers without any direct supervision
**Your Qualifications**:
- Design and implementation of at least two of these:
- ETL, Orchestration and CI/CD pipelines
- Data Lakes, Data Warehouses
- Analytics and visualization
- Design and implementation of at least two of these:
- Data processing: e.g., Hadoop, Spark, EMR
- Streaming/messaging: e.g., Kafka, RabbitMQ, Kinesis
- NoSQL DBs: key-value stores, document databases, graph databases
- Caching: e.g., Redis, Memcached
- Search: e.g., Elasticsearch, Solr
- Design and implementation of at least one of these:
- Security, access controls and governance on cloud
- Experience with IaC tools such as CloudFormation, CDK, Terraform
- Experience with AWS Glue, Lambda, SDK
- Experience with CI/CD tools
- Excellent written and verbal communication skills
**Benefits**:
- 100% remote work
- Medical Insurance for you and eligible dependents
- Generous holidays and flexible PTO
- Competitive phantom equity
- Paid for exams and certifications
- Peer bonus awards
- State-of-the-art laptop and tools
- Equipment & Office Stipend
- Individual professional development plan
- Annual stipend for Learning and Development
- Work with an amazing worldwide team and in an incredible corporate culture
Caylent is a place where everyone belongs. We celebrate diversity and are committed to creating an inclusive environment for all employees. Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at Caylent.
Technical Program Manager, Big Data Analytics, Measured Work
Job Description
**Please submit your resume in English - we can only consider applications submitted in this language.**
**Minimum qualifications:**
+ Bachelor's degree in a technical field, or equivalent practical experience.
+ 2 years of experience in program management.
+ 2 years of experience in big data and analytics.
**Preferred qualifications:**
+ Master's degree in Information Systems, Operations Research, Computer Science, Mathematics, Statistics, Engineering or a related field.
+ 5 years of experience with technical program management in a data-related domain.
+ Experience with SQL and other databases, automation or business intelligence skills (e.g. JavaScript, Python, R).
+ Experience with SAP, SQL, dashboards and advanced data analytics/statistical analysis.
+ Knowledge of Enterprise Resource Planning (ERP) systems such as SAP and SQL programming.
A problem isn't truly solved until it's solved for all. That's why Googlers build products that help create opportunities for everyone, whether down the street or across the globe. As a Technical Program Manager at Google, you'll use your technical expertise to lead complex, multi-disciplinary projects from start to finish. You'll work with stakeholders to plan requirements, identify risks, manage project schedules, and communicate clearly with cross-functional partners across the company. You're equally comfortable explaining your team's analyses and recommendations to executives as you are discussing the technical tradeoffs in product development with engineers.
As a Technical Program Manager in Measured Work, you will be building systems and mechanisms to obtain, process, analyze and interpret large volumes of data to help solve a problem for Google's cloud business operations. You will be partnering with the customers in and outside of the organization to build tools that provide insights into the data center floor.
Google Research is building the next generation of intelligent systems for all Google products. To achieve this, we're working on projects that utilize the latest computer science techniques developed by skilled software developers and research scientists. Google Research teams collaborate closely with other teams across Google, maintaining the flexibility and versatility required to adapt new projects and foci that meet the demands of the world's fast-paced business needs.
**Responsibilities:**
+ Work with stakeholders to define, document and implement solutions based on data.
+ Work cross-functionally with global program managers, data engineers and data scientists to understand, implement and deploy actionable enterprise data management solutions.
+ Drive improvements for both system and process to achieve data integrity, data transparency and efficient user experience.
+ Advocate a data-centric culture that embraces process excellence and data-driven decisions to achieve business objectives.
+ Promote clarity and focus on delivering incremental value to the organization.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also and If you have a need that requires accommodation, please let us know by completing our Accommodations for Applicants form:
Big Data Developer
Job Description
Be a part of Stefanini!
At Stefanini, we are more than 30,000 geniuses connected across more than 40 countries, co-creating a better future.
Apply now as a Big Data Engineer!
Requirements:
- 3 years of Big Data development experience.
- Experience designing, developing, and operating large-scale data systems running at petabyte scale.
- Experience building real-time data pipelines, enabling streaming analytics, supporting distributed big data, and maintaining machine learning infrastructure.
- Able to interact with engineers, product managers, BI developers, and architects, providing scalable and robust technical solutions.
- Intermediate English
Essential Duties and Responsibilities:
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built.
- Experience with Java and Python to write data pipelines and data processing layers.
- Experience with Airflow and GitHub.
- Experience writing MapReduce jobs.
- Demonstrated expertise in writing complex, highly optimized queries across large data sets.
- Proven working expertise with Big Data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase.
- Highly proficient in SQL.
- Experience with cloud technologies (GCP, Azure).
- Experience with relational models; in-memory data stores desirable (Oracle, Cassandra, Druid).
- Provide and support the implementation and operation of data pipelines and analytical solutions.
- Performance tuning experience with systems working on large data sets.
- Experience with REST API data services (data consumption).
- Retail experience is a huge plus.
What's in it for you?
- Fully remote
- Training Path
- Life insurance
- Punctuality bonus
- Grocery vouchers
- Restaurant vouchers
- Legal benefits + Profit sharing (PTU)
- Learning and Mentoring platforms
- Discounts at language schools
- Gym discount
Big Data Intern Bilingual
Job Description
Big Data Intern, Bilingual - Part-Time
*Seguro Facultativo (student health insurance) required*
Reports to: Product Center Manager
Location: Polanco, near Plaza Carso
Job Type: Part-time Internship
Requirements:
1. Student of Data Science, Data Engineering, Data Analysis, Finance or related field.
2. Basic knowledge of Data Analysis or related.
3. Communication and teamwork skills.
What We Offer:
1. Opportunity to gain experience in a creative and dynamic environment.
2. Chance to develop Data Analysis skills and knowledge.
3. Teamwork with experienced professionals in the field.
How to Apply:
If you are interested in this position, please send your updated CV in English.