Interview with Laura Marqués

I am Head of Corporate Data & Analytics at Adevinta. I am passionate about people, leadership, and the transformative impact of data. With a strong background in various industries and a vocation as a mathematician, my mission is to empower individuals, foster a data-driven culture, and guide strategic processes through Decision Intelligence. Dynamism, energy, and curiosity are at my core. #DataLover

Let’s begin this conversation in 1999 when you decided to study Mathematics at the University of Barcelona. Did you have a clear sense of vocation back then?

Yes, absolutely. Mathematics has always been a passion for me. From a young age, I loved solving problems—it felt like playing a game. I had a natural aptitude for logical thinking, analytical ability, and spatial reasoning, which brought me a unique enthusiasm. I eagerly awaited math class every day, and when I discovered scientific calculators, it was love at first sight. I knew this was my path because it made me happy, and I felt completely in my element.

After earning your degree, you began working as a programmer while also completing a Master’s in Applied Economics. Tell me more about the types of projects you were involved in during this early stage of your career.

When I graduated in Mathematics, I wanted to move from exploring abstract concepts and complex theories to something more concrete and applied. That transition shaped my early career. At that time, math graduates were highly valued in programming because, even if we didn't know specific programming languages, our training in logic allowed us to adapt quickly to any of them.

I started in a consulting firm, developing applications for La Caixa, specifically for investment funds. My role involved creating tools that facilitated processes like buying, selling, accounting, and functionalities for ATMs. It was an enriching experience that allowed me to apply my knowledge in a practical and business-oriented context.

Between 2007 and 2009, you pursued a Master’s in Statistics and Operations Research at the Polytechnic University of Catalonia, with a thesis on Bayesian Hierarchical Models for Wind Estimation. Could you share more details about this thesis?

My thesis was an incredibly enriching and enjoyable experience. It explored Bayesian statistics applied to wind prediction in complex regions like the Strait of Gibraltar. This area presented unique challenges due to its intricate topography and weather patterns, making it a fascinating problem to tackle from a mathematical and statistical perspective.

In short, Bayesian statistics allow us to incorporate prior information into analyses, making probability estimates more dynamic and adaptable as new information becomes available. This flexibility makes it particularly useful for complex, changing environments like wind patterns.

At that time, Bayesian methods posed significant technical challenges due to their computational intensity. For my thesis, I developed algorithms in C, C++, and Matlab and began working with R, which was already proving decisive for advanced statistical problems. My work focused on hierarchical Bayesian models, which are especially useful when there are multiple levels of variability or uncertainty, as with wind patterns. These models enabled me to predict wind speed and direction more accurately, directly impacting renewable energy planning, such as optimizing wind energy storage or usage.
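The core Bayesian idea described above, updating a prior belief as new observations arrive, can be sketched with a toy conjugate normal-normal model for mean wind speed. All numbers and the simplified single-level model here are illustrative, not drawn from the thesis itself:

```python
# Toy conjugate normal-normal Bayesian update for mean wind speed.
# Prior belief: wind ~ Normal(mu0, tau0_sq); observations carry known
# measurement noise sigma_sq. Numbers are purely illustrative.

def update_normal(mu0, tau0_sq, observations, sigma_sq):
    """Return posterior mean and variance for the mean wind speed."""
    n = len(observations)
    xbar = sum(observations) / n
    # Precisions (inverse variances) add; data pulls the mean toward xbar.
    post_var = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    post_mean = post_var * (mu0 / tau0_sq + n * xbar / sigma_sq)
    return post_mean, post_var

# Prior: mean wind speed 8 m/s, fairly uncertain (variance 4).
mu, var = update_normal(8.0, 4.0, [10.2, 9.8, 10.5], 1.0)
# Posterior mean lies between the prior and the data; variance shrinks.
```

A hierarchical model stacks several such layers, e.g. letting each measurement station have its own mean drawn from a regional distribution, so uncertainty is shared across levels.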

This experience solidified my passion for applying mathematical and statistical concepts to real-world problems with tangible impacts.

In 2009, you started at CENIT. For those unfamiliar, could you briefly describe the center’s mission?

CENIT, the Center for Innovation in Transport at the Polytechnic University of Catalonia, focuses on researching and solving transport and logistics challenges using scientific approaches. Its work spans urban mobility, operational optimization, and technological applications, collaborating with public and private sectors to promote more efficient and sustainable transportation.

What major projects did you undertake there, and what kind of data were you working with?

At CENIT, I participated in key projects such as dynamic origin-destination matrix estimation and travel time prediction using emerging technologies like Bluetooth and Wi-Fi. These projects involved applying the Kalman filter to analyze mobility patterns and optimize real-time traffic models, notably on corridors and highways.

We worked with data from detection technologies such as mobile devices, license plate recognition, and sensors, evaluating their quality and applicability in advanced traffic management systems (ATIS/ATMS).
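The Kalman filter mentioned above can be illustrated with a minimal scalar version for a corridor travel-time estimate. The random-walk state model and all values are illustrative assumptions, not CENIT's actual implementation:

```python
# Minimal scalar Kalman filter for a noisy travel-time estimate.
# State: current corridor travel time, modeled as a random walk;
# measurements could come from Bluetooth re-identification.
# All values are illustrative.

def kalman_step(x, p, z, q, r):
    """One predict + update step.
    x, p: prior state estimate and its variance
    z: new measurement; q: process noise; r: measurement noise."""
    # Predict: under a random walk the mean stays put, uncertainty grows.
    p = p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

x, p = 300.0, 100.0                # initial guess: 300 s, high uncertainty
for z in [310.0, 305.0, 320.0]:    # noisy travel-time measurements (s)
    x, p = kalman_step(x, p, z, q=4.0, r=25.0)
# x tracks the measurements; p shrinks as evidence accumulates.
```

Real traffic models extend this to vector states (multiple links, origin-destination flows), but the predict/update loop is the same.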

 

What tools were you using at the time? The stack must have been very different from what we use today.

We mainly worked with C and C++, standard tools for implementing complex algorithms and handling intensive calculations. While we began experimenting with R for proofs of concept, its adoption in large projects was still limited, and Python didn’t yet have the presence it does today.

A notable challenge at the time was the gap between disciplines: statisticians often lacked advanced programming skills, and programmers didn’t always grasp the statistical complexities. This made hybrid profiles, with expertise in both areas, especially valuable. My experience at this intersection was crucial to my career’s evolution.

 

In 2011, you joined Caprabo’s Market Intelligence team, where you worked with the great Pier Paolo Rossi. Let’s talk about the projects and tools you used there.

I had the privilege of working under Pier Paolo Rossi's leadership, and over time I've come to appreciate his excellent work at Caprabo even more. It was an exciting period during which we focused on improving commercial efficiency and fostering customer loyalty through data analysis from the Eroski-Caprabo customer card. We designed data-driven strategies for every stage of the customer lifecycle—from acquisition to retention and reactivation—with the goal of increasing market share and making sound strategic decisions.

We primarily used SAS, the leading tool for implementing machine learning models at the time. While its pre-built algorithms were somewhat restrictive, its environment was robust for modeling and automation. We also tested concepts in R, but integrating the two was impossible, so we had to work within SAS’s capabilities. Python was still emerging as a tool for data analysis and would only gain prominence in the years to come.

This experience helped me connect data modeling and machine learning with clear business objectives, always prioritizing the strategic impact of each project.

 

Later, you worked at Cofidis as a Customer Intelligence Consultant. Tell me about the most important projects you were involved in.

At Cofidis, I developed predictive models and strategies to optimize products for customers financing purchases, acquiring insurance, or using Cofidis cards or the contact center. We implemented models like Next Best Offer/Action (NBO/NBA), analyzing behavioral patterns to personalize offers and improve processes.
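A Next Best Offer step like the one described can be sketched as scoring each eligible offer with a per-offer propensity model and recommending the highest score. The logistic scoring function, offer names, features, and weights below are all hypothetical:

```python
# Hypothetical Next Best Offer (NBO) selection: score each eligible offer
# with a per-offer logistic propensity model and recommend the argmax.
# Offer names, features, and weights are illustrative.

import math

def propensity(features, weights, bias):
    """Logistic model: estimated probability the customer accepts the offer."""
    z = bias + sum(f * w for f, w in zip(features, weights))
    return 1.0 / (1.0 + math.exp(-z))

# Customer features: [recent purchases, card usage rate, tenure in years]
customer = [3.0, 0.6, 2.0]

# One (weights, bias) pair per offer, as if fit on historical responses.
offers = {
    "insurance":    ([0.2, 1.5, 0.1], -1.0),
    "card_upgrade": ([0.1, 2.5, 0.0], -1.5),
    "loan_topup":   ([0.4, 0.5, 0.2], -2.0),
}

scores = {name: propensity(customer, w, b) for name, (w, b) in offers.items()}
next_best = max(scores, key=scores.get)
```

In practice such models also account for eligibility rules, contact fatigue, and expected margin, not just acceptance probability.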

This was a pivotal period for applying machine learning to innovative financial solutions, expanding its use across sales, marketing, and customer service beyond the traditional focus on risk management.

What significant changes would you say the world of data has experienced since then?

The world of data has evolved dramatically. Data literacy has become a strategic priority, empowering teams to make informed decisions. Disciplines like Data Management and Data Governance are now essential to ensure data quality and security.

Machine learning and neural network advances, driven by interdisciplinary collaboration among mathematicians, statisticians, computer scientists, and neuroscientists, have transformed the field. For example, large language models (LLMs) like those behind generative AI have revolutionized text processing and automation.

These techniques are now democratized thanks to open-source tools like Python and R, combined with increased computational power. Building algorithms from scratch is no longer necessary; understanding how to use and apply them strategically has become the key to innovation and decision-making.

In 2017, you joined Adevinta, where you still work today. Tell me about your journey over these seven years.

I joined Adevinta as a Data Analyst in 2017, working on key projects to transform data into actionable insights. Later, I took on a leadership role as Head of Data & Analytics in Real Estate, focusing on user needs, data storytelling, descriptive, predictive, and prescriptive analytics, and forecasting and experimentation.

We leveraged machine learning to develop solutions such as recommendation engines, real estate valuations, a price index, and data visualization tools, significantly improving user experience and strategic decision-making. Together with my colleague Piermacor Milione, I built a platform to deploy models into production, and we formed a multidisciplinary data team, integrating analysts, engineers, and data scientists to address challenges comprehensively.

In your current role, what are the main challenges you face?

As Head of Corporate Data & Analytics, I lead a large multidisciplinary team, managing advanced analytics and data projects across operations, finance, sales, and HR. A key achievement has been implementing a data mesh approach, making B2B data from CRMs, ERPs, and contact centers accessible and usable across the organization. We also provide analytics for the HR department to support strategic decision-making.

A central challenge is closing the business loop with a 360° B2B view, ensuring that data is not only accessible but also delivers tangible value. Implementing disciplines like Decision Intelligence and Embedded Analytics has been crucial to scaling analytics and machine learning, enabling more strategic decision-making across all corporate areas.

What’s your opinion on the emergence of generative AI models like LLMs? Which use cases do you find most interesting?

The emergence of generative AI models like LLMs is transforming traditional sectors and adding strategic value. Thanks to advancements in training, increased computational power, and high-quality data, the immense potential of these models is now within everyone's reach.

In HR, they enable real-time feedback analysis, personalized communications, and employee churn prediction, enhancing talent experiences and strategies. In Operations, they streamline processes like report generation and logistics optimization, achieving greater efficiency.

LLM-based tools have made these capabilities accessible to all, facilitating informed decisions. We are only at the beginning of this revolution, with models poised to keep transforming how we work and make decisions.

In addition to your professional work, you’ve taught in several postgraduate and training programs for years. What can you share about this experience?

I love sharing knowledge, experiences, and my passion for data. Teaching allows me to give back to the community, foster a data-driven mindset, and highlight the importance of data literacy in empowering decision-making. Seeing students apply what they’ve learned to their projects and careers is incredibly gratifying, turning ideas into impactful results.

Teaching also helps me grow, as discussions and questions inspire me to deepen my understanding. A particularly special moment was when some students created a song for me, showing how teaching builds unique connections beyond technical skills.

And as if that wasn’t enough, you also dedicate time to various projects at your children’s school.

I strongly believe in the importance of education and dedicate time to projects at my children’s school. I co-lead the Environment and School Community Commission, driving initiatives like reuse and exchange spaces, efficient water use, plant care, and creating inclusive playgrounds. I’m also part of the extracurricular activities commission, where we’ve digitized management to improve service quality.

I’m particularly committed to inclusion, ensuring schools have the resources to support children with special needs and raising awareness about their importance in the community. These projects aim to create a positive impact within both the school and the wider community.

Finally, let’s talk about the gender gap in STEM. How has this issue evolved since your early career?

While the gender gap in STEM remains significant, there have been positive changes. More initiatives, programs, and visible female role models are now inspiring new generations. However, women are still underrepresented in leadership and technical roles, highlighting the need for continued efforts.

What do you think we should do to improve the current situation?

It’s essential to promote equality from the ground up: encouraging girls to explore STEM fields, creating mentorship programs, showcasing female role models, and ensuring inclusive hiring processes. We must also build work environments that value diversity and provide equitable growth opportunities for all talent, regardless of gender.
