Beate van Loo-Born | CFO & Board Member | Sustainability | Finance & Data | Keynote Speaker & Mentor | Author | Triathlete | PhD Candidate.

Over the last decade or so, data has been called "the new oil," seen by many as a versatile resource of seemingly infinite value fueling our economy.

Consequently, data houses like Bloomberg, S&P and Refinitiv have risen to fame, and organizations in all sectors have started to meticulously collect, protect and harvest their own data sets. The profession of the chief data officer (CDO) made a strong appearance on the job market, and with every new computing technology—the latest being AI—more and more of this “new age gold” is being organized, structured, stored and used.

Why is data so special? Because it gives us the power to shape a narrative.

Data’s Impact Across Industries

Data has redefined industries, informed international policy development and personalized shopping experiences. But with every new application comes a choice—whether to use data to empower or to manipulate, to inform or exploit.

Take political data, for example. The 2012 presidential campaigns were some of the first to use social media in an effort to engage voters, build trust and increase participation—then seen as a positive, collaborative instrument. Then just a few years later, the Cambridge Analytica scandal revealed how data could be used manipulatively, as millions of Facebook users’ data was harvested without consent.

Financial services offers another example. Before the 2007-2008 financial crisis, complex products like mortgage-backed securities were often celebrated as data-driven and sophisticated, but the real risks were obscured in pursuit of short-term returns. This contributed to a global financial crisis.

Having worked on implementing financial regulations after the crisis, I saw firsthand the importance of integrity and oversight. Unregulated data use in the pursuit of profit can lead to adverse consequences. Interestingly, after the crisis, the same data was used to stabilize financial markets (e.g., transaction reporting, investor risk profiles). Hence, for finance professionals, balancing profit with transparency and exposing risks with integrity is essential.

In climate action, data has been used for good as well as abused. When presented accurately, climate data has driven agreements like the Paris Agreement. However, some have framed data to downplay global warming, creating "uncertainty" and delaying climate action through misrepresentation.

Consider how personal consumer data is used in marketing. Take Target’s use of purchase data to predict customer needs. It was innovative, but when it revealed a teenage girl’s pregnancy to her family, the perception shifted from “insightful” to “invasive.” For marketing professionals, the challenge is balancing personalization with privacy, respecting individual boundaries even when data offers powerful predictive abilities.

Keeping Up With Advances In Technology

Data professionals today face a rapidly evolving tech landscape where data complexity and innovation are accelerating. They are constantly balancing the pressure to adopt new tools with the need for ethical scrutiny. Rapid advancements, including the following technologies, demand constant vigilance and a commitment to transparency, responsibility and understanding.

• Interconnected systems: The amount of data generated is increasing, and data systems are becoming more interconnected, amplifying both the benefits and the risks of data utilization.

• AI: AI allows for faster analysis and more accurate predictions. Its complexity, however, often makes it a "black box," posing challenges in ensuring fair and truthful outcomes.

• Emerging technologies: Technologies like edge computing and quantum computing introduce new capabilities but also new risks. Edge computing decentralizes data processing, making it more challenging to control, while quantum computing could revolutionize data processing but bring unknown ethical implications.

Regulations And Taking Responsibility

Advancements often outpace regulation, leaving many ethical decisions to individuals and organizations. I think this “regulation gap” means data professionals have a duty to self-regulate.

One might argue that the regulatory process as it exists today (the correction of market behaviors triggered by a crisis and based on lengthy administrative processes) is no longer fit for purpose. Technologies advance too fast for regulatory bodies to be staffed with experts who can create meaningful regulations.

Rather than updating binding legal texts, it might make more sense to provide a guiding principle as a moral compass or a “spirit of the regulation” to help organizations self-regulate and take accountability for assessing what is right or wrong.

I think a good example to follow would be the concept of dual materiality (or double materiality), from the ESG space, where the goals are to:

1. Ensure you are not harming the environment (e.g., pollution, CO2 emissions, modern slavery)

2. Ensure that the environment is not harming you (e.g., floods, droughts, social and economic instability)

With this approach, business leaders can balance the good with the bad, and their own interests with their stakeholders' interests. In the end, the result needs to be net positive (maximum economic benefit for the organization with minimum negative impact on the stakeholder base).

There are some examples of what this might look like in the data space:

• Transparent AI: There’s a growing call for transparent, accountable AI models to prevent harm.

• Ethical data governance: Ethical standards are emerging across industries, focusing on aligning data practices with societal values.

• Data for social good: When used responsibly, data can drive positive outcomes, from climate resilience to healthcare improvements.

We've been given revolutionary technology—including AI, quantum computing and edge computing—that can self-improve in ways that surpass our ability to understand how it develops. While it is tempting to simply believe whatever an LLM produces, for example, we need to realize that a machine still lacks common sense, a very human trait we often take for granted.

We can never delegate responsibility fully to technology, which means we need to at least make sense of the outcome (check for bias, ensure transparency, challenge outliers) and use it responsibly when shaping opinions, drawing conclusions about shopping preferences, sharing private information and more.

Forbes Business Council is the foremost growth and networking organization for business owners and leaders.