Why Selling Hope Is So Profitable Online and How People Are Misled

The internet has made it easier than ever to buy solutions to personal problems. From improving health and mindset to finding love, success or purpose, there is no shortage of people online offering answers.

Advice and guidance are not the problem. Many people find real comfort in reflection, coaching and shared experience.

The issue arises when hope is packaged as certainty and authority is implied without being earned, particularly when misleading claims or exaggerated qualifications are used to secure financial gain. It is not always obvious when claims cross that line.

Selling hope has become highly profitable online, and increasingly, ethical lines are being crossed.

The business of hope and certainty

Hope is powerful. It helps people cope with uncertainty, illness, grief and change. But online, hope is often monetised by turning possibilities into promises.

This is where problems begin.

Psychics offering guidance or symbolic insight are not, for example, inherently acting unethically. What crosses the line is when listings promise specific outcomes, such as making someone fall in love with you, curing an illness, removing a curse or guaranteeing success.

Even when disclaimers, often buried at the bottom of listings, say “for entertainment only”, the titles, imagery and language often imply certainty if money is exchanged.

That gap between the disclaimer and the marketing is where false hope thrives.

When spiritual services become exploitation

Online platforms have normalised selling spells, psychic readings and spiritual interventions. Many buyers treat these as reflective or symbolic. Some understand they are for fun and entertainment.

However, many others, including vulnerable people, believe the outcomes are real and expect the promised results to materialise.

The ethical issue is not belief itself, but how belief is used.

A clear example of how this can escalate is the Maria Duval psychic scam. Maria Duval was presented as a famous psychic who sent personalised letters claiming unique insight into people’s lives. Recipients were warned of looming danger or promised life-changing fortune, but only if they sent money quickly.

Many victims paid repeatedly, forming emotional attachments to a persona that did not truly exist. In reality, Maria Duval was a fabricated identity used by an organised operation. The scam extracted millions of dollars worldwide.

In 2018, those behind the operation were prosecuted in the United States, receiving prison sentences and large fines. The case was significant because it demonstrated that belief-based services can move beyond individual sellers into systematic, industrialised exploitation.

False authority in wellness spaces beyond spirituality

The same pattern appears outside spiritual services, particularly in wellness, leadership and self-development spaces. Here, the product is not belief in magic, but belief in expertise.

A recent example is the Tomasz Drybala scam.

Drybala built a public profile as an inspirational ultra-endurance athlete, claiming to run extreme distances for charity, attract sponsors and raise large sums for organisations such as UNICEF. He was booked and paid for speaking engagements by individuals and organisations who believed his story.

Investigations later revealed serious inconsistencies. There was no evidence to support the fundraising claims he made – the charities he was supposedly raising money for had never received any donations.

He was also reported to have used Photoshopped images of himself, and a testimonial on his website attributed to a well-known industry figure was found to have been fabricated.

People who paid him for speaking engagements reported cancellations without refunds. Others hired by him for PR, marketing and support work say they were never paid, leaving them financially out of pocket.

After his endurance claims were publicly challenged, Drybala reportedly reinvented himself again, this time presenting as a neuroscience and leadership expert. He began selling expensive programmes while falsifying links with universities, creating the impression of academic backing without evidence that such affiliations existed.

You can read the Daily Mail investigations here.

This matters because it shows how authority can be rebuilt and resold, even after previous claims are exposed.

The power of professional titles

Another well-known UK example is Gillian McKeith, a nutritionist and television personality whose use of the title “Dr” caused widespread controversy.

Gillian McKeith is not a medical doctor. While she holds training and qualifications in nutrition, including a University of Edinburgh degree, her PhD came from an unaccredited distance learning institution. In 2007, the Advertising Standards Authority ruled that her use of the “Dr” title in advertising was misleading.

She agreed to stop using the title in promotional materials.

The Guardian covered the case here.

The case illustrates how titles alone create trust, even when no medical or regulated qualification exists. Many people reasonably assumed “Dr” implied medical expertise. Regulators agreed that this assumption mattered.

Authority cues and why people believe them

These cases work because of what psychologists call authority cues. These are signals that make someone appear credible without needing evidence.

Common authority cues online include:

  • professional-looking websites and branding
  • references to universities or institutions
  • testimonials and social proof
  • confident language and certainty
  • impressive titles that sound academic or scientific

Most people do not have the time or tools to verify every claim. When authority cues are combined with emotional need, scrutiny drops.

Why this problem will likely worsen with AI

This issue is likely to accelerate.

AI tools already allow people to generate convincing websites, credentials, testimonials and expert language in minutes. The next stage includes realistic fake videos, images and voice cloning, making it harder to tell whether someone is who they claim to be.

AI-generated personas can appear consistent, knowledgeable and credible across multiple platforms. As these tools become cheaper and more accessible, false authority will be easier to create and harder to challenge.

This makes critical thinking and verification more important than ever.

How to protect yourself

Scepticism does not mean cynicism. It means pausing.

Helpful steps include:

  • checking whether qualifications are recognised and regulated
  • being wary of guaranteed outcomes and certainty
  • separating lived experience from professional expertise
  • looking for independent criticism as well as praise
  • remembering that charisma and confidence are not evidence

If someone is selling certainty, urgency or fear, it is worth stepping back.

Final thoughts

Selling hope online is profitable because hope is human. But when certainty is promised, authority is exaggerated, and vulnerability is exploited, ethical lines are crossed.

From spells that guarantee love, to wellness figures implying medical authority, to self-styled experts inventing academic credibility, the pattern is the same. Trust is monetised before evidence is offered.

As AI makes fake expertise easier to create and harder to spot, learning to question claims, credentials and promises will become essential. Protecting your well-being and your money increasingly depends on understanding how hope is being sold, and knowing exactly what you are paying for and from whom.

Hope should support people, not exploit them.

