FCA consumer understanding: good practices and areas for improvement

The FCA’s latest report, published on 13 March 2026, provides valuable insight into how effectively firms are meeting the consumer understanding outcome of the Consumer Duty.

The report aims to help firms meet the requirements by clarifying the FCA’s expectations and providing examples of good practice alongside areas for improvement. In September 2025 the FCA surveyed 38 firms of different sizes across the financial services sector. It also considered the 2024 Financial Lives Survey, which found issues with people’s understanding of products, low confidence in everyday numeracy and the withdrawal of preferred communication channels.

The findings apply to all regulated firms that provide products or services to customers, particularly those who communicate with them. The FCA expects firms to use the insights to assess their approach and identify improvements.

The FCA highlighted the following good practices and steps that firms can take.

Use customer insights

  • Analysing insights from multiple sources and reviewing evidence regularly to make meaningful improvements.

Test comms with real customers

  • Testing both before and after changes, using various tools and verifying improvements.

Communicate clearly, simply and accessibly

  • Using plain language, clear structure, visual hierarchy and layered content.
  • Supporting those who need alternative or accessible formats.

Design journeys and tools that support understanding

  • Incorporating calculators, videos, walkthroughs and summaries.
  • Testing these tools to make sure they genuinely help customers and are refined based on user evidence.

Support vulnerable customers

  • Identifying needs early, adapting communications and embedding vulnerability considerations into governance, training and testing.

Use clear, fair and balanced financial promotions

  • Keeping promotional content accessible and easy to understand.
  • Testing understanding of key messages, including eligibility criteria and limitations.

Clear governance and oversight

  • Defining governance with clear ownership of consumer understanding and oversight embedded into business processes.
  • Using comprehension-driven Key Performance Indicators (KPIs), not simply compliance, sentiment or activity-related metrics.

Detailed findings

The FCA focused on the support provided to customers across five areas, calling out examples from smaller firms specifically:

MI and testing

Good practice and areas for improvement

Use of customer insights

  • Analysing insights from multiple sources, including call listening, complaints, chat transcripts, website analytics, drop-off data (customers who start but do not complete a process) and surveys.
  • Reviewing evidence regularly and prioritising meaningful improvements e.g. decreased number of clicks to get to an outcome and reduced calls to a helpline.

Test comms with real customers

  • Testing both before and after changes.
  • Using tools such as surveys, comprehension checks, A/B testing, customer callbacks and feedback from frontline interactions.
  • Verifying whether changes improve customer understanding and adapting based on the results.
  • Documenting what changed, why it changed and the impact of each change, to give a clear link between insight and action.
  • Improving journeys and tools, such as calculators, videos, walkthroughs or summary pages, then retesting to validate the impact.

Weak evidence of testing

  • Testing that is superficial, one-off or poorly documented.

Unclear use of MI and insight

  • Failing to use MI to assess consumer understanding and support decisions.
  • Failing to show why MI collected is relevant, how it is reviewed and how it helps identify issues.
  • Relying on sales data or the absence of complaints.

Insufficient testing with different customer groups

  • Not testing comms with people who have accessibility needs, language requirements or lower financial capability.
  • Failing to assess and design comms for the target market.

No follow-up to assess whether changes worked

  • Not checking if changes had worked and poor recordkeeping on what had been changed, why or the impact on customers.


Smaller firms – examples

A firm tested a renewal letter with a small sample of customers, including two with sight impairments, and found that the layout and headings were hard to follow. The firm introduced a large‑print design, a 100‑word summary and clearer next steps. After launch, quick follow‑up calls and a micro‑survey showed better understanding and fewer complaints.

Other examples were calling customers to check understanding, surveying users during digital trials and comparing outcomes before and after updates across different communication channels.

Innovation and communications design

Good practice and areas for improvement

Design communications deliberately

  • Using clear structures, plain language and intuitive layouts.
  • Using layering to give information without overwhelming the customer. For example, by presenting key information upfront with detail layered beneath, signposting and drip-feeding messaging at the right time.
  • Simplifying digital journeys by presenting key messages earlier, improving visual hierarchy and testing design changes.

Simplification beyond reducing word count

  • Organising text in a way that customers understand, e.g. using short summaries, clear definitions, QR‑linked explanations, interactive FAQ dropdowns and clear signposting.

Cosmetic changes that do not address root causes

  • Making surface‑level changes, such as shorter wording, new icons or colour changes, without improving the clarity, sequencing or prominence of key information.

Little or no testing of new designs

  • Not testing with customers or relying on very small, unrepresentative groups.
  • Assuming an absence of complaints means communications are clear.

Failure to adapt communications for different customer needs

  • Presenting the same design and format to all customers without considering vulnerabilities, accessibility needs, language preferences or lower digital confidence.

Overlong documents with limited signposting

  • Continuing to produce long, dense documents without summaries, visual hierarchy or navigational cues.
  • Expecting customers to locate critical information without support.
  • Committing to using jargon-free and intelligible language without evidence of how this has been embedded in comms.

Smaller firms – example

The FCA referred to the use of clear, easy‑to‑read summary sheets that sit on top of lengthy T&Cs.

Prompts, tools and interactive formats

  • Using calculators, walkthroughs and reminders to help customers.
  • Using short videos, interactive diagrams and clickable FAQs.
  • Using real-time prompts to stop customers from making common mistakes such as alerts about potential scams or messages to explain why certain information is needed.
  • Using chatbots and virtual assistants to answer questions instantly or guide customers through a journey step-by-step.  

Designing for accessibility and diverse needs

  • Tailoring designs for customers with low digital confidence, lower financial capability, sensory impairments or different needs.
  • Adapting formats, improving readability and testing with users who face additional challenges e.g. improved colour contrast, alternative formats, meaningful alt‑text, text rewrites and layouts tested for screen readers.

Vulnerability and accessibility

Good practice and areas for improvement

Proactive identification and tailored support

  • Taking care when communicating with vulnerable customers and considering their varied needs, accessibility and digital confidence.
  • Embedding vulnerability assessments at key decision points such as onboarding, renewal and arrears.  
  • Producing timestamped notes and flags visible to frontline staff so they can adjust communications accordingly.

Reactive rather than proactive

  • Having vague or underdeveloped policies and processes for identifying vulnerable consumers.

Reliance on general knowledge with no proactive mechanisms

  • ‘Considering vulnerability’ but failing to provide details on how this translates into practical changes or measurable outcomes.

Limited alternative formats and assisted channels

  • Failing to provide alternative formats (e.g., large print, audio, BSL) or assisted channels for customers with low digital capabilities.

One-off initiatives without governance

  • Making ad-hoc improvements but not embedding them into governance or control frameworks or integrating them into day-to-day operations for consistent and sustained application.

Initiatives without results or evidence of what changed

  • Carrying out testing or introducing new tools without showing outcomes for vulnerable customers.

  • Failing to provide comprehension measures, acceptability targets or scaling of initiatives beyond pilot schemes.
  • Lacking data, weakening auditability and making it difficult to evidence improvements.

Smaller firms – examples

The FCA said that a focus on characteristics, behaviours or communication patterns can help firms to identify the common risks to understanding within their customer base. Whilst firms are still expected to respond to individual needs, this could help them to concentrate resources to give clear and consistent support.

An example was a firm that built prompts into its fact-find/sales process, giving space for customers to disclose their circumstances. It took additional steps to flag suspected or confirmed vulnerabilities and held discussions in monthly meetings to share learnings, refine indicators and support responses.

Accessible ‘tell-us-once’ systems

  • Recording accessibility needs and preferences centrally (such as communication format, language and large print).
  • Applying adjustments consistently across future interactions.

Smaller firms – example

The FCA gave an example of a firm that made a ‘tell us once’ update to its internal system. Staff were prompted to record customers’ communication preferences and any other reasonable adjustments.

The requirements were saved against the customer’s profile. Automation meant that customer requirements were pre-applied to all engagement. Call handlers had reminder pop-ups. Each adjustment showed when it was last reviewed to assist monitoring.

Smaller firms – examples

  • Not having clearly defined processes in place to identify or flag vulnerability.
  • Relying instead on long-standing customer relationships or staff experience in customer service.

Testing with vulnerable cohorts

  • Regularly testing communications with groups of customers that are representative of the intended recipients.  
  • Testing communications with consumers who have characteristics of vulnerability e.g. testing all new product communication with a sample of vulnerable customers, capturing key insights including comprehension scores and measuring these against an internal target for correct recall of key points.
  • Simplifying comms by reordering content to increase visibility of important information, reduce volume and add plain language explanations for complex terms.  
  • Reviewing live chat transcripts regularly to identify unclear wording and vulnerability cues.

Smaller firms – example

The FCA highlighted that firms must do testing that is proportionate to their size and market presence. The FCA gave an example of a firm that prioritised two high impact communications. The firm ran simple tests fortnightly with 3-5 customers drawn from recent contact. It focused on consumers with lower digital confidence and English as a second language. Testing was mostly phone-based with staff asking customers to read back key elements of their product with policy documents in hand. Findings were then used to refine comms, resulting in fewer repeat contacts from customers.

Monitoring outcomes and acting on insights  

  • Monitoring outcomes for vulnerable customers as a distinct MI category.
  • Regularly reviewing call and chat transcripts, top complaint reasons and repeated customer contact, and presenting results at forums where changes are agreed.
  • Monitoring drop-off rates and running A/B tests.

Financial promotions

Good practice and areas for improvement

Accessibility and fair messaging

  • Avoiding jargon, using clear summaries and making sure core messages remain visible across formats such as mobile, email or social media.
  • Carefully designing digital and printed layouts so that important information does not get lost or obscured, even when content is resized or reformatted on different devices.

Monitoring effectiveness and acting on insight

  • Looking at customer questions, complaints and behaviour to spot where promotions might confuse or mislead. Then using this evidence to rewrite, clarify and highlight key information.
  • Testing promotions to confirm understanding e.g. A/B testing, short surveys and asking customers to explain the main message in their own words.

Clear approval and governance processes

  • Using evidence-based approval steps, e.g. readability checks, approved templates, rules and a documented sign-off process.
  • Testing promotions before release and keeping records of how wording and decisions were agreed.

Overemphasis on benefits

  • Highlighting offers upfront while hiding limits or risks.
  • Advertising headline rates without making it clear they are subject to further checks.

Limited or no consumer testing

  • Relying on compliance checks or sales data, without checking that real customers understand.

Inadequate monitoring of outcomes

  • Approving promotions without monitoring understanding, drop-off points or complaints.

Unclear, inaccessible or unbalanced messaging

  • Using dense language, cluttered layouts and inconsistent formatting.
  • Placing risk warnings at the end of long mobile journeys or in small or low contrast formats.
  • Using attention-grabbing adverts to make the product look like an exclusive benefit, misleading customers and encouraging clicks.


Smaller firms – example

The FCA gave an example of a firm that had a straightforward approval process, supported by short checklists and clear sign-off. Staff paid close attention to customers’ questions and complaints, carrying out basic testing by asking a few customers or colleagues to read the promotion and explain the message back. Decisions were recorded consistently without needing complex systems.

Governance and oversight

Good practice and areas for improvement

Clear senior responsibility

  • Assigning explicit responsibility for consumer understanding to the relevant level of seniority.
  • Responsible individuals monitoring MI and making decisions based on data.
  • Having an adaptable approach with consumer understanding at the forefront.
  • Keeping clear records of decision making, regularly reviewing customer feedback and following this up through agreed actions.
  • Keeping communication standards consistent, embedded across teams and maintained.

Effective governance structures

  • Bringing together the right functions including product, operations, customer service, risk and compliance to review insights, escalate risks and track outcomes across all products and channels.
  • Having clear terms of reference, structured agendas and well documented outputs/follow-up items.
  • Having designated Consumer Duty forums that meet regularly to review evidence, track KPIs and make sure issues are escalated promptly.

Embedding consumer understanding into everyday processes

  • Embedding consumer understanding into operations rather than treating it as a standalone compliance task.
  • Training staff to help customers understand information, such as running short surveys or reviewing call transcripts.
  • Simplifying escalation processes.

Unclear accountability

  • Lacking clear responsibility for the consumer understanding outcome or related decisions.
  • Making decisions without using meaningful data or evidence.

Lack of checks for different types of customers

  • Only providing senior decision makers with aggregate data that does not monitor outcomes for different types of customers.

Weak feedback loop

  • Failing to act upon monitoring results to feed information back into governance or use it to improve comms.

Limited use of MI in decision-making

  • Governance committees receiving only compliance, sentiment or activity metrics without comprehension-driven KPIs.


Smaller firms

Good governance can be demonstrated by keeping clear notes of decisions made, checking regularly that decisions are still working and using straightforward logs to track actions. For example, putting a single senior compliance lead in charge of consumer-understanding work.

It is clear that meeting the “clear, fair and not misleading” rule alone is not enough. The publication of the report is a good prompt for firms to take stock and consider what improvements they can make. There is nothing new here, but until firms get the basics right to aid consumer understanding, the FCA will continue its work in this area.

Get in touch 

To speak to a member of our Finance team and learn more about how your firm can apply the FCA’s insights, get in touch via the form below.

Contact us

Key contacts