In 2020, when the world locked down, Rose Wanyabiti made a choice. She would not wait for the economy to recover—she would take matters into her own hands with a skill that matched her vision. After years of working in law firms and NGOs, she taught herself to make shoes. Not just any shoes, but handcrafted leather footwear. She named her business Raaro Leather, and by 2022, it was official: she was an entrepreneur.
Making shoes by hand was a tedious and time-consuming process. But Wanyabiti had a vision: machines and industrial equipment that could scale her production, meet rising demand, and turn her small workshop in Bungoma into a thriving enterprise. Yet when she was ready to expand, banks shut the door on her dreams.
Rose walked into every bank she could find in Bungoma, armed with her business plan, her track record, and her ambition. One by one, they turned her away. Despite evidence of consistent trade, the formal financial system saw her as a risk rather than an opportunity.
“I walked to all local banks in town but never acquired a loan,” she says, the frustration still fresh.
In the end, her family and friends stepped in and loaned her the money, which she is still repaying.
“Were it not for my family and friends, I would not be here,” she says.

In the same town, Robai Wawire faced the same fate, despite having been in business for 35 years. She started with clothes, then maize and groceries, before settling into the cereals and groceries trade. That longevity has not translated into access to credit. Like Rose, Robai has found it nearly impossible to secure a loan as an individual. Her only business loan came through a women’s group.
“I once got a loan through a women’s chama, but it failed due to the higher interest rates and some members not remitting the loan repayments in time,” she explains.
Today, she relies on mobile money for transactions and mobile loans to boost her business, though her loan limit is low. “The highest limit I get is Ksh1,500.”
Women like Rose and Robai are economically active but invisible to the algorithmic systems used to decide who deserves credit.
Invisible to the machine
As digital tools increasingly shape entrepreneurship, exclusion often begins long before a loan application is submitted. Modern credit systems depend on machine-readable data, yet much of women’s economic activity falls outside what algorithms are designed to recognise.
In Kenya, women own about a third (32 per cent) of licensed micro, small, and medium enterprises (MSMEs), according to data from the Kenya National Bureau of Statistics. While men dominate licensed businesses, women account for more than 60 per cent of unlicensed sole proprietorships; their business activity in the informal sector often goes unrecorded by formal financial systems and falls outside credit-scoring models. As a result, they struggle to access capital and frequently cite this as the primary reason their businesses fail. In effect, credit-scoring systems are not neutral gatekeepers; they choke an engine of Kenya’s economy, constraining rather than enabling growth.
The digital divide deepens this exclusion. The 2024 Mobile Gender Gap Report by GSMA shows that women are eight per cent less likely than men to own mobile phones, 13 per cent less likely to have a smartphone, and 15 per cent less likely to use mobile internet. These are the tools that underpin digital transactions and algorithmic credit assessment.
Moreover, the 2024 FinAccess Household Survey shows that 47 per cent of women use formal banking products, compared to 59 per cent of men. Modern credit systems increasingly rely on digital data to assess risk, yet many women’s economic activities remain largely invisible to these data-driven financial systems. In an era where algorithms calculate creditworthiness, this data gap becomes a financial sentence: no data means no loan.
Patterns that disadvantage women
“The system is designed for a different kind of user,” explains Lukania Makunda, the market information lead at Financial Sector Deepening (FSD) Kenya. “Women’s transactions are more likely to happen in cash or within chamas. This means they don’t generate the digital footprint that financial algorithms rely on.”
This lack of digital visibility is not interpreted as neutrality. Instead, it is read as risk. Sharon Juma, a data analyst at FSD, explains that automated lending systems reward speed, frequency and volume in digital borrowing behaviour, patterns that often disadvantage women.
“It’s not just about access; it’s about design. Women are more risk-aware. They tend to be more cautious. They want to understand the loan terms before they click ‘accept’. But in the fast-paced world of digital lending, hesitation or low borrowing frequency is often misclassified as inactivity or unreliability.”
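The effect Juma describes is easy to illustrate. Below is a toy scoring rule, invented purely for illustration (none of the weights, thresholds or function names come from any real lender), that rewards borrowing frequency and speed of uptake:

```python
# Toy illustration of the design issue Juma describes: a scorer that rewards
# borrowing frequency and speed of uptake will mark a cautious borrower as
# "inactive" even if she repays everything. All numbers are invented.
def activity_score(loans_taken_per_year, avg_minutes_to_accept, repayment_rate):
    # Frequency and speed dominate; careful reading of terms lowers the score.
    frequency = min(loans_taken_per_year / 12, 1)
    speed = max(0, 1 - avg_minutes_to_accept / 60)
    return round(100 * (0.5 * frequency + 0.3 * speed + 0.2 * repayment_rate))

print(activity_score(12, 2, 0.95))   # frequent, fast borrower: scores about 98
print(activity_score(2, 45, 1.00))   # cautious borrower who always repays: about 36
```

The frequent, fast borrower scores far higher than the cautious one who repays everything, which is exactly the misclassification Juma warns about.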
Swapneel Mehta, an AI ethics and bias researcher, says this misreading is structural.
“No phone means no data, and no data means no loan,” he explains. “The problem isn’t that the machine is evil—it’s that the machine is a mirror.”
When algorithms are trained on financial histories dominated by men — land ownership, formal credit, asset control — they learn to associate creditworthiness with male patterns. Mehta points to the Apple Card case, where an algorithm assigned higher credit limits to men than to women who shared the same assets.
“If one of the world’s most sophisticated companies couldn’t stop its algorithm from discriminating, what happens when we deploy similar ‘black box’ models in unregulated markets with even less oversight?” he asks.
“The model doesn’t see gender explicitly. It sees past success. And because men were historically given capital, the algorithm concludes they are safer borrowers,” Mehta adds.
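Mehta’s point about the mirror can be shown with a small sketch, built entirely on invented data rather than any lender’s records: a simple model trained on historical approvals never sees a gender column, yet still scores women lower because proxy features such as a title deed carry the history forward.

```python
# Toy illustration (not any lender's actual model): even with the gender
# column dropped, a scorer trained on historical approvals can reproduce
# gendered outcomes through proxy features such as land title or formal
# credit history.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
gender = rng.integers(0, 2, n)             # 1 = male, 0 = female (never shown to the model)
has_title_deed = rng.binomial(1, np.where(gender == 1, 0.6, 0.1))  # skewed by history
formal_credit = rng.binomial(1, np.where(gender == 1, 0.5, 0.2))
monthly_cashflow = rng.normal(100, 20, n)  # similar real business activity for everyone
# Historical approvals followed collateral, not cash flow
approved = (has_title_deed | formal_credit).astype(int)

X = np.column_stack([has_title_deed, formal_credit, monthly_cashflow])
model = LogisticRegression(max_iter=1000).fit(X, approved)

scores = model.predict_proba(X)[:, 1]
print("mean score, men:  ", scores[gender == 1].mean())
print("mean score, women:", scores[gender == 0].mean())
```

On average, the men in this invented dataset score higher even though the underlying business activity is identical, simply because the training labels followed collateral.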
Bias by design
The bias embedded in AI-driven lending systems is no longer anecdotal; it is measurable. In 2025, an independent audit conducted by a consortium of African fintech researchers and digital rights organisations examined 10 major credit-scoring algorithms currently used in Kenya, Nigeria, and South Africa. The study, “Double Discrimination: Algorithmic Amplification of Gender Bias in African Fintech Credit Scoring”, set out to test whether automated lending systems treat male- and female-led businesses equally.
To do this, the researchers used synthetic profiles: simulated loan applications with identical business revenues, transaction histories, and credit records, differing only in gender markers such as names and ownership indicators. The results were striking. Across all three countries, women-led small and medium-sized enterprises were consistently flagged as higher risk, receiving lower credit scores or outright rejections despite having the same financial profiles as their male counterparts.
On average, the audit found that women-led SMEs faced a 37 per cent underfunding penalty, with algorithms frequently citing “sector risk” or “network quality” as justification—categories that disproportionately disadvantage women operating in informal or community-based markets.
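The article does not reproduce the audit’s code, but the method it describes, scoring otherwise identical applications that differ only in gender markers, can be sketched roughly as follows. Here `score_applicant` is a placeholder for whichever model is under audit, and the field names and example names are invented:

```python
# A minimal sketch of a counterfactual audit in the spirit the study
# describes: identical applications that differ only in gender markers are
# scored by the same model, and the average score gap is measured.
import copy

def audit_gender_gap(score_applicant, base_profiles):
    """Return the average score difference when only gender markers change."""
    gaps = []
    for profile in base_profiles:
        as_male = copy.deepcopy(profile)
        as_female = copy.deepcopy(profile)
        as_male.update({"applicant_name": "John Otieno", "owner_gender": "M"})
        as_female.update({"applicant_name": "Joan Otieno", "owner_gender": "F"})
        gaps.append(score_applicant(as_male) - score_applicant(as_female))
    return sum(gaps) / len(gaps)
```

A persistent positive gap across many base profiles is the kind of signal the researchers report as a gender penalty.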
For decades, access to credit in Kenya has been shaped by collateral-based lending models that prioritise land ownership and formal assets. In AI-driven financial systems, these historical preferences have not disappeared; they have been encoded. Rather than assessing risk afresh, algorithms are trained on data that reflects decades of gendered exclusion.
“Kenyan banks have long preferred the safety of government bonds over the risk of lending to private businesses,” says Alfred Oduor, a tax and financial consultant. “When they do lend, they rely on historical data that favours men because men have held capital for decades. AI is simply automating this historical laziness.”
Julians Amboko, a financial policy expert, agrees, arguing that modern credit-scoring systems are not neutral innovations but extensions of old financial logic.
“Women were historically barred from inheriting land. When an AI model is trained on that history, it learns that safe borrowers are those with assets that women were systematically denied. It replicates the exclusion with mathematical precision.”
This is why experts argue that the solution does not lie in pushing women to “formalise” or digitise faster, but in redesigning what algorithms value. Women entrepreneurs in Kenya already generate significant economic data through mobile money, consistent cash flow, and long-standing trading relationships.
“Regulators must push for credit scoring that prioritises cash flow over collateral. If a woman is moving millions of shillings through mobile money for her business, that is a stronger predictor of repayment than a title deed she doesn’t own,” Amboko says. “The problem is not a lack of data; it is that algorithms are trained to ignore it. What is treated as ‘informal’ is, in reality, missed intelligence. We need a policy that forces the algorithm to look at the activity, not just the asset.”
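What Amboko proposes is not exotic. A cash-flow-first score could, in principle, look something like the sketch below, where repayment capacity is estimated from mobile-money turnover and its consistency, with no collateral field at all; the weights and the Ksh200,000 cap are assumptions made purely for illustration:

```python
# Illustrative sketch of "cash flow over collateral" as a scoring feature:
# repayment capacity estimated from mobile-money business inflows and their
# consistency. Thresholds and weights are invented for illustration.
from statistics import mean, pstdev

def cashflow_score(monthly_inflows_ksh):
    """Score 0-100 from 12 months of mobile-money business inflows."""
    avg = mean(monthly_inflows_ksh)
    stability = 1 - min(pstdev(monthly_inflows_ksh) / avg, 1) if avg else 0
    volume = min(avg / 200_000, 1)   # saturates at Ksh200,000/month (assumed cap)
    return round(100 * (0.6 * volume + 0.4 * stability))

print(cashflow_score([180_000, 175_000, 190_000, 185_000,
                      170_000, 188_000, 192_000, 181_000,
                      179_000, 186_000, 183_000, 177_000]))
```

The point is the shift in what counts as evidence: steady business activity rather than the assets a borrower was historically denied.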
For years, efforts to tackle women’s economic exclusion, particularly in rural and marginalised communities, have run up against limited access to finance, restrictive structural and social norms, gender-blind investment, and weak policies.
At the policy level, Kenya has adopted the Women Entrepreneurs Finance Initiative (We-Fi Code), aimed at addressing the structural financing gap facing women-owned businesses, estimated at $42 billion across Africa. The code requires financial institutions to collect and analyse sex-disaggregated data and to reassess lending practices that exclude women by design.
“This is not just a commitment on paper,” says Lukania Makunda of FSD Kenya. “It requires senior leadership buy-in. We are asking institutions to look at their own data and see the missed opportunity. When CEOs realise women-owned businesses are a growth market and are profitable, fixing the algorithm becomes a business decision, not a charitable one.”
The We-Fi Code forces institutions to confront long-standing biases embedded in their systems. “It shifts the focus from the land title a woman doesn’t have to the business activity she does. That’s where real inclusion begins,” adds Makunda.
For the technology sector, accountability must also include honesty and transparency. AI ethics researcher Swapneel Mehta argues that financial algorithms should be accompanied by clear disclosures, often referred to as model cards, that explain what data a model was trained on and what limitations it carries.
“If a model has never been trained on successful women entrepreneurs, regulators need to know that before it is deployed,” Mehta says. “We should not be using financial products without understanding their bias. Transparency is not optional when algorithms are making life-altering decisions.”
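A model card need not be complicated. In its simplest form it is a structured disclosure along the lines of the sketch below; every field is illustrative, not drawn from any real lender’s documentation:

```python
# A minimal sketch of the kind of disclosure Mehta describes, loosely
# modelled on "model cards". All values are hypothetical examples.
credit_model_card = {
    "model": "sme-credit-scorer (hypothetical)",
    "training_data": "2015-2023 loan outcomes from formal-sector borrowers",
    "known_gaps": [
        "Informal and cash-based businesses under-represented",
        "Few repayment histories from women-led SMEs",
    ],
    "known_limitations": [
        "Collateral-linked features dominate; cash-flow signals are weak",
        "Scores not validated on chama or group-lending records",
    ],
    "intended_use": "Decision support only; human review required for declines",
}
```

The value lies in forcing the disclosure: a regulator reading the “known gaps” field would see at a glance whom the model has never learnt to score.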
Ultimately, experts agree that as long as banks can generate easy profit from government securities and bonds, there will be little incentive to invest more in inclusive lending models. Without regulatory pressure to prioritise cash flow over collateral, AI will continue to reward historical privilege rather than real economic activity.
For Rose Wanyabiti and Robai Wawire, the debate over algorithms and credit scoring is something distant, even abstract. They do not speak the language of data or artificial intelligence, but they live with its consequences. After years of rejection, both women say they no longer see borrowing from financial institutions as an option.
“I don’t think I will ever go for a loan,” they say.
Today, Rose still makes shoes by hand in her workshop in Bungoma. The machines she once imagined—industrial cutters, polishers, and moulders that could scale her business—remain out of reach. Not because her business failed or because she lacked discipline or ambition, but because the systems designed to fund growth never learnt how to see her. As Kenya embraces artificial intelligence, the question is no longer whether the technology works but who it works for and who it leaves behind. Harnessing AI for women’s economic empowerment is not a matter of inclusion for its own sake; it is a test of whether innovation can correct inequality or whether it simply automates it.
The future of economic growth depends on AI systems that recognise women not as risky, but as entrepreneurs and economic drivers whose data, labour, and potential already exist, even if the algorithm refuses to acknowledge them.
This article was produced as part of the Gender+AI Reporting Fellowship, with support from the Africa Women’s Journalism Project (AWJP) in partnership with DW Akademie. The journalist used AI tools as research aids to review and summarise policy and research documents and extract statistics. All interviews, analysis, editorial decisions and final wording were done by the reporter, in line with the Lake Region Bulletin’s editorial standards.
