
Built for those who can't afford to get it wrong

Cothon lets international development organizations use LLMs like ChatGPT, Claude, and Gemini securely. Customizable guardrails safeguard privacy, prevent bias and hallucination, and ensure transparency.

Operationalizing AI policy for international development

Ethical rules at the foundation. Our protection layer in the middle. Trusted outputs at the top.

[Cothon workflow graphic]

ChatGPT, Claude, and other LLMs weren’t built for high-accountability aid work

Why Cothon exists

Ad-hoc use lets unverifiable claims, hidden bias, and sensitive data slip into reports and decisions. The result: constant rework and the ever-present risk of breaching duty of care or losing trust with communities and funders.

93% of humanitarian workers use AI tools. Only 8% of their organizations have governance systems in place. 

Reputational Damage

LLMs hallucinate data and misquote sources. One bad statistic in a donor report can undo years of credibility.

Direct Harm to Vulnerable People

Generic LLMs lack contextual safeguards. When working with vulnerable communities, even small inaccuracies can cause real harm.

Missed Opportunity

Organizations aren't standing still: they're building siloed AI tools that don't map to workflows, while staff quietly use ChatGPT anyway. Policy stays on paper.

Cothon AI is a secure interface that safely connects your teams to any leading LLM.

Built for international development work, Cothon combines powerful AI models with configurable ethical guardrails, data transparency, and usage monitoring.
It’s the simplest way for teams to unlock AI’s potential while maintaining full confidence in its integrity.


Transparency and Traceability

Every prompt, response, and decision is visible and auditable, giving you accountability to stakeholders, donors, and the communities you serve.


Adaptable Guardrails

Customize safety protocols to match your organization’s values and risk profile — from privacy requirements to cultural sensitivity and bias mitigation.


Industry-Leading Models, Ethically Configured

Access the world’s most advanced LLMs, like ChatGPT, Claude, and Gemini, in a controlled environment purpose-built for responsible use.

Our team

The team behind Cothon has led climate finance, sensitive data systems, and crisis response programs across four continents.


Will Culhane

Co-Founder, CPO

Knows what to build and why it matters


Max McGrath-Horn

Co-Founder, CEO

Knows how to fund it, partner it, and scale it


Yann Say-Liang-Fat

Founding Engineer

Knows how to build it and make it work

Be the first to experience safe, transparent LLM access for high-accountability organizations.
 

Cothon is in early access. Request an invitation and we'll be in touch.

Get involved

Tell us your reason for outreach (request early access or interest in partnering) and select the sector that best represents your organization.

