About

We're building the fastest path to Solana state.

Catscope started with a simple observation: on Solana, the trader who sees state change first wins. So we built the infrastructure that lets you see it first — from inside the validator itself.

What we believe

Four things we won't compromise on.

Speed

Latency is the product.

Every architectural decision starts with one question: does this get us closer to the next block, or further away?

Trust

Transparent or it doesn't ship.

Auction prices, validator reputation, and policy rules are all on-chain. No backroom deals. No hidden middleware.

Open

Aligned with Solana, not parasitic on it.

We don't fork the validator. We don't gate Solana state. We make the existing infrastructure more valuable for everyone on it.

Edge

Better tech wins.

In a market where microseconds define advantage, we believe the right answer is simply being faster, more accurate, and easier to integrate than every other option.

Mission

A future where blockchain is the default backbone.

We believe in a future where blockchain is not a sidebar — it's the substrate. Real-time, decentralized, programmable infrastructure beats every centralized alternative if the latency is right. Our job is to make sure the latency is right.

The team

Small team. High signal.

We've shipped peer-to-peer infrastructure before. We're shipping it again — this time for Solana.


Joel De Jesus

Co-founder · Engineering

Deep peer-to-peer background — Bitcoin, Tor, decentralized identity. Built Bitcoin-based identity and asset management for businesses in Japan. Now leads Catscope's runtime and marketplace protocol.

LinkedIn →

Naomi Honda

Co-founder · BD & Growth

Leads partnerships with validators and trading firms across the Solana ecosystem. Background in sociology and decentralized infrastructure. Drives Catscope's adoption and validator network growth.

LinkedIn →

Get in touch

We're hiring. We're partnering. We're listening.

If you run validator infrastructure, deploy trading capital, or want to work on protocol-level latency problems — we want to talk.