Building Real-Time Donation Tracking at Rippl: A Journey Through QuickNode Streams and SSE for Data Filtering

At Rippl, our mission is to revolutionize charitable giving by making the impact of every donation visible. It's about more than tracking transactions—it's about fostering trust and empowering both donors and organizations with real-time insights. We believe transparency is the key to transformative giving, where every act of generosity contributes to a larger movement.

To bring this vision to life, we chose to build our platform on the Solana blockchain. Solana's speed and low fees aligned perfectly with our goals, but that same throughput presented a unique challenge: how could we efficiently track thousands of donations across multiple campaigns in real time without exceeding budget constraints or overwhelming our system?

The Technical Challenge: Drinking from the Solana Firehose

The heart of the issue was data volume. The numbers were staggering:

  • ~4.5MB per Solana block

  • 2-3 blocks generated per second

  • ~540-810MB of raw data to process per minute

  • Risks of frontend memory leaks and high bandwidth costs

Faced with this deluge of data, traditional WebSocket approaches simply wouldn't cut it. We needed a way to efficiently stream only the most relevant data in real-time, ensuring a seamless experience for our users without drowning our frontend in a flood of raw Solana blocks.

Picture this: your donation platform, like ours, is growing. Hundreds of campaigns and thousands of donors, all generating events that need instant reflection in the UI.

Here's a glimpse of the massive, nested data structure we were dealing with in each raw Solana block:

interface RawBlockData {
  blockHeight: number;   // Just one of dozens of fields
  transactions: {        // Each block can have hundreds
    message: {
      instructions: {    // Each transaction can have multiple
        // ... massive nested data structure
      }[];
    };
  }[];
}

When I was still learning to filter this data, I downloaded a single raw block output and it came to almost 20MB. Just opening it nearly crippled VS Code on my M1 machine, and I was only viewing the file, not editing it. You can imagine how that scales.

The stakes couldn't be higher. On a platform like Rippl, every missed donation is a moment of uncertainty for a donor, and every delayed update is a potential loss of engagement and trust. To truly live up to our mission, we needed real-time visibility that could scale.

Solution Architecture: From Firehose to Focused Stream

Enter QuickNode Streams. By leveraging QuickNode's advanced filtering capabilities alongside Server-Sent Events (SSE), we saw a path forward—an elegant solution to our data volume conundrum.

Instead of pushing all block data to our frontend via WebSockets, we built a filtering pipeline that:

  1. Filters data at the source using QuickNode Streams

  2. Processes filtered data through our server

  3. Streams relevant events to clients via SSE

Here's what made SSE our choice over WebSockets:

  • Native browser support

  • Automatic reconnection

  • Lower overhead

  • Perfect for one-way real-time data flow

// Our SSE endpoint setup
const streamManager = EventStreamManager.getInstance();

app.get('/api/donation-stream', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.flushHeaders(); // Open the stream immediately

  // Register this client; the manager writes donation events to it and
  // drops it automatically when the connection closes.
  streamManager.addConnection(res);
});

The Implementation: Building Rippl's Real-Time Donation Feed

Filtering at the Source

At the core of our solution is a custom QuickNode Stream filter that reduces ~4.5MB blocks to lean ~274-byte payloads (yes, bytes; a ~99.99% reduction) by extracting only donation-related instructions:

// Donation stream filter 
const FILTER_CONFIG = {
  programId: "BHhjYYFgpQjUDx4RL7ge923gZeJ3vyQScHBwYDCFSkd7", // Rippl program
  instructionDiscriminators: {
    DONATE: [121, 186, 218, 211, 73, 70, 196, 180]
  }
};

function main(stream) {
  const donations = stream.transactions
    .filter(tx => !tx.meta.err) // Skip failed donations
    .filter(tx => tx.transaction.message.instructions
      .some(ix => matchesDonationInstruction(ix)));

  return donations.map(formatDonationEvent);  
}
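The two helpers referenced above do the real work of matching and shaping. Here's a sketch of the discriminator check; the exact payload shape (ix.programId as a string, ix.data as a byte array) is an assumption about how the stream delivers instructions rather than a guaranteed schema:

// Hypothetical helper: matches an instruction against our program and the
// Anchor-style 8-byte DONATE discriminator. Assumes `ix.programId` is a
// string and `ix.data` is a byte array.
function matchesDonationInstruction(ix) {
  if (ix.programId !== FILTER_CONFIG.programId) return false;
  const discriminator = FILTER_CONFIG.instructionDiscriminators.DONATE;
  return discriminator.every((byte, i) => ix.data[i] === byte);
}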

The filter operates in three layers:

  1. Transaction Validation Layer: Skips failed transactions and checks for program invocation.

  2. Instruction Discrimination Layer: Matches specific instruction types using discriminators.

  3. Data Transformation Layer: Extracts relevant data and formats it for the application.

This multi-layer approach minimizes data transfer while enabling precise monitoring of program-specific events.
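To make the third layer concrete, here's roughly the shape of our formatDonationEvent transformation; the decoding of amount and campaign from instruction data is elided, and the field names simply mirror what the frontend consumes:

// Sketch of the transformation layer: keep only what the UI needs.
// Assumes blockTime is copied in from the enclosing block.
function formatDonationEvent(tx) {
  return {
    signature: tx.transaction.signatures[0], // unique key for the feed
    blockTime: tx.blockTime,
    // amount and campaign are decoded from the DONATE instruction data
    // (borsh-encoded after the 8-byte discriminator); decoding elided here.
  };
}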

Server-Side Event Management

Instead of managing WebSocket connections, we built an EventStreamManager singleton that routes QuickNode Stream data through SSE:

import { Response } from 'express';

// EventStreamManager acts as our pub/sub hub for SSE connections
class EventStreamManager {
  private static instance: EventStreamManager;
  private connections = new Set<Response>();

  private constructor() {}

  // Lazily create the single shared instance
  public static getInstance(): EventStreamManager {
    if (!EventStreamManager.instance) {
      EventStreamManager.instance = new EventStreamManager();
    }
    return EventStreamManager.instance;
  }

  // Push a donation event to every connected client
  public broadcast(event: DonationEvent) {
    this.connections.forEach(res => {
      res.write(`data: ${JSON.stringify(event)}\n\n`);
    });
  }

  // Track a new SSE connection and clean it up on disconnect
  public addConnection(res: Response) {
    this.connections.add(res);
    res.on('close', () => this.connections.delete(res));
  }
}

This centralized event manager simplifies server-side logic and ensures efficient memory usage by tracking active connections.
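On the input side, QuickNode Streams can deliver filtered payloads to a webhook destination. Here's a sketch of what that receiving endpoint might look like; the route path and payload shape are assumptions based on our filter's output:

// Hypothetical webhook endpoint: receives filtered Stream payloads and
// fans them out to every connected SSE client.
app.post('/api/quicknode-webhook', express.json(), (req, res) => {
  const events = req.body; // array of formatted donation events from the filter
  for (const event of events) {
    streamManager.broadcast(event);
  }
  res.sendStatus(200); // acknowledge receipt so the stream doesn't retry
});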

Real-Time UI Updates

The frontend subscribes to our SSE endpoint for instant donation updates:

// DonationFeed.tsx
import { useState, useEffect } from 'react';

const DonationFeed = () => {
  const [donations, setDonations] = useState<Donation[]>([]);

  useEffect(() => {
    const events = new EventSource('/api/donation-stream');

    events.onmessage = (event) => {
      const donation = JSON.parse(event.data);
      setDonations(prev => [donation, ...prev].slice(0, 50));
    };

    return () => events.close();
  }, []);

  return (
    <div className="donation-feed">
      <h2>Live Donations</h2>
      {donations.map(donation => (
        <DonationCard 
          key={donation.signature}
          amount={donation.amount}
          campaign={donation.campaign}
          timestamp={donation.blockTime}
        />
      ))}
    </div>
  );
};

By decoupling data fetching from rendering, we ensure a responsive UI even under high throughput scenarios.
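EventSource reconnects automatically after a dropped connection, but surfacing connection state to users is worth a few extra lines. A minimal sketch of how that could look; the hook below is illustrative, not part of our production code:

// Hypothetical hook: track whether the SSE stream is currently connected.
import { useEffect, useState } from 'react';

export function useStreamStatus(url: string) {
  const [connected, setConnected] = useState(false);

  useEffect(() => {
    const events = new EventSource(url);
    events.onopen = () => setConnected(true);
    events.onerror = () => setConnected(false); // EventSource retries on its own
    return () => events.close();
  }, [url]);

  return connected; // e.g. drive a "live" badge in the feed header
}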

Real-World Impact: The Rippl Effect in Action

While the technical achievements are impressive, the true measure of success lies in the real-world impact our QuickNode Streams integration enabled. By providing a real-time window into the flow of donations, we didn't just build a more efficient data pipeline—we fundamentally transformed the giving experience for donors and organizations alike.

Consider a recent emergency relief campaign powered by Rippl.

As news of the crisis spreads, a wave of generosity sweeps the globe.

Thousands of individuals from all walks of life rush to contribute what they can. In the past, these donors would have had to take it on faith that their funds reached the intended destination. They might have received a thank-you email days or weeks later, but the immediate impact of their gift would be lost in a black box of batch processing and delayed updates.

With our QuickNode Streams integration in place, the experience is radically different.

The moment a donor confirms their transaction, they see their contribution reflected in the campaign total. A real-time feed of giving activity creates a palpable sense of momentum and collective purpose.

For the organizations on the ground, these instant notifications of support provide not just practical value in the form of available funds, but also a much-needed moral boost. Seeing the world rally behind their cause in real-time is a powerful antidote to the isolation and despair that can set in during a crisis.

During the peak of this campaign, Rippl processed over 1,200 donations per hour. Thanks to the efficiency of our QuickNode filtering, we were able to provide instantaneous feedback to every one of those donors while transferring just ~2MB of data per hour to our clients. In an unfiltered world, that same real-time experience would have meant pushing tens of gigabytes of raw block data per hour, effectively impossible without industrial-scale infrastructure.

This efficiency translated directly into a better user experience.

Despite the intense throughput, our UI remained snappy and responsive. The average delay between a donation being confirmed on-chain and being reflected in the UI was less than 100ms. For context, that's faster than the blink of an eye. In a very real sense, donors were able to see the impact of their generosity at the speed of thought.

But the benefits extend beyond user experience. By offloading the heavy lifting of data filtering to QuickNode, our team was able to focus on higher-level product and business logic. Instead of sinking endless hours into infrastructure management and optimization, we were able to devote our energy to building features and experiences that directly enhance the giving process. This is the promise of tools like QuickNode Streams—not just to make existing processes more efficient, but to open up entirely new possibilities by abstracting away low-level complexities.

Lessons from the Trenches: Effective Stream Management

Of course, this transformative potential doesn't come without its challenges. Working at the bleeding edge of blockchain data processing, we learned a number of valuable lessons about effective stream management.

  1. One of the key insights was the importance of filtering data as early and aggressively as possible in the pipeline. Every byte that we could discard at the QuickNode layer was a byte that we didn't have to transfer, store, or process downstream. By investing heavily in precise filter configuration, we were able to achieve dramatic reductions in data volume without sacrificing data fidelity.

  2. Another critical realization was the superiority of Server-Sent Events (SSE) over WebSockets for our particular use case of one-way data streaming. While WebSockets are a powerful tool for bi-directional communication, they come with substantial overhead in terms of connection management and server resource consumption. SSE, on the other hand, provided a lightweight, browser-native way to push data to clients with automatic reconnection built in.

  3. But perhaps the most important lesson was the criticality of robust state management and error handling in a real-time streaming context. With thousands of concurrent clients and a constant influx of blockchain data, even a 99.9% success rate meant regular encounters with edge cases and exceptions. By building in extensive logging, implementing graceful connection handling (see the keep-alive sketch below), and developing a comprehensive suite of error mitigation strategies, we were able to ensure a consistently stable and reliable experience for our users.
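One concrete example of that connection handling is a periodic keep-alive. SSE comment lines (prefixed with ':') are ignored by EventSource but stop proxies from timing out idle connections, and writing to a dead socket surfaces the 'close' event that prunes it from the set. The interval and accessor below are illustrative, not our exact production code:

// Hypothetical keep-alive loop; getConnections() is an assumed accessor
// for the manager's internal connection set.
const HEARTBEAT_MS = 15_000;

setInterval(() => {
  for (const res of streamManager.getConnections()) {
    res.write(`: heartbeat\n\n`); // SSE comment line, ignored by EventSource
  }
}, HEARTBEAT_MS);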

Charting the Future: Expanding the Horizons of On-Chain Data

As impactful as our current implementation has been, it only hints at the vast potential of tools like QuickNode Streams. As we look to the future, we see a wealth of opportunities to expand and enhance our data pipeline.

One exciting avenue is the ability to track and correlate data across multiple Solana programs. Many of the most interesting use cases on Solana involve the interaction of several distinct programs, from token swaps to complex DeFi transactions. By configuring our filters to capture these cross-program interactions, we could provide an unprecedented level of visibility into the inner workings of the Solana ecosystem.
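In practice, this could be as simple as generalizing the filter config from a single program to a list; a sketch (the second program ID is a placeholder, not a real deployment):

// Hypothetical multi-program filter config; only the first ID is real.
const FILTER_CONFIG = {
  programs: [
    { id: "BHhjYYFgpQjUDx4RL7ge923gZeJ3vyQScHBwYDCFSkd7", label: "rippl-donations" },
    { id: "<swap-program-id>", label: "token-swaps" }, // placeholder
  ],
};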

Another area ripe for exploration is the integration of off-chain data sources.

While the blockchain provides an authoritative record of on-chain events, much of the context and meaning of these events is determined by off-chain factors. By correlating on-chain transactions with off-chain data feeds, we could paint a much richer and more nuanced picture of the activity on the network.
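As a small example, each on-chain donation event could be enriched with off-chain campaign metadata before it's broadcast; lookupCampaign and its data source here are assumptions, not existing code:

// Hypothetical enrichment step: join on-chain events with off-chain metadata.
async function enrichDonationEvent(event: DonationEvent) {
  const campaign = await lookupCampaign(event.campaign); // e.g. from our own DB
  return {
    ...event,
    campaignName: campaign?.name ?? 'Unknown campaign',
    goalProgress: campaign ? event.amount / campaign.goal : undefined,
  };
}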

Beyond the realm of charitable giving, the architecture we've developed has far-reaching applications across a wide range of industries and use cases. From monitoring the intricate dance of tokens in a DeFi protocol to tracking the provenance of high-value NFTs, the ability to efficiently filter and process real-time blockchain data opens up a world of possibilities.

For blockchain developers and entrepreneurs, the message is clear:

The future of on-chain data is real-time, and tools like QuickNode Streams are the key to unlocking its potential. By abstracting away the complexity of data management and providing a flexible, efficient interface for filtering and processing, these tools empower builders to focus on what they do best: crafting innovative applications and experiences.

As we continue to refine and expand our own data pipeline, we're excited to share our learnings and contribute back to the community. We believe that the rising tide of blockchain data tooling will lift all boats, accelerating the pace of innovation and ushering in a new era of decentralized applications.

If you're a builder looking to dive into the world of real-time blockchain data, we invite you to check out our open-source repos and join the conversation on our Discord. Together, we can chart a course towards a future where the power of on-chain data is accessible to all, and the potential of the blockchain is limited only by our imagination.

Conclusion

QuickNode Streams with custom filtering transforms raw Solana block processing from a resource-heavy burden into an efficient, cost-effective solution. Our implementation achieved:

  • 99.99% data reduction

  • Sub-100ms latency

  • Monthly cost reduced from $450 to $0.027

  • Scalable to 1000 TPS

The modular filter architecture enables:

  • Precise program instruction monitoring

  • Minimal data transfer

  • Real-time transaction detection

  • Easy maintenance and updates

For blockchain developers, this approach opens possibilities for efficient program monitoring while maintaining reasonable infrastructure costs. Want to see it in action? Check out [repo link] - the code's ready to run. Just plug in your QuickNode endpoint and program details.

Next steps: Implement additional filters, expand test coverage, add monitoring dashboards.

Questions? Comment down below!