Insights & optimisations in data-heavy systems

Building an Automated SEC Securities Lending Reporting System from Scratch

Financial Institution | Regulatory Compliance | C#/.NET & SQL Server

The Challenge

A financial institution needed to comply with the SEC's (Securities and Exchange Commission) new securities lending reporting requirements, with submissions made to FINRA (Financial Industry Regulatory Authority). The institution was already processing Securities Financing Transactions Regulation (SFTR) reports but faced several critical challenges:

  • No existing system for SEC-specific reporting workflows
  • Complex regulatory requirements with multiple report types (New, Modification, Termination, Cancellation, Correction, Pre-Existing, Delete)
  • Tight compliance deadlines requiring reports to be submitted within T+1 (by 1:00 AM on the next business day)
  • Need to process both current and historical transactions differently based on FINRA start dates
  • Complex business logic requiring 20+ different reporting scenarios based on transaction lifecycle and state changes
  • Integration with existing SFTR processing without disrupting current operations
  • File generation and automated transmission to FINRA via SFTP, as BZip2-compressed JSON
  • High data volumes requiring efficient batch processing (5,000-10,000 transactions daily)

The institution had existing SFTR reporting infrastructure but needed a completely new parallel system for SEC reporting that could leverage shared data while maintaining independence.

Our Approach

We designed and implemented a comprehensive SEC reporting automation system integrated with the existing SFTR infrastructure. The solution was built incrementally over several months with a focus on reliability, compliance, and maintainability.

Phase 1: Database Architecture & Data Model

  • Designed normalized database schema with dedicated SEC tables (SECTradeData, SECEventProcess, SECEventReport)
  • Created relationship links to existing SFTR enrichment data to avoid duplication
  • Implemented temporal tracking for transaction lifecycle states
  • Built stored procedures using table-valued parameters for efficient batch processing
  • Established proper indexing strategy for high-volume query performance
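The table-valued-parameter approach described above can be sketched in C#. The type name `dbo.SECTradeDataTvp` and the column layout are illustrative assumptions, not the project's actual schema:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Hypothetical sketch: shape a batch of trades as a table-valued parameter.
// The DataTable layout must match a SQL Server table type such as:
//   CREATE TYPE dbo.SECTradeDataTvp AS TABLE
//       (TradeId BIGINT, EventType NVARCHAR(20), EventDate DATETIME2);
static DataTable BuildTradeBatch(IEnumerable<(long Id, string EventType, DateTime Date)> trades)
{
    var table = new DataTable();
    table.Columns.Add("TradeId", typeof(long));
    table.Columns.Add("EventType", typeof(string));
    table.Columns.Add("EventDate", typeof(DateTime));
    foreach (var (id, eventType, date) in trades)
        table.Rows.Add(id, eventType, date);
    return table;
}
```

The whole batch then travels to a stored procedure in a single round trip, for example via Dapper's `AsTableValuedParameter("dbo.SECTradeDataTvp")`, rather than one INSERT per transaction.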

Phase 2: Core Processing Engine

  • Developed C# object model mapping to database schema using Dapper ORM
  • Implemented event-driven architecture for processing securities lending transactions
  • Created comprehensive reporting logic engine handling 20+ distinct scenarios (new loans, pre-existing modifications, terminations, counterparty changes)
  • Built MD5 hash comparison for detecting actual data changes vs. redundant updates
  • Integrated collateral and margin processing logic
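The hash-based change detection can be sketched as follows; the field list and helper names are illustrative, not the project's actual model:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Illustrative sketch: hash the report-relevant fields of a trade and compare
// against the hash stored from the previous run. Only a changed hash triggers
// a new report; an identical hash means the update was redundant.
static string ComputeTradeHash(params object[] reportFields)
{
    // Stable field order matters: the same trade must always hash identically.
    var payload = string.Join("|", reportFields);
    var bytes = MD5.HashData(Encoding.UTF8.GetBytes(payload));
    return Convert.ToHexString(bytes);
}

static bool HasMaterialChange(string? storedHash, string newHash) =>
    !string.Equals(storedHash, newHash, StringComparison.OrdinalIgnoreCase);
```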

Phase 3: Intelligent Report Type Determination

  • Developed sophisticated decision engine mapping SFTR action types to FINRA report types
  • Implemented business rule overrides based on transaction characteristics (pre/post-FINRA start date, counterparty changes, termination logic)
  • Created validation framework ensuring report accuracy before submission
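A heavily simplified sketch of that decision engine is shown below. The real engine covered 20+ scenarios with overrides; this only illustrates the shape of the mapping from SFTR action types (NEWT, MODI, ETRM, CORR, EROR) to FINRA report types, including the pre-FINRA-start-date rule:

```csharp
using System;

// Illustrative sketch only: maps an SFTR action type plus the
// pre/post-FINRA-start-date flag to a FINRA report type.
static string DetermineReportType(string sftrActionType, bool openedBeforeFinraStart) =>
    (sftrActionType, openedBeforeFinraStart) switch
    {
        ("NEWT", true)  => "PreExisting",   // loan opened before FINRA go-live
        ("NEWT", false) => "New",
        ("MODI", _)     => "Modification",
        ("ETRM", _)     => "Termination",
        ("CORR", _)     => "Correction",
        ("EROR", _)     => "Cancellation",
        _               => throw new ArgumentOutOfRangeException(nameof(sftrActionType))
    };
```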

Phase 4: JSON File Generation & Transmission

  • Built comprehensive data mapping service transforming database records to FINRA JSON format
  • Implemented multiple report type models (SECNewEvent, SECPreExistingEvent, SECTerminationEvent, etc.)
  • Created file generation service with proper FINRA naming conventions
  • Integrated BZip2 compression for file size optimization
  • Developed SFTP transmission service with error handling and retry logic
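The serialisation and naming steps can be sketched as below. The camelCase JSON policy and the file-name pattern are assumptions for illustration; the actual FINRA schema and naming convention are more involved:

```csharp
using System;
using System.Text.Json;

// Illustrative sketch: serialise a report event to JSON with camelCase
// property names, as a stand-in for the real FINRA schema mapping.
static string ToReportJson(object reportEvent) =>
    JsonSerializer.Serialize(reportEvent,
        new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase });

// Hypothetical naming pattern: participant id, report type tag, business
// date, and a per-day sequence number.
static string BuildFileName(string participantId, DateTime businessDate, int sequence) =>
    $"{participantId}_SLT_{businessDate:yyyyMMdd}_{sequence:D3}.json";

// In production the JSON payload was BZip2-compressed before SFTP upload
// (e.g. via SharpZipLib's BZip2OutputStream); omitted here to stay self-contained.
```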

Phase 5: Batch Processing & Scheduling

  • Designed processing schedule aligned with trading hours: 13:30 and 16:30 batches, plus a 23:00 final sweep
  • Built status tracking through multiple states (Pending Queue, File Sent, Acknowledged, Rejected)
  • Created comprehensive logging and monitoring for compliance auditing
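The status tracking above can be sketched as a small transition table. The states come from the text; the allowed transitions (including requeuing rejected reports) are illustrative assumptions:

```csharp
using System;
using System.Collections.Generic;

// Sketch: each report moves through Pending Queue -> File Sent ->
// Acknowledged (or Rejected). Transition rules here are illustrative.
var allowedTransitions = new Dictionary<string, string[]>
{
    ["PendingQueue"] = new[] { "FileSent" },
    ["FileSent"]     = new[] { "Acknowledged", "Rejected" },
    ["Acknowledged"] = Array.Empty<string>(),
    ["Rejected"]     = new[] { "PendingQueue" }   // assumed: corrected and requeued
};

static bool CanTransition(IReadOnlyDictionary<string, string[]> table, string from, string to) =>
    table.TryGetValue(from, out var next) && Array.IndexOf(next, to) >= 0;
```

Rejecting illegal transitions at this level keeps the audit trail consistent: a report can never appear "Acknowledged" without a recorded "File Sent" step.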

Phase 6: Testing & Quality Assurance

  • Developed extensive unit test suite using xUnit framework (881 automated tests)
  • Created integration tests validating end-to-end workflows
  • Built test data factories for reproducible scenarios
  • Implemented parallel test execution for rapid feedback
  • Established CI/CD pipeline with automated testing in Azure DevOps
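A test-data factory in the spirit of the suite described above might look like this: deterministic defaults with overridable fields keep scenarios reproducible. Field names are illustrative, not the project's actual model:

```csharp
using System;

// Sketch of a test-data factory: every field has a fixed default, and a test
// overrides only what it cares about, so scenarios stay reproducible.
static (long TradeId, string ActionType, DateTime TradeDate, decimal Amount) MakeTrade(
    long tradeId = 1, string actionType = "NEWT", DateTime? tradeDate = null, decimal amount = 1_000m) =>
    (tradeId, actionType, tradeDate ?? new DateTime(2024, 1, 2), amount);
```

An xUnit `[Fact]` would then call `MakeTrade(actionType: "ETRM")` and assert only on the termination behaviour, without restating the rest of the trade.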

The Results

  • 100% on-time compliance
  • ~40 hours of manual work eliminated weekly
  • <5 minutes average batch processing time
  • 10,000 daily transactions processed
  • 99.9% system uptime
  • Zero compliance violations

Successful Regulatory Compliance

  • 100% on-time submission rate meeting T+1 regulatory deadlines
  • Zero compliance violations since go-live
  • Complete audit trail for regulatory scrutiny
  • Automated generation of all required report types

Operational Efficiency

  • Reduced manual processing from ~40 hours/week to zero
  • Eliminated human error in report generation and data mapping
  • Automated file transmission replaced manual SFTP uploads
  • Real-time status tracking provides visibility into processing pipeline

Business Value

  • Avoided regulatory penalties (potential millions in fines)
  • Eliminated need for additional manual processing headcount
  • Reduced operational risk through automation
  • Positioned institution for future regulatory changes with flexible architecture

Technologies Used

C#/.NET 6.0, SQL Server 2019, Dapper ORM, xUnit, Azure Cloud, BZip2 compression, SFTP, JSON, Hangfire, Azure DevOps

This case study demonstrates the complexity of implementing regulatory compliance systems in financial services: the work demands not just technical expertise but also deep regulatory domain knowledge, sound integration patterns, performance engineering, and testing rigour.

Reducing SFTR Report Generation Time from 15 Minutes to Under 2 Minutes

Financial Institution | Regulatory Reporting | SQL Optimisation

The Challenge

A mid-sized financial institution was struggling with their Securities Financing Transactions Regulation (SFTR) reporting process. Their daily regulatory report, which aggregated complex transaction data from multiple systems, was taking over 15 minutes to generate. This created several critical problems:

  • The reporting window was tight, leaving no buffer for retries if issues arose
  • Database locks during report generation impacted other business operations
  • The operations team had to monitor the process manually, consuming valuable time
  • Any data corrections required a full re-run, often pushing into critical business hours

The complexity of SFTR reporting—requiring aggregation across transaction types, counterparties, and collateral valuations—made the process inherently resource-intensive. The existing stored procedures had evolved organically over time, accumulating technical debt as new requirements were added.

Our Approach

We conducted a comprehensive analysis of the reporting pipeline, focusing on three key areas: query execution plans, database schema design, and data access patterns.

The investigation revealed several critical bottlenecks. First, the main aggregation queries were performing multiple table scans across transaction tables containing millions of rows. Second, temporary staging tables were being rebuilt from scratch for each report run, despite much of the underlying data remaining unchanged. Third, complex business logic was embedded directly in SQL, creating deeply nested subqueries that the query optimiser struggled to handle efficiently.

Our optimisation strategy included:

  • Redesigning the core aggregation logic to leverage indexed views for frequently accessed data combinations
  • Implementing an incremental processing approach, only recalculating data that had changed since the previous run
  • Refactoring complex nested queries into a series of well-indexed temporary tables with strategic use of filtered indexes
  • Moving certain business logic calculations into materialised columns with persisted computed values
  • Optimising the transaction data schema by adding carefully designed composite indexes aligned with common query patterns
  • Implementing parallel processing for independent data segments where appropriate
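The indexing and incremental-processing ideas above can be sketched in T-SQL; all object and column names are illustrative, not the client's actual schema:

```sql
-- Filtered index covering only open transactions, the subset the daily
-- report actually aggregates (names are illustrative):
CREATE NONCLUSTERED INDEX IX_Transactions_Open
    ON dbo.Transactions (CounterpartyId, TradeDate)
    INCLUDE (Notional, CollateralValue)
    WHERE Status = 'OPEN';

-- Incremental staging: recalculate only rows modified since the previous
-- run instead of rebuilding the staging table from scratch.
INSERT INTO dbo.ReportStaging (TransactionId, Notional, CollateralValue)
SELECT t.TransactionId, t.Notional, t.CollateralValue
FROM dbo.Transactions AS t
WHERE t.ModifiedUtc > @LastRunUtc;
```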

Throughout the project, we worked closely with the institution's operations and compliance teams to ensure the optimised process maintained complete data accuracy and met all regulatory requirements. Extensive reconciliation testing verified that the new process produced identical results to the original implementation.

The Results

  • 87% reduction in processing time
  • <2 minutes report generation time
  • 70% lower CPU utilisation
  • Zero production impact issues

The optimised SFTR reporting process now completes in under 2 minutes, an 87% reduction from the original 15+ minute runtime. This dramatic improvement delivered immediate operational benefits. The reporting team gained flexibility in scheduling, with ample buffer time for reruns if needed. Database resource consumption during reporting dropped by 70%, eliminating the impact on concurrent operations. The faster turnaround enabled the compliance team to review and submit reports earlier in the day, reducing regulatory submission risk.

Beyond the immediate time savings, the refactored architecture positioned the institution for future growth. The modular design made it straightforward to add new SFTR requirements as regulations evolved, and the improved performance meant the system could handle significantly higher transaction volumes without degradation.

The operations team reported a significant reduction in monitoring overhead, as the reliable sub-two-minute processing time eliminated the need for constant supervision. The compliance team gained confidence in the consistency and accuracy of the automated process.

Technologies Used

SQL Server 2019, T-SQL, indexed views, filtered indexes, query performance tuning, C# integration

This case study demonstrates typical results from our SQL Server performance optimisation engagements. While specific technologies and approaches vary by project, the methodology remains consistent: thorough analysis, strategic optimisation, and validation against business requirements.

Have a Similar Challenge?

Let's discuss how we can optimise your systems and automate your workflows.

Book Your Free Discovery Call