The SEC delayed Canary Capital’s application for a Litecoin ETF today, opening public comments over the proposal’s compliance with regulatory requirements. The price of LTC fell 5% after the announcement.
The public comment aspect doesn’t appear to signal the Commission’s intentions; this could be a standard delaying tactic. Nonetheless, the market immediately took it as a bearish signal.
Rather than ruling on the application, the SEC decided to delay it, including a request for public comments in its notice:
“The Commission seeks and encourages interested persons to provide comments on the proposed rule change. The Commission asks that commenters address the sufficiency of [whether] the proposal… is designed to prevent fraudulent and manipulative acts and practices or raises any new or novel concerns not previously contemplated by the Commission,” it read.
Litecoin’s price fell quickly after the Commission delayed this application, dropping 5% at its lowest point. Polymarket’s odds of a Litecoin ETF approval in Q2 2025 also plummeted, but the chances of a 2025 approval in general remained steady.
Odds of a Litecoin ETF in Q2 2025. Source: Polymarket
In other words, things could be a lot worse. James Seyffart, an ETF analyst who predicted the Litecoin delay, didn’t comment on the public comment aspect. It seems like a stretch to claim that the SEC is signaling its intent to refuse this or any other altcoin ETF proposal.
Still, the market can react harshly to such developments in the short term, and traders are repositioning their bets on the altcoin.
DeFi Development Corp. (formerly Janover Inc.) is trying to raise $1 billion by selling securities to buy Solana (SOL) over time.
Earlier today, a report from Coinbase claimed that the firm had already raised $42 million for SOL purchases with similar sales. Apparently, these operations were only the beginning of a much larger ambition.
DeFi Development Bets Hard on Solana
In a trend that the crypto community is calling “Solana MSTR,” corporate actors have been buying SOL tokens.
“[DeFi Development] has adopted a treasury policy under which the principal holding in its treasury reserve on the balance sheet will be allocated to digital assets, starting with Solana. The Board of Directors approved the Company’s new treasury policy on April 4, 2025, authorizing long-term accumulation of Solana,” the filing claims.
In addition to selling up to $1 billion in securities, DeFi Development plans to register up to 1,244,471 shares of common stock for potential resale by existing stockholders, using the resulting liquidity to buy Solana.
Specific details about each offering will appear in a supplement provided at the time of sale.
Coinbase noticed DeFi Development’s Solana ambitions and described them in a report released earlier today. The report described the company’s efforts to raise $42 million in convertible notes, using those funds to build an SOL reserve.
The company recently changed its name from Janover, and it now trades on the Nasdaq under the symbol DFDV. DeFi Development also aims to operate one or more Solana validators, enabling it to stake its treasury assets, participate in securing the network, and earn rewards that can be reinvested.
Corporate Solana investment is tiny compared to Bitcoin, but DeFi Development may just be its first whale. MicroStrategy’s plan to become a massive BTC holder didn’t just change its own character; it also transformed Bitcoin.
Jay Clayton, Trump’s next pick for US Attorney for the SDNY, originally filed the SEC’s lawsuit against Ripple. Clayton promised to end crypto crackdowns at the SDNY, yet he personally initiated one of the most notorious cases.
Trump is also planning to use a procedural loophole to avoid a messy confirmation process, which Senator Chuck Schumer swore to block. This incident raises questions about the quality of crypto’s new political allies.
Trump originally tapped Jay Clayton for this role in November, and Clayton actually became Acting US Attorney today. There’s just one concern: Jay Clayton initially filed the SEC’s action against Ripple.
The SEC vs Ripple case is considered a landmark action of the Gensler era, but Clayton actually initiated the suit. Clayton served as the SEC’s Chair from 2017 to 2020, resigning more than six months before the end of his term.
He filed the SEC suit on December 22 and resigned the very next day, in what Ripple called a “parting shot.”
A few years later, Clayton’s on the other side of government crypto crackdowns. When Trump first tapped him for the role last November, a spokesman claimed that the office would cease crypto enforcement actions.
In 2023, Clayton made televised interview appearances criticizing Gensler’s crackdowns, which infuriated Ripple CEO Brad Garlinghouse.
Watching this clip makes my blood boil.
The hypocrisy is shocking. @CNBC @SquawkCNBC should be calling him out for the bullshit.
(As a reminder, jay clayton brought the case against ripple, me and Chris Larsen. And left the building the next day).
Today, no representatives from Ripple commented on Clayton’s new role, but it is likely to ruffle feathers all the same. Meanwhile, the process to get a nominee confirmed by the Senate can be grueling.
According to local media, Trump named Clayton the Acting SDNY US Attorney, intending him to occupy the permanent role. Trump first nominated him last week, and Senate Minority Leader Chuck Schumer vowed to block his confirmation.
Schumer claimed Clayton “has no fidelity to the law.”
Regardless, Clayton doesn’t need a confirmation vote to become Acting US Attorney, and he probably won’t need one. If the Senate doesn’t confirm him within 120 days, judges in the SDNY can appoint him until a nominee is confirmed.
Trump doesn’t actually need to nominate anyone else, and Clayton could end up serving a full term.
This is a very illustrative example of how much political power crypto has gained. Jay Clayton, the man who literally initiated the Ripple suit, will work against future enforcement. And yet, this doesn’t seem like an unambiguous good.
How much can the industry truly rely on its former enemies? How many of crypto’s friends today would gladly join a crackdown tomorrow? These are just some of the concerns among the crypto community.
Metrics used to evaluate blockchain performance can be misleading. As more blockchain networks emerge, the public needs clear, efficiency-focused metrics, rather than exaggerated claims, to differentiate between them.
In a conversation with BeInCrypto, Taraxa Co-Founder Steven Pu explained that it’s becoming increasingly difficult to compare blockchain performance accurately because many reported metrics rely on overly optimistic assumptions rather than evidence-based results. To combat this wave of misrepresentation, Pu proposes a new metric, which he calls TPS/$.
Why Does the Industry Lack Reliable Benchmarks?
The need for clear differentiation is growing with the increasing number of Layer-1 blockchain networks. As various developers promote the speed and efficiency of their blockchains, relying on metrics that distinguish their performance becomes indispensable.
However, the industry still lacks reliable benchmarks for real-world efficiency, relying instead on sporadic waves of hype-driven sentiment. According to Pu, misleading performance figures currently saturate the market, obscuring true capabilities.
“It’s easy for opportunists to take advantage by driving up over-simplified and exaggerated narratives to profit themselves. Every single conceivable technical concept and metric has at one time or another been used to hype up many projects that don’t really deserve them: TPS, finality latency, modularity, network node count, execution speed, parallelization, bandwidth utilization, EVM-compatibility, EVM-incompatibility, etc.,” Pu told BeInCrypto.
Pu focused on how some projects exploit TPS metrics, using them as marketing tactics to make blockchain performance sound more appealing than it might be under real-world conditions.
Examining the Misleading Nature of TPS
Transactions per second, more commonly known as TPS, is a metric that refers to the average or sustained number of transactions that a blockchain network can process and finalize per second under normal operating conditions.
However, it often misleadingly hypes projects, offering a skewed view of overall performance.
“Decentralized networks are complex systems that need to be considered as a whole, and in the context of their use cases. But the market has this horrible habit of over-simplifying and over-selling one specific metric or aspect of a project, while ignoring the whole. Perhaps a highly centralized, high-TPS network does have its uses in the right scenarios with specific trust models, but the market really has no appetite for such nuanced descriptions,” Pu explained.
Pu indicates that blockchain projects making extreme claims about single metrics like TPS may have compromised decentralization, security, and accuracy to achieve them.
“Take TPS, for example. This one metric masks numerous other aspects of the network, for example, how was the TPS achieved? What was sacrificed in the process? If I have 1 node, running a WASM JIT VM, call that a network, that gets you a few hundred thousand TPS right off the bat. I then make 1000 copies of that machine and call it sharding, now you start to get into the hundreds of millions of ‘TPS’. Add in unrealistic assumptions such as non-conflict, and you assume you can parallelize all transactions, then you can get “TPS” into the billions. It’s not that TPS is a bad metric, you just can’t look at any metric in isolation because there’s so much hidden information behind the numbers,” he added.
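The inflation arithmetic Pu describes can be sketched in a few lines. Every figure below is illustrative, not drawn from any real network: a fast single node, multiplied by unconnected copies labeled “sharding,” multiplied again by an optimistic no-conflict parallelization assumption.

```python
# Hypothetical sketch of how headline "TPS" claims get inflated,
# following the steps Pu outlines. All numbers are illustrative.

single_node_tps = 200_000    # one machine running a fast JIT VM
shard_copies = 1_000         # 1,000 unconnected copies, marketed as "sharding"
no_conflict_factor = 10      # optimistic assumption: all transactions parallelize

claimed_tps = single_node_tps * shard_copies * no_conflict_factor
print(f"claimed: {claimed_tps:,} 'TPS'")  # claimed: 2,000,000,000 'TPS'
```

Each multiplication is individually defensible in a lab setting, which is exactly why the final number tells a reader so little about real mainnet performance.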
The Taraxa Co-founder revealed the extent of these inflated metrics in a recent report.
The Significant Discrepancy Between Theoretical and Real-World TPS
Pu sought to prove his point by measuring the gap between a blockchain’s maximum theoretical TPS and the maximum TPS ever realized on its mainnet.
Of the 22 permissionless and single-shard networks observed, Pu found that, on average, there was a 20-fold gap between theory and reality. In other words, the theoretical metric was 20 times higher than the maximum observed mainnet TPS.
Taraxa Co-founder finds 20x difference between the Theoretical TPS and the Max Observed Mainnet TPS. Source: Taraxa.
“Metric overestimations (such as in the case of TPS) are a response to the highly speculative and narrative-driven crypto market. Everyone wants to position their project and technologies in the best possible light, so they come up with theoretical estimates, or conduct tests with wildly unrealistic assumptions, to arrive at inflated metrics. It’s dishonest advertising. Nothing more, nothing less,” Pu told BeInCrypto.
Looking to counter these exaggerated metrics, Pu developed his own performance measure.
Introducing TPS/$: A More Balanced Metric?
To fulfill the need for better performance metrics, Pu and his team developed the following measure: TPS realized on mainnet divided by the monthly dollar cost of a single validator node, or TPS/$ for short.
This metric assesses performance based on verifiable TPS achieved on a network’s live mainnet while also considering hardware efficiency.
The significant 20-fold gap between theoretical and actual throughput convinced Pu to exclude metrics based solely on assumptions or lab conditions. He also aimed to illustrate how some blockchain projects inflate performance metrics by relying on costly infrastructure.
“Published network performance claims are often inflated by extremely expensive hardware. This is especially true for networks with highly centralized consensus mechanisms, where the throughput bottleneck shifts away from networking latency and into single-machine hardware performance. Requiring extremely expensive hardware for validators not only betrays a centralized consensus algorithm and inefficient engineering, it also prevents the vast majority of the world from potentially participating in consensus by pricing them out,” Pu explained.
Pu’s team determined the cost per validator node by locating each network’s minimum validator hardware requirements. They then estimated the monthly cost of that hardware, paying particular attention to relative sizing when computing the TPS-per-dollar ratios.
“So the TPS/$ metric tries to correct two of the perhaps most egregious categories of misinformation, by forcing the TPS performance to be on mainnet, and revealing the inherent tradeoffs of extremely expensive hardware,” Pu added.
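As a rough sketch, the metric reduces to a single division. The function and figures below are hypothetical examples, not values from Taraxa’s report:

```python
# Minimal sketch of the TPS/$ metric as described: verifiable mainnet
# throughput divided by the monthly cost of one validator node meeting
# the network's minimum hardware requirements. Figures are illustrative.

def tps_per_dollar(max_observed_mainnet_tps: float,
                   monthly_validator_cost_usd: float) -> float:
    """Higher is better: more real throughput per dollar of validator hardware."""
    return max_observed_mainnet_tps / monthly_validator_cost_usd

# A modest network with cheap validators can outscore a faster
# network that requires expensive, centralizing hardware.
network_a = tps_per_dollar(500, 50)       # 500 TPS on a $50/month node  -> 10.0
network_b = tps_per_dollar(5_000, 2_500)  # 5,000 TPS on a $2,500/month node -> 2.0
```

Note how the denominator penalizes exactly the pattern Pu criticizes: throughput bought with hardware so expensive that most of the world is priced out of running a validator.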
Pu also stressed two simple, identifiable characteristics: whether a network is permissionless, and whether it is single-sharded.
Permissioned vs. Permissionless Networks: Which Fosters Decentralization?
Whether a blockchain operates as a permissioned or permissionless network reveals much about its degree of decentralization and security.
Permissioned blockchains refer to closed networks where access and participation are restricted to a predefined group of users, requiring permission from a central authority or trusted group to join. In permissionless blockchains, anyone is allowed to participate.
According to Pu, the former model is at odds with the philosophy of decentralization.
“A permissioned network, where network validation membership is controlled by a single entity, or if there is just a single entity (every Layer-2s), is another excellent metric. This tells you whether or not the network is indeed decentralized. A hallmark of decentralization is its ability to bridge trust gaps. Take decentralization away, then the network is nothing more than a cloud service,” Pu told BeInCrypto.
Attention to these metrics will prove vital over time, as networks with centralized authorities tend to be more vulnerable to certain weaknesses.
“In the long term, what we really need is a battery of standardized attack vectors for L1 infrastructure that can help to reveal weaknesses and tradeoffs for any given architectural design. Much of the problems in today’s mainstream L1 are that they make unreasonable sacrifices in security and decentralization. These characteristics are invisible and extremely hard to observe, until a disaster strikes. My hope is that as the industry matures, such a battery of tests will begin to organically emerge into an industry-wide standard,” Pu added.
Meanwhile, understanding whether a network employs state-sharding or maintains a single, shared state reveals how unified its data management is.
State-Sharding vs. Single-State: Understanding Data Unity
In blockchain performance, latency refers to the time delay between submitting a transaction to the network and its confirmation and inclusion in a block. It measures how long it takes for a transaction to be processed and become a permanent part of the distributed ledger.
Identifying whether a network employs state-sharding or a single-sharded state can reveal much about its latency efficiency.
State-sharded networks divide the blockchain’s data into multiple independent parts called shards. Each shard operates somewhat independently and doesn’t have direct, real-time access to the complete state of the entire network.
By contrast, a non-state-sharded network has a single, shared state across the entire network. All nodes can access and process the same complete data set in this case.
Pu noted that state-sharded networks aim to increase storage and transaction capacity. However, they often face longer finality latencies due to a need to process transactions across multiple independent shards.
He added that many projects adopting a sharding approach inflate throughput by simply replicating their network rather than building a truly integrated and scalable architecture.
“A state-sharded network that doesn’t share state, is simply making unconnected copies of a network. If I take a L1 network and just make 1000 copies of it running independently, it’s clearly dishonest to claim that I can add up all the throughput across the copies together and represent it as a single network. There are architectures that actually synchronize the states as well as shuffle the validators across shards, but more often than not, projects making outlandish claims on throughput are just making independent copies,” Pu said.
Based on his research into the efficiency of blockchain metrics, Pu highlighted the need for fundamental shifts in how projects are evaluated, funded, and ultimately succeed.
What Fundamental Shifts Does Blockchain Evaluation Need?
Pu’s insights present a notable alternative in a Layer-1 blockchain space where misleading performance metrics increasingly compete for attention. Reliable and effective benchmarks are essential to counter these false representations.
“You only know what you can measure, and right now in crypto, the numbers look more like hype-narratives than objective measurements. Having standardized, transparent measurements allows simple comparisons across product options so developers and users understand what it is they’re using, and what tradeoffs they’re making. This is a hallmark of any mature industry, and we still have a long way to go in crypto,” Pu concluded.
Adopting standardized and transparent benchmarks will foster informed decision-making and drive genuine progress beyond merely promotional claims as the industry matures.