Christian Anderson, projected 16th pick in the upcoming NBA Draft, slipped on a glass floor in Kansas City on March 12 and walked away with a groin injury. That one moment ended the Big 12's experiment faster than any press release could. Commissioner Brett Yormark pulled the ASB GlassFloor system overnight, hardwood was back for the semifinals, and the sports internet spent 48 hours dunking on a guy who used to work at Roc Nation for trying to put LED panels under college basketball.
I get the reaction. I also think everyone is drawing the wrong conclusion from a sample of roughly 4 games.
What the Numbers Actually Tell Us
The conference paid $185,000 for the system, which covers installation, labor, and on-site tech support. That is not a casual purchase. The NBA tested the same ASB platform at the 2024 All-Star Weekend in Indianapolis. Euroleague teams have used it. The surface uses ceramic dots for grip and LED panels underneath for real-time graphics. None of that is vaporware.
What the Big 12 lacked was anything resembling a proper reliability model. In stats terms, you want your confidence interval tight before you deploy something in a high-leverage situation. The glass court got the equivalent of a 4-game sample at maximum stakes, then got pulled when variance bit them. That is not evidence the technology fails; it is evidence the conference had no systematic grip data, no friction coefficient baselines across different shoe types, and no load testing under tournament-intensity stop-and-start movement.
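To make the sample-size point concrete, here is a rough sketch (all numbers hypothetical, since the conference published no slip data) of how wide a 95% confidence interval is when you only have four games of observations. Treating each game as one trial and supposing one game produced a notable slip, the Wilson score interval spans most of the possible range:

```python
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion (95% by default)."""
    if trials == 0:
        return (0.0, 1.0)
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return (max(0.0, center - margin), min(1.0, center + margin))

# Hypothetical: 4 games observed, 1 with a notable slip.
lo, hi = wilson_interval(successes=1, trials=4)
print(f"95% CI for per-game slip rate: {lo:.2f} to {hi:.2f}")
```

The interval runs from roughly 5% to roughly 70%, which is another way of saying four games tell you almost nothing about the true slip rate.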
Grant McCasland, Texas Tech's coach, said it clearly: the guard quickness and stop-start action "just has a different response." That is a measurable variable. Somebody should have measured it before March.
One Injury Is Not a Trend Line
The fair point for the other side: Anderson got hurt, and draft stock is not something you gamble with on an untested surface. That is a legitimate concern and Yormark was right to pull it once player safety became the operative question.
But statistically, we cannot diagnose a technology from a single injury event. Players slip on hardwood every week, including in the same tournament rounds. Even one opponent of the glass court acknowledged that similar slips happen on regular floors. The injury is a signal worth logging, not a verdict worth enforcing.

A CBS Sports analyst called it correctly when he said the technology "eventually will be the future," even while noting that players who would typically embrace new tech were pushing back. That tension is the actual story. The glass court was not rejected because it cannot work. It was rejected because nobody ran the experiment properly before the experiment counted.
Here is my honest tension with my own argument: I am asking for more data before drawing conclusions, while simultaneously concluding, from limited data, that the Big 12 drew conclusions from limited data. I notice the irony. The difference is that I am arguing for a process; they skipped it entirely.
The right move is not to shelve the glass court and write a think piece about tradition. The right move is for the Big 12, or any conference willing to absorb some embarrassment, to run a proper off-season trial with multiple shoe brands, surface humidity conditions, and movement pattern testing. Get 50 players on it across different position types. Measure slip frequency. Compare it to hardwood's baseline. Publish the findings.
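The comparison that trial should produce can be sketched in a few lines. Here is a hypothetical version (the slip counts and movement totals are invented for illustration) of a two-proportion z-test comparing slip frequency on glass against the hardwood baseline:

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-proportion z-test: does the glass court's slip rate differ from hardwood's?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical trial data: slips per 1,000 logged stop-start movements on each surface.
z, p = two_proportion_z(x1=18, n1=1000, x2=12, n2=1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the difference is not statistically significant, which is itself the point: even a thousand logged movements per surface may not resolve a modest gap in slip rates, so a serious trial needs to be designed around sample size, not vibes.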
The technology works at low stakes. The question is whether it can be made to work at high ones. That answer lives in a spreadsheet, not in Yormark's overnight floor swap.