Athlete Management Systems Don’t Manage Anything
When you imagine an athlete management system, you probably visualize yourself looking at a series of dashboards. These dashboards aggregate data from a variety of sources to tell you everything you might want to know about your athletes—performance, health, body composition, etc.
But do you ever imagine the next step beyond looking at the dashboard?
What decisions are informed by the data you collect? How do your training protocols change based on the data? Can you actually connect the dots between information, action, and outcomes? Do these systems directly affect what happens in the weight room or on the field?
Usually not. Which is why the term “athlete management system” is something of a misnomer. These systems don’t actually manage anything. You, the strength & conditioning professional, are the one managing the athletes. You must take all the information and implement the necessary changes.
Nonetheless, we are dazzled by fancy graphs and neatly organized charts. We’re fooled into thinking that the mere possession of data will help us better train our athletes. We hope that our “experience” will help us know what to do, or that the nebulous concept of “actionable information” will reveal itself. Our sophisticated data collection lulls us into thinking that we are being scientific, but really we are not.
Before investing in any type of athlete management system or data collection effort, you must first consider three important facts:
- Data alone does not lead to better decisions.
- Data alone does not lead to better or more efficient execution.
- Data alone does not lead to better outcomes.
You can have a completely comprehensive dataset on your athletes, but the connection between information, execution, and outcomes doesn’t just magically happen. It takes logical decision making protocols and effective interventions to bridge the gap.
BEFORE YOU MEASURE…
Many coaches start by focusing on measurement. They determine what variables they want to measure, then set out to find solutions for collecting those measurements. Sometimes coaches come across a spiffy new tool, then go searching for problems they might solve with it. In either case, they’re putting the cart before the horse.
But before you even measure or select your tools, you need to know what you’re building. This might seem obvious in the context of building a treehouse or a deck, but it is often overlooked when building a system for training athletes.
A common mistake by professionals (in every field) is collecting data before they have laid out exactly what they hope to accomplish. Someone might make the claim that “more data” from an athlete management system helps you “make better informed decisions on health, safety and performance.” Ok, that’s a nice statement. But how? What are the specific pathways that lead from data capture, to better decisions, to improved outcomes?
In his book How to Measure Anything, Douglas W. Hubbard lays out some important questions that you should ask before trying to measure anything.
What is the decision you’re trying to make?
In order for data collection to be valuable, it must influence how you operate. If you can’t point to a specific decision that might be affected by a given data point and how it might change your course of action, then the data is worthless.
As a strength & conditioning professional, you’re usually making choices about exercise selection, training frequency, training intensity, training volume, and rate of progression. But we need to clearly define what these decisions are all about within the context of our training goals. For example, how is training intensity decided? Is it calculated using percentages of 1RMs? Is it velocity? Is it a split time based on a PR? Answering these types of questions up front makes it much easier to know exactly which data points you should be collecting.
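To make this concrete, here is a minimal sketch of what a pre-defined intensity rule might look like once it’s written down. The numbers and the 2.5 kg rounding increment are hypothetical, not a recommendation:

```python
def prescribed_load(one_rm_kg: float, intensity_pct: float) -> float:
    """Working load as a percentage of a 1RM, rounded to the nearest 2.5 kg increment."""
    raw = one_rm_kg * intensity_pct / 100
    return round(raw / 2.5) * 2.5

# Hypothetical example: a 140 kg squat 1RM trained at 80% intensity
load = prescribed_load(140, 80)
```

Once the rule is explicit like this, the data requirement is obvious: you need a current 1RM (or an estimate of one) for each athlete, and nothing else.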
What variables are relevant to the decision, and how do they inform your choice?
Once you’ve clearly mapped out the important decisions, you need to determine the relevant variables. This is where you decide what actually needs to be measured and reported on. You can then focus your efforts on collecting data that matters and ignore the rest.
Each variable you measure should map to a specific decision, and you should have pre-defined formulas or logic that defines how a choice is made. You might have a Fat Free Mass Index threshold to dictate whether a player focuses on weight gain or power endurance. Or perhaps you measure the 2000 meter row so you can use it to prescribe split times. In both cases we know exactly how to compute a decision outcome using the variable. We would not measure either of these variables just because we want to have the data on hand.
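Both of those mappings can be expressed as explicit decision logic. The sketch below uses made-up thresholds and offsets purely for illustration; the point is that each measured variable feeds a pre-defined rule:

```python
def training_focus(ffmi: float, threshold: float = 23.0) -> str:
    """Hypothetical rule: below an FFMI threshold, prioritize weight gain;
    at or above it, shift to power endurance."""
    return "weight gain" if ffmi < threshold else "power endurance"

def interval_split(two_k_seconds: float, offset_seconds: float = 4.0) -> float:
    """Prescribe a 500 m interval split from a 2000 m row PR:
    average 500 m pace plus a (hypothetical) offset."""
    return two_k_seconds / 4 + offset_seconds
```

If you can’t write a rule like one of these for a variable, that’s a strong hint you’re measuring it just to have the data on hand.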
How much do you know now? (Usually more than you think.)
It’s important to remember that measurement is about reducing uncertainty. The goal is not to obtain absolute certainty (which is impossible). You don’t have to know everything in order to take action. Sometimes you don’t even need direct measurements, because you can often make accurate inferences based on indirect observations.
You might be tempted to think that the higher your uncertainty, the more data you need to collect. But in fact, the opposite is usually true. “If you know almost nothing, almost anything will tell you something.” And in many cases, knowing just a little bit goes a long way.
For instance, suppose we wanted to measure athlete fatigue. We could directly measure fatigue with wearable technology. But perhaps our athletes already know a great deal about their level of fatigue. If we simply asked them how they feel, we could potentially get all the data we need, even though it’s not totally complete or directly observed.
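The “just ask them” approach can still be systematic. A common pattern is a short wellness questionnaire collapsed into a single score; the question names and 1–5 scale below are illustrative assumptions, not a validated instrument:

```python
def wellness_score(responses: dict) -> float:
    """Average 1-5 self-report answers into a single readiness number."""
    return sum(responses.values()) / len(responses)

# Hypothetical morning check-in for one athlete
score = wellness_score({"sleep": 4, "soreness": 2, "mood": 3, "energy": 3})
```

Crude as it is, a score like this often reduces uncertainty enough to act on, at a fraction of the cost of wearable technology.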
What is the value (and cost) of more information?
After evaluating what is currently known, the value of additional information must be determined and compared with the cost of obtaining it. Answering this question helps us decide whether or not it’s even worth collecting data about a given measurement.
The cost of information is usually pretty clear. It’s the staff hours needed to manually collect and analyze data, the cost of software or hardware needed for measurement, or a combination of both. Calculating the value of information is not always so straightforward, and a full treatment of the topic is beyond our scope here. Instead, we’ll focus on a couple of high-level concepts that can serve as guiding principles.
First, data collection is subject to the law of diminishing returns. In simple terms, this law states that the incremental value of something decreases as you obtain more of it. As you get more and more information, its incremental value approaches zero. Conversely, information becomes more expensive to collect the more detailed you get.
Second, the most valuable information comes at the beginning, when you know very little and it’s cheap to get. You don’t even need a lot of information to get a lot of value. In fact, high uncertainty is exactly when you should not embark on a major data collection initiative: you’ll most likely waste time and resources getting insights you don’t need, or could have gotten with much less effort.
Ultimately, if the cost of acquiring the data exceeds the value of the information gleaned, then you should not bother with the measurement.
FROM MEASURING TO EXECUTING
Ok, you’ve done your homework about the decisions you need to make and the data you need to support them. Now you need to create processes to efficiently execute those decisions. This is the critical link between data collection and your desired outcomes. You have to build data-driven decision making into the system.
Is this where athlete management systems with their charts and dashboards come into play? Maybe. Ideally, you’d have a group of charts or data points that you can access without much hassle. You would be able to look at each one of these data points when faced with a decision and make an informed choice.
This simple process works fine if you’re making decisions on a monthly or even weekly basis. For example, each month you might evaluate indicators of strength and endurance, then use that info to determine the upcoming month’s training focus. However, if you need to make data-driven decisions on a day-to-day basis, looking at several charts each day becomes rather inefficient. You can’t evaluate readiness scores for a team of 20+ athletes, then make adjustments to their individual routines day in and day out. That’s just not scalable.
Automated decision making
The most obvious solution is to automate these types of decisions. Many coaches turn to spreadsheets to solve this problem. If you’re savvy enough, you can build out a complex spreadsheet that takes inputs from a number of data points and produces the necessary outputs. Sometimes you can link your spreadsheet to various data sources so that it all happens seamlessly. It takes a bit of work to get the various technologies talking to each other, but it can be done. The primary constraint here is usually your time and willingness to tinker with technology.
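Whether it lives in a spreadsheet formula or a script, the automated rule itself can be very small. Here is one way such a rule might look, applied across a whole squad in a single pass instead of chart by chart. The athlete names, readiness scores, and thresholds are all hypothetical:

```python
def adjust_volume(planned_sets: int, readiness: float) -> int:
    """Hypothetical rule: trim training volume as the readiness score drops."""
    if readiness >= 3.5:
        return planned_sets              # train as planned
    if readiness >= 2.5:
        return max(1, planned_sets - 1)  # shave off one set
    return max(1, planned_sets // 2)     # roughly halve the volume

# Made-up readiness scores for a small squad
squad = {"athlete_a": 4.2, "athlete_b": 2.8, "athlete_c": 1.9}
plan = {name: adjust_volume(5, score) for name, score in squad.items()}
```

The same logic scales from three athletes to thirty with no extra daily effort, which is exactly what looking at individual dashboards cannot do.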
Unfortunately, strength & conditioning professionals are often overburdened as it is. They are responsible for the health and performance of many athletes at once, so it’s a battle to find time for improving processes and becoming more efficient. It usually happens incrementally over a long time. Coaches build a new tool here, or improve an existing tool there. Over time, with contributions from the colleagues they work with along the way, they piece together something that resembles a cohesive system.
Technology is constantly improving, and it’s getting easier and easier for coaches to plug and play with different products. But if your organization doesn’t have the budget, you have to resort to more low-tech solutions.
A truly focused and well-thought-out data strategy will lead to better outcomes than an aimless, all-out data collection effort. Hands down.
To borrow again from Hubbard’s book, a good data collection and measurement strategy consists of the following steps:
- Define the decision and relevant variables.
- Determine what you know now.
- Determine the value of additional information.
- Measure the high-value variables.
- Make a decision and act.
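The steps above can be sketched as a single pipeline. The function names and the toy value/cost estimates are assumptions for illustration; in practice each piece would be your own pre-defined logic:

```python
def run_measurement_strategy(variables, estimate_value, estimate_cost, measure, decide):
    """Hubbard-style loop: keep high-value variables, measure them, then decide and act."""
    worth_it = [v for v in variables
                if estimate_value(v) > estimate_cost(v)]  # value of more info vs. its cost
    data = {v: measure(v) for v in worth_it}              # measure only what matters
    return decide(data)                                   # make a decision and act
```

Anything whose expected value doesn’t clear its cost is simply never measured, which keeps the system focused by construction.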
The more you can automate these steps for repetitive decisions, the more efficiently your organization will run.
Whatever your situation, just do the best you can. Don’t waste precious resources on data collection that has no bearing on how you operate. Concentrate your efforts on measurements that will truly make an impact on the quality of your training. If you can’t point to a specific intervention or training protocol that is altered by a given data point, ignore it. Humans only have so much capacity to manage the complex interplay between variables. So don’t bite off more than you can chew.