Data & Innovation, Performance | Jul 7, 2020
A series of suggestions brought to you by the San Antonio Spurs and Wests Tigers.

Are you making the most of your data? The continuous process of working out where data can have its greatest impact on performance inevitably leads to stress-testing of your data models.


By John Portch

Xavi Schelling, Director of Sports Science & Performance at the San Antonio Spurs, and Andrew Gray, High Performance Manager at Wests Tigers, have both worked across a range of sports and organisations, some with more developed data and analytics programmes than others, but there are certain tenets to which they hold true.

The Leaders Performance Institute presents their five tips for ensuring you are still finding and using the correct tests and metrics as your sport continues to evolve. 

1. Be sure to iterate and reiterate 

At San Antonio, Schelling describes the cycle that sees performance processes continually challenged within the basketball department. “My main goal as a director of performance and sports science is to ask the right questions of the right people. This also applies to the coaches: the better the questions they ask us, the easier it is for us to build a better tool for them.”

“You deliver something [a decision support system] that will hopefully increase the knowledge of the organisation, leading to new questions or the refinement of old ones. That feedback is necessary to improve the next iteration of the system or tool. This dynamic process has to be constant, not just for new reports or new tools but also for optimising old ones; and this is critical.

“Don’t just keep implementing new devices, new tools and new technology; also invest time in reviewing: ‘OK, I created this tool and implemented this device, we generated this knowledge, we are better, and now there are more and better questions; go back and optimise the same report or the same tool to use it more efficiently.’ Once this loop is exhausted and you have maximised the service of this tool or device, you move on and think about implementing something else. But it’s critical to understand where people are in terms of knowledge and needs and build from there.”

2. Identify inconsistencies, keep the good, and discard the rubbish 

“The problem when moving between clubs,” says Gray, “is that there isn’t necessarily consistency in how a test is performed or in which exercises are tested in the gym.” It can, in certain circumstances, make efforts to blend old and new datasets problematic. “Those can be troublesome, and the easiest thing to do can be to throw the old data out and start again, but then you’re missing the opportunity to learn from it.” It is a balance. “If there were four strength movements being tested in an old dataset and we have another four being tested now, and only two of them are consistent, then we’ll probably only carry through those two. There needs to be a fair level of consistency between tests if we’re going to try to extrapolate anything from them.”
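
Gray’s filter, carrying forward only the movements tested consistently in both eras, is simple to express in code. Here is a minimal sketch in Python, assuming hypothetical pandas DataFrames of one-rep-max results; the movement names and numbers are illustrative, not Wests Tigers’ actual testing battery.

```python
import pandas as pd

# Hypothetical historical battery: rows are athletes, columns are tests (kg).
old_tests = pd.DataFrame(
    {"back_squat_1rm": [165.0, 172.5],
     "bench_press_1rm": [120.0, 130.0],
     "trap_bar_deadlift_1rm": [200.0, 210.0],
     "chin_up_1rm": [40.0, 45.0]},
    index=["athlete_a", "athlete_b"],
)

# Hypothetical current battery: only two movements overlap with the old one.
new_tests = pd.DataFrame(
    {"back_squat_1rm": [170.0, 175.0],
     "bench_press_1rm": [122.5, 132.5],
     "hip_thrust_1rm": [230.0, 240.0],
     "weighted_dip_1rm": [55.0, 60.0]},
    index=["athlete_a", "athlete_b"],
)

# Carry through only the tests measured consistently in both datasets,
# discarding the movements that cannot be compared across eras.
shared = old_tests.columns.intersection(new_tests.columns)
history = pd.concat([old_tests[shared], new_tests[shared]],
                    keys=["old_battery", "new_battery"])
print(history)
```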

3. If you can, run older and newer solutions side by side 

Sometimes the challenge is not a change of device but a software or firmware update to an existing device. Nevertheless, as Schelling explains, the same questions remain. “The first question is whether updating that firmware, adding a new device or updating the device is really better than what you had,” he says. “First, you should invest time in making sure that the vendor’s claims are true and that it’s not just a marketing tool. If the device or software really is better, because it’s more precise and reliable or because it brings new actionable information, then do not hesitate; acquire it and implement it as soon as possible.

“Now, if you can afford it, overlap systems, meaning you run the old device and the new device side by side for a season. This way you can make a very accurate assessment of the variability between old and new data, and you can come up with solutions to that variation.”
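
Schelling does not prescribe a specific analysis, but a season of paired data lends itself to a simple agreement check. Below is a minimal sketch, assuming hypothetical paired session distances from the old and new devices; a Bland-Altman-style bias and limits of agreement are one common way to quantify, and then correct for, the variation he describes.

```python
import numpy as np

# Hypothetical paired readings: the same sessions measured by both devices (m).
old_device = np.array([5210.0, 4875.0, 6020.0, 5540.0, 4990.0, 5705.0])
new_device = np.array([5150.0, 4820.0, 5940.0, 5475.0, 4930.0, 5630.0])

diff = new_device - old_device
bias = diff.mean()          # systematic offset between the two devices
sd_diff = diff.std(ddof=1)  # spread of the disagreement
lower, upper = bias - 1.96 * sd_diff, bias + 1.96 * sd_diff

print(f"Mean bias: {bias:.1f} m")
print(f"95% limits of agreement: {lower:.1f} to {upper:.1f} m")

# If the bias proves stable, old and new eras can be put on a common scale:
new_adjusted = new_device - bias
```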

4. Try to balance the benefit of a tool against the drawbacks of its inaccuracies 

Gray is clear on the matter: “I think it’s important to understand the variability in any measurement. If we bring a new device into a system, we spend a lot of time understanding that margin for error, that variability, and if it’s too great then it just doesn’t fit.

“But we shouldn’t get hung up on a device that has a degree of variability, as long as we’re able to understand it, because it may enable us to capture data in an area where we might otherwise be unable to, and it may have some beneficial effects. Sometimes I think we become too hung up on the fact that there is variability within a measurement, exclude that device from our system and throw the baby out with the bathwater.

“We also can’t go to the other extreme, making big decisions on players without looking to deeply understand the measurement error. I think there’s a middle ground, and that middle ground is where we try to operate: get the benefits of using the technology and the data, and create a consistent language, but understand what the issues are and, in a sense, give confidence intervals and be open and discuss everything.”
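
Gray’s suggestion to give confidence intervals can be put into practice once a device’s typical error is known from test-retest data. A minimal sketch with hypothetical numbers, reporting a jump-height measurement as an interval rather than a bare point value:

```python
# Hypothetical: a countermovement-jump reading and the device's typical
# error of measurement (TEM), estimated beforehand from test-retest data.
observed_cm = 41.3
tem_cm = 1.1

# Report the value with a ~90% confidence interval (z = 1.645) so that
# decision-makers see the uncertainty alongside the number itself.
z = 1.645
low, high = observed_cm - z * tem_cm, observed_cm + z * tem_cm
print(f"Jump height: {observed_cm:.1f} cm (90% CI {low:.1f} to {high:.1f} cm)")
```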

5. Be mindful that vocabulary can change along with a device 

The ‘language’ of data was discussed in a previous chapter, and your team’s data vocabulary can shift with the adoption of a new device. “That’s the issue for teams, especially if they have been using a device for a while and built a really good language around it; then the device changes and all of a sudden the language changes,” observes Gray. “That’s a real issue in performance data.” Tread carefully when innovating and stress-testing your data models.


