Anyone know of any tech or studies on the sound of impact as it relates to where the ball made contact on the face of the club? With the huge strides that have been made in voice recognition over the years (e.g. Siri, Alexa, Cortana), it seems reasonable that those recognition engines could identify where on the club face a ball makes contact based on the sound generated at impact. The one roadblock I can think of is that every golf club sounds different. But do they sound similar enough that a universal library of impact sounds could be effective in matching face position to an approximate sound?
Ultrasonic launch monitor technology
-
I don't know of anything along those lines, but the other problem would be that different balls sound vastly different with the same club (a urethane cover vs. a rock-hard range ball). I think there would just be too many variables to contend with. I have briefly looked at triggering a camera to take a picture at the moment of impact, or a series of pictures around impact, to try to calculate club path and face angle. I think a similar approach for impact location would be easier than sound analysis.
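For what it's worth, the trigger half of that idea is pretty simple in software. Here's a rough sketch of detecting the impact transient in an audio stream so you could fire a camera off it; the threshold and window values are made-up placeholders, not tuned numbers, and real hardware would add latency you'd have to account for.

```python
# Minimal sketch of triggering a capture off the impact sound.
# Assumes audio arrives as normalized samples in [-1, 1]; the
# threshold and window size below are placeholders, not tuned values.

def find_impact_index(samples, threshold=0.6, window=4):
    """Return the start index of the first run of `window` consecutive
    samples whose absolute amplitude exceeds `threshold`, or None."""
    run = 0
    for i, s in enumerate(samples):
        if abs(s) > threshold:
            run += 1
            if run >= window:
                return i - window + 1
        else:
            run = 0
    return None

# Quiet noise, then a sharp transient standing in for the strike:
stream = [0.01] * 100 + [0.9, -0.8, 0.85, -0.7] + [0.02] * 50
print(find_impact_index(stream))  # → 100
```

Requiring a short run of loud samples instead of a single spike keeps one-off clicks or pops from triggering the camera.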
-
I've fooled around with a few of those apps that trigger video recordings too. I found Swing Catalyst pretty disappointing considering the monthly price tag; you can't even use an iPhone as one of your cameras.
But I'm with you on the seemingly endless number of variables that could make the idea seem a bit far-fetched. My thinking is that, even with the differences between, say, a balata ball and a two-piece ball, the two sounds would be similar enough based on the strike that it could work. Say you have a high-toe strike with a balata ball and then the same strike with a two-piece ball. Obviously, they are going to sound very different. But within the audio waveform, both will have specific characteristics that are universal among high-toe strikes, and the software would have to look for those characteristics and ID the shot as a high-toe strike. If Alexa can understand you when you're mumbling or slurring with a bit of a buzz on, I've got to imagine it would be possible for similar tech to identify with a good deal of precision where on the face you're striking it. I don't know, maybe not; just something that's been rattling around in my head lately. I feel like launch monitor tech is so drastically overpriced these days that I'm always looking for ways to make that sort of tech more accessible.
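That "look for a universal characteristic in the waveform" step is basically feature extraction plus nearest-neighbor matching. Here's a toy sketch of the idea using a single feature (spectral centroid, roughly the "brightness" of the sound) matched against a labeled reference library. The signals and the labels are entirely synthetic placeholders; real impact audio would need much richer features (MFCCs and the like) and a lot of labeled recordings.

```python
# Toy sketch of the "universal library" idea: reduce each impact sound
# to one spectral feature and match a new strike to the nearest labeled
# reference. Signals and labels here are synthetic placeholders.
import cmath
import math

def spectral_centroid(samples):
    """Magnitude-weighted mean frequency bin of a naive DFT."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        acc = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                  for t in range(n))
        mags.append(abs(acc))
    return sum(k * m for k, m in enumerate(mags)) / sum(mags)

def tone(freq_bin, n=64):
    """Pure sine at an integer DFT bin, standing in for a recording."""
    return [math.sin(2 * math.pi * freq_bin * t / n) for t in range(n)]

# Hypothetical reference library: strike location -> feature value
library = {"center": spectral_centroid(tone(4)),
           "high toe": spectral_centroid(tone(12))}

def classify(samples):
    c = spectral_centroid(samples)
    return min(library, key=lambda k: abs(library[k] - c))

print(classify(tone(11)))  # → high toe
```

The point of the sketch is that the classifier compares a ball-independent summary of the sound rather than the raw waveform, which is the same reason speech recognition survives different voices and accents.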
-
The only ultrasonic equipment I ever worked with was for underwater side-scan, which picked up some pretty clear detail on what we scanned. I haven't seen anyone working on anything like that out of water; not saying it isn't out there or can't be done, just that I haven't seen anything on it. It can be tweaked to see a lot underwater. It wasn't clear as a picture, but clear enough to tell what things were. Now, that sends out a signal and reads the bounce-back. Not sure just the sound of an impact alone could really tell you all that much; way too many variables to deal with, it would seem.
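The send-a-signal-and-time-the-echo part of side-scan is simple to write down, and it also shows why the gear doesn't transfer straight out of the water: sound travels roughly 1500 m/s in seawater versus about 343 m/s in air, so the same echo delay means a very different range. A minimal sketch of that conversion, with the speeds as assumed round numbers:

```python
# Side-scan principle: emit a ping, time the echo, convert the
# round-trip delay to range. Speeds are assumed round figures
# (~1500 m/s seawater, ~343 m/s air at room temperature).

def echo_range_m(round_trip_s, speed_m_s=1500.0):
    """Distance to the target; the delay covers the path twice."""
    return speed_m_s * round_trip_s / 2

print(echo_range_m(0.02))         # 20 ms echo under water → 15.0 m
print(echo_range_m(0.02, 343.0))  # same delay in air → ~3.4 m
```

That active send-and-listen approach is a different problem from passively classifying an impact sound, which is probably why the one works so much better than the other seems likely to.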