2015 Most Wanted Driver Test: Q&A

Your Questions Answered

For the most part we’ve remained quiet throughout the publication of the results of our 2015 Most Wanted Driver Test. Rather than respond immediately as we usually do, we wanted to take some time to personally unwind, while allowing all of you to take it in, review the results, and of course, ask questions.

We’ve gone through and pulled some of the most frequent and most interesting questions from our 4 days of content so far. Obviously we can’t answer everything, but hopefully this will address the majority of what you guys have asked.

Instead of beating around the proverbial bush, I’m going to start with the most frequently asked question of them all.

How is the E8 Beta not in the Top 5? / How is the G30 in the Top 5?

This is the internet, so we expect the most common theories will be conspiracy theories, but when it comes to the case of why Tour Edge’s E8 Beta didn’t make our Top 5 and why PING’s G30 did, the explanation is fairly mundane.

This is a textbook case of two sides of the same coin.

With respect to the E8 Beta, obviously it did very well for Peak Distance, which I should note, is not the same as our Average Distance calculation. It also fared well for accuracy (the shorter shaft certainly influenced that).

This year we chose not to dedicate a day to forgiveness, but if we had, you would have seen a club that ranked near the bottom from a forgiveness standpoint. It’s a 440cc head, and there’s an MOI cost that comes with that.

E8 is long (peak shots), it’s straight, but on a relative basis, it’s not very forgiving.

When we looked at the ranges within each performance category, we saw roughly a 12 yard difference in distance (similar in peak distance), 10 yards in our consistency measurement, and less than a 5 yard difference between the best and worst clubs for accuracy.

On a comparative basis, accuracy’s contribution to the overall score was minimal (because the data showed comparably minimal differences between clubs), so as a result of the below average forgiveness rating, the E8 Beta dropped in our overall rankings.
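To make that concrete, here’s a rough sketch of why a category with a tight spread barely moves an overall ranking. MyGolfSpy hasn’t published its exact scoring formula, so the range-proportional weighting and all of the numbers below are purely illustrative.

```python
# Hypothetical: weight each category by the best-to-worst spread we observed,
# so a category where every club performs about the same (accuracy)
# contributes little to the overall score. All numbers are invented.
category_ranges = {
    "distance": 12.0,     # yards between best and worst club
    "consistency": 10.0,  # yards
    "accuracy": 4.5,      # less than 5 yards of separation
}

def contribution_weights(ranges):
    """Scale each category's weight by its observed best-to-worst spread."""
    total = sum(ranges.values())
    return {category: spread / total for category, spread in ranges.items()}

weights = contribution_weights(category_ranges)
# accuracy ends up with the smallest influence of the three categories
```

Under this toy scheme, accuracy carries roughly 17% of the score while distance carries about 45%, which mirrors why a below-average forgiveness rating could outweigh the E8 Beta’s accuracy advantage.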

With the G30 what we saw was a club that was near the top for average for distance, not far off for peak distance, and well inside the average range for accuracy. All of that along with the highest forgiveness rating in the test was enough to put the G30 into the top group.

In hindsight, maybe we should have lumped forgiveness in with accuracy, or dedicated a day to it.

A quick word about accuracy vs. forgiveness…

For those still wrestling with the idea that accuracy and forgiveness are closely related, or even the same thing: they’re not. From a design perspective, accuracy comes from face angle and offset choices. It comes from roll radii, and it comes from internal weighting that further influences the rate at which the face closes during the swing.

Forgiveness is rooted in MOI and face technologies that help further preserve ball speed and other launch parameters on off-center hits. It has very little to do with accuracy.  In very simple terms, accuracy is left to right, forgiveness is front to back.
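If it helps, the left-to-right vs. front-to-back distinction can be sketched with toy shot data (all numbers invented): accuracy shows up as lateral spread around the target line, while forgiveness shows up as how much distance mishits give up.

```python
# Toy illustration of the distinction above, with invented shot data.
# Each shot is (carry_yards, offline_yards); positive offline = right miss.
from statistics import pstdev

shots = [(265, 4), (258, -7), (270, 2), (240, -3), (262, 10), (245, 1)]

carries = [carry for carry, _ in shots]
offline = [off for _, off in shots]

# "Accuracy is left to right": spread around the target line.
lateral_spread = pstdev(offline)

# "Forgiveness is front to back": a forgiving head keeps off-center hits
# from costing ball speed, so this carry spread would shrink.
carry_spread = pstdev(carries)
```

For this invented sample the front-to-back spread is roughly twice the left-to-right spread, which is the dimension a higher-MOI head works on.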

More of Your Questions

Have you ever thought about…?

Maybe… Probably. Even at the expense of year-to-year consistency, we’re constantly refining the way we collect and analyze data. It’s an ever-evolving process, and some of the changes we make come directly from your feedback. Let’s just say we’re working on some things.

What happened to truAccuracy?

Part of the plan for 2015 was to simplify as much as we could. Quite frankly, some of our readers never quite grasped truAccuracy so we thought it made sense to put accuracy in terms that everyone understands (yards offline). Based on the feedback, I think we’re probably going to bring it back as our standard measurement of accuracy.

Why didn’t you fit your testers for the best Bridgestone ball for their swing? / Why didn’t you use <insert golf ball name here>?

We actually considered fitting guys for the golf ball prior to testing, but in any test like this you want to eliminate as many variables as you possibly can. Having multiple balls in play would introduce more variables and we knew we didn’t want to do that.

We chose the B330-RX because it’s the bestselling ball in Bridgestone’s lineup.

Can you describe the test environment?

As you should know by now, we use Foresight GC2 launch monitors to collect ballflight data, and the Foresight HMT attachment to collect clubhead data. All testers hit Bridgestone B330-RX golf balls. Balls are inspected frequently and replaced if they show any signs of wear/damage.

We test indoors using Foresight’s latest FSX software to project our test environment and provide a realistic presentation of ballflight.

We could test outdoors, but then changing weather conditions as well as how our testers react to those conditions would also become variables.

Something, something, ROBOTS

Every year. We’ve covered it.

Something, something, SHAFTS

Every year.

Not only does that not come remotely close to representing how the average consumer purchases clubs (the average golfer buys a club, not a head and a shaft), it’s also not going to tell us much.

Part of the reason why manufacturers use different shafts is that different shafts will achieve different results in different heads. It’s about producing the desired launch characteristics for the target golfer. If we consider this from the extremes…the shaft that would work well for me in a SLDR would likely produce poor results in a G30.

You want the best combo, go see a fitter. We’re all for that, but for an off-the-rack test, not only is putting the same shaft in everything not relevant to the consumer buying experience, it’s going to produce data that’s equally irrelevant.

Why was the Callaway XR not included?

As we discussed when we announced the test, Callaway declined to participate. With the help of our readers we were able to purchase the necessary allotment of Big Bertha Alpha 815, Big Bertha Alpha 815 Double Black Diamond, and Big Bertha V-Series drivers.

At the time we started our test, however, XR was not yet available at retail, and with Callaway unwilling to provide samples, we had no option but to exclude the XR line.

What lofts/specs were the clubs?

I don’t think we dove into this as deeply as we have in years past. This year we asked manufacturers to send a good mix of lofts, flexes, and, where applicable, stock shafts, so that we could provide the best fit for each tester.

For us, it’s about getting the best possible result with each club. For the most part guys do stay in the same loft and flex, but there are occasions when we need to increase or decrease loft and/or change flex.

Certainly we’ve considered moving to a single loft or a 9.5/10.5 test, but we’re not sure that does much to minimize the variables:

  • Some models are 9°, others are 9.5°
  • How much loft is really on a given 9.5° driver? How much is on that other one?
  • What about adjustability? Do we just set it in the neutral position and ignore the options? Gravity Core? FlipZone?

Manufacturers have provided us with a wealth of options, and during our tests we leverage each of them to the fullest extent possible to get the best possible results with the available technology.

How do you go about testing clubs like the Callaway Big Bertha Alpha 815 Double Black Diamond and Cobra Fly-Z+ that have two center of gravity options?

Had we more time, it would have been interesting to treat each of them as two different clubs. Since we couldn’t do that, as with all of the clubs in the test, we tried to get the best fit from whatever options were at our disposal. In the specific case of FLY-Z+ we had roughly a 60/40 split with the majority playing the weight back.

With Gravity Core, the majority (18/20, I believe) tested with the core down. For the testers who needed it, flipping the core up increased spin while having only a minimal impact on launch angle.

When the slowest swing speed you’re testing is 91.2 MPH, then you are leaving out a huge segment of average golfers…

We definitely understand what you’re getting at. Worth mentioning: the actual lowest swing speed in our test was in the 78 MPH range, and we had a couple of others in the low 80s as well, but even our sub-100 swing speed group is likely above average.

We’d certainly like to better represent the slower swingers, but it becomes an issue of fatigue. It’s difficult to find sub-80 swingers who can hold up for the duration of a testing session.

Your Data Doesn’t Make Sense

First of all, the data is the data. It’s what happened. We don’t tweak it, massage it, or otherwise alter it. Other than dropping outliers and calculating averages and standard deviations from what’s left, it is as it came off the launch monitor.
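As a rough sketch of that processing, in Python (the article doesn’t specify its exact outlier rule, so the z-score cutoff below is an assumption, not MyGolfSpy’s actual method):

```python
from statistics import mean, stdev

def summarize(values, z_cut=2.0):
    """Drop gross outliers, then return the mean and standard deviation
    of what's left. The z-score cutoff is one common, illustrative choice."""
    m, s = mean(values), stdev(values)
    kept = [v for v in values if abs(v - m) <= z_cut * s]
    return mean(kept), (stdev(kept) if len(kept) > 1 else 0.0)

# One topped shot (180 yards) gets dropped; the rest are summarized as-is.
avg, sd = summarize([262, 258, 265, 180, 261, 259])
```

Everything published is this kind of straightforward summary of launch monitor output, not a hand-tuned number.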

In most cases the presumed oddities are easily explained. One reader mentioned the similarity in the spin numbers between G30 and G30 LS. LS should spin less, right? It’s simply physics, he said.

It’s true. Loft for loft, LS should spin less, but since we do adjust the clubs, you have to consider that a segment of testers may have needed more loft with LS. They may have needed a face angle adjustment. They might have had better results with a different shaft. All of these things can and do impact the numbers. We’re willing to tolerate that to get the best results we can on an individual basis.

In other cases, the results aren’t what you anticipate. It happens.

I bounced some data that wasn’t what I expected it to be off an R&D contact who, for my money, is one of the smartest guys in the golf industry. I’m paraphrasing here, but basically what he said is:

“You understand the physics to a certainty. You know what should happen, but you never know how the golfer is going to respond to those physics. Very often what happens isn’t what you think should happen.”

I’ve seen that sort of thing often enough that I don’t obsess over 200 RPM of spin (that could be as simple as manufacturing tolerances), but when I see things like pronounced changes in Angle of Attack between clubs, it’s definitely eye-opening, especially when those changes creep outside one standard deviation.

Angle of Attack, as one reader pointed out, should be dictated entirely by the golfer. In fact, it’s dictated by the golfer and how he responds to a given club.

What we’re quickly learning is that everything impacts everything else.

Cobra had this up on their site fast, did you give them a heads-up?

Yes. As we did with TaylorMade last year, and Callaway the year before, we gave the winner a few days’ notice to do whatever they choose to do with it. It’s also worth mentioning that, unlike the Hot List, companies whose products receive accolades from MyGolfSpy are not required to pay any licensing fees for use of badges, logos, or anything else related to Most Wanted.

Why Didn’t You Include ______ in the data?

Actually, that’s a paraphrased aggregation of several questions related to the presentation of data. Certainly some of the proprietary stuff we like to keep close to the vest, but mostly the goal was to keep it simple enough that the average reader’s eyes wouldn’t gloss over. With that in mind, we limited the data we published to simple averages of the data as it came off the launch monitor. Most everyone can wrap their heads around that.

That said, I was particularly interested in a comment by Johannes N. Basically, the idea would be for us to provide enough data for the reader to assign his own values to Most Wantedness (love that phrase). As Johannes pointed out, to really dig in you’d need to see standard deviations in addition to the raw data.

Quite frankly, it never occurred to us that so much as a handful of you would want to dig in at that level, but I think it’s pretty awesome that you would. That’s data we can definitely include in the future.

From a statistical standpoint, in any given category we had 3 to 6 clubs (with ties considered) that fell outside of one standard deviation. While I won’t say it will never happen, with this many clubs and this many testers, I don’t think we’ll find many that pass the 2 sigma mark.
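For anyone who wants to run that kind of check themselves, here’s a minimal sketch with invented per-club distance averages; with a spread like this, a handful of clubs land outside one standard deviation and none reach two.

```python
from statistics import mean, pstdev

# Invented per-club average distances (yards), purely for illustration.
club_distances = {
    "A": 259, "B": 262, "C": 255, "D": 266, "E": 258,
    "F": 249, "G": 261, "H": 263, "I": 257, "J": 270,
}

m = mean(club_distances.values())
s = pstdev(club_distances.values())

# Clubs whose average sits more than one (or two) standard deviations
# from the group mean.
outside_1_sigma = [name for name, d in club_distances.items()
                   if abs(d - m) > s]
outside_2_sigma = [name for name, d in club_distances.items()
                   if abs(d - m) > 2 * s]
```

With this made-up field, three clubs clear one sigma and none clear two, the same shape of result described above.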

To Johannes’ larger point about allowing each reader to determine what’s Most Wanted for his individual game… this is exactly what we’re working toward. We’ll have our system, but we will also provide you with the data and the tools to draw the conclusions most relevant to your game.

That’s A Wrap

This more or less closes the book on our 2015 Most Wanted Driver Test. We’ve got some really exciting stuff in the pipeline, so stay tuned as we’ll be making an announcement soon.

Tony Covey

Tony is the Editor of MyGolfSpy where his job is to bring fresh and innovative content to the site. In addition to his editorial responsibilities, he was instrumental in developing MyGolfSpy's data-driven testing methodologies and continues to sift through our data to find the insights that can help improve your game. Tony believes that golfers deserve to know what's real and what's not, and that means MyGolfSpy's equipment coverage must extend beyond the so-called facts as dictated by the same companies that created them. Most of all Tony believes in performance over hype and #PowerToThePlayer.


      jungle treasure 2

      9 years ago

      Good day! I could have sworn I’ve been to this website before but after checking
      through some of the posts I realized it’s new to me. Anyhow, I’m definitely happy
      I found it and I’ll be bookmarking and checking back frequently!

      Reply

      Ecfthegolfguy

      9 years ago

      Meant fly Z

      Reply

      Ecfthegolfguy

      9 years ago

      Based on the info I traded in my TaylorMade AeroBurner for the Cobra. My swing speed is between 89 and 91. What a difference, at least 20 yards longer!

      Reply

      andrew

      9 years ago

      i for one come here for the data overload. this is stuff you just don’t get anywhere else. why not put up the short version for the other folks, and then a longer version just below for the rest of us to nerd out on?

      Reply

      Pond Rocket

      9 years ago

      Comparing 2015 to 2014, the 2015 group as a whole appears to have gone 10 yards further. The only driver that was tested in both years was the Callaway Big Bertha Alpha. In 2015 it went about 12 yards further than in 2014. I am wondering what might account for the difference. Was a different golf ball used? I don’t see any reference to the ball used in 2014. This also raises the question of whether the 2015 drivers are truly longer as all of the manufacturers would like us to believe.

      Reply

      proside

      9 years ago

      Kudos for the hard work. I find a lot of the data descriptions hard to follow which leaves me wondering what to interpret and how it’s relevant to me. I’m certain that fitting each tester is beyond practical. I think the stock shafts is the right decision for testing. I feel there are 4 types of hitters or perhaps 2 types with an A or B result. Those that hit fairly upward and those that are more level or even down. Those that have piercing shots or lofting shots. My point is that any of us could identify with those or be identified with those and that the clubs available for testing would end up ranked very differently for each of those strike types. There aren’t any RH clubs that work well for a lefty if you get my meaning. Frankly I feel that your testers are better off grouped into echelons relative to centre strikes rather than ball speeds.

      Reply

      Repo

      9 years ago

      Hi,

      First, congratulations on a great effort that seems to have occurred outside of marketing hype and a world in which a true unbiased review is rare.

      That being said, I’d like to see the data. Of course statistical data spread across the qualitative categories that you’ve assigned will always be subject to bias. It’s not an accusation, it’s a scientific reality.

      So, provide the raw data and see if there are a few amongst us who can perhaps sharpen it further to provide even more cogent an argument for a certain club choice. For example, COR spread across the 2D face at given baselined swing speeds (and impact angle of attack, etc.). If I tended to miss to the toe side when I get tight, a more forgiving COR spread towards the toe might help. OR I could just fix my crappy swing, but that’s a bigger problem…

      Anyway, food for thought.

      Repo

      Reply

      JP

      9 years ago

      Re: Robot testing.

      I realize your stance re: this. I get that folks are interested in a golf “club” as the package, and that most folks won’t take the time to find the best shaft/head combo for their individual swing.

      That said, to me, it would still be interesting to see robot testing, but in a greatly enhanced way of doing it.

      Example: Using the same exact model shaft, try the different heads on them and have the robot hit the ball in the center of each head, w/ same SS and AOA and all that jazz. Report results.

      Using that same criteria, switch up the AOA for all heads. Report results.
      Using that same criteria, switch up the SS for all heads. Report results.
      Using that same criteria, switch up both AOA and SS for all heads. Report results.
      And on and on and on and on.

      Have the robot hit balls off each head’s toe section. Report results. Balls off the heel section, etc etc etc.

      Then, heck, use a different “universal” shaft for the tests and do it all over again w/ that new shaft w/ all the heads.

      The scenario I’m going w/ here can lead to literally hundreds of thousands of different tests w/ different variables. But to me, it’s the only way to really find out the truly best shaft/head combo at “xx” SS and “xx” AOA.

      Reply

      Chad

      9 years ago

      I know I learned one thing for sure during this test!! Tour Edge users have a cult-like following!! They remind me of Apple/iPhone users or Starbucks drinkers.. They think their s&%t doesn’t stink and there’s no way their product is not the best!! Look, it lost, deal with it!! Go to another less informative site if you don’t like these results (which have data to back them up).. Stop whining for god sakes!!

      Reply

      Brian

      9 years ago

      Forget the data, I would like to see which driver each of the testers would choose for themselves.

      Reply

      Gene

      9 years ago

      Good work again guys. I own a G30 and after reading your test I bought a fly-z and I am some longer. Your work is the reason I made the contribution earlier in the year.

      Reply

      BR

      9 years ago

      Great series/reviews as always. I like the idea of producing an interactive data set in which we could assign our own variables. Thanks again for the testing. Looking forward to MGS future studies.

      Reply

      Peter

      9 years ago

      Great series as always,

      Really missed seeing Cleveland compete. But overall, the thing I would like to see come back is the commentary on each of the top sections. I used to really look forward to reading that part.

      Reply

      Max

      9 years ago

      Srixon = Cleveland

      Same company.

      Reply

      Nick

      9 years ago

      I understood last year’s testing format. This year I understood that the E8 was long and accurate, which to me implies forgiving, whether it should or not. Unfortunately I think the testing format took a step in the wrong direction in that I almost was led to buy a club that wouldn’t have helped my game at all… which is the only reason I would consider the data.
      I can’t help but think that next year’s data publication will fall on blind eyes, for me. Just being honest.

      Reply

      McaseyM

      9 years ago

      Would you consider actually hitting the club first? I think stating that you almost bought the wrong club for your game based just on this test is pretty ludicrous, these results should be a starting point to go demo clubs. Even if you’re not gonna get fit, buying a club blindly without hitting it on a range or at least on a pre-primed launch monitor in a store would be your fault, not MGS’s.

      Reply

      Bobby c

      9 years ago

      Data. Hype vs. cold hard facts. I commend you for spending so much time gathering it. You know how far and how far offline each shot went. Since you have it, can’t it be looked at so you can do a page on Most Wanted Forgiveness? With only a 5-yard difference in accuracy, half of the results seem useless. Has the techno speak gotten so in the weeds we’re not seeing the forest for the trees? We know which driver hits it the farthest; I’d like to know which one helps me the most (farthest and least offline) if I don’t hit it in the screws. Thanks. Lastly, how about looking at the data (how far/how far offline) and comparing clubs from this year’s and last year’s test? You have the data; it doesn’t matter that the testers were different people.

      Reply

      CJ

      9 years ago

      Forgiveness data — please share!
      You make a strong case for the importance of forgiveness. No disagreement here. But why not share something — anything — by way of forgiveness data. How about ordinal rankings, at the very least. Or better yet, some indication of the relative forgiveness of different drivers. Or best of all, the forgiveness data itself. Okay, maybe the data itself is too close to or actually is proprietary stuff. But I have to believe relative forgiveness of different drivers can be indicated in some manner, and certainly ordinal rankings based on forgiveness can be shared without revealing too much or requiring too much time or effort to publish. Please? Pretty please?

      Reply

      Stonedylan

      9 years ago

      We’ve seen a few years of Most Wanted Driver tests…could you do a head-to-head of the overall winners? The technological limits have been reached, as Crossfield says, this year they’re selling us on face forgiveness and miss hit distance. So if this years drivers aren’t being sold as longer, is it possible to have a test where we see the data and try to quantify the progression. Real data, not marketing. Thank you for the content, I preferred last years (low key) razzmatazz to this years earnest and sparse presentation but your dedication to improving the process is second to none and making others raise their game. (Mimicry is the sincerest form of flattery.)

      Reply

      Giles

      9 years ago

      Every year this test gets better and better with better data and thoughtful analysis. Great job this year!!! Research and results like this are why golfspy is the best golf equipment review site on the web.

      Reply

      Gordon

      9 years ago

      Thank you for answering the questions.
      You guys always do a great job and in my eyes, while I would always go hit the new gear myself and see what works best, I think this is absolutely a helpful guide and starting point.

      Thanks again MGS

      Reply

      revkev

      9 years ago

      Any chance that the upcoming announcements will include pitting the last three winning drivers against one another using the current set of testers?

      That’s the one I really want to see. Is there actually an advancement in technology and does that advancement make a difference for real golfers.

      As always great job.

      Reply

      Kenny B

      9 years ago

      Many thanks for the answers. Given that there is a significant range of tester swing speeds, I would like to see a summary of thoughts from the testers in the low, middle, and high ranges. I know when I test a club, it doesn’t take many swings for me to tell if it is a club I like. Is it possible to have a synopsis of the testers and their thoughts about specific club likes or dislikes?

      Reply

      MG

      9 years ago

      So the difference in accuracy (left to right dispersion) from the best to the worst club was less than 5 yards. Does that mean we shouldn’t put a lot of emphasis on accuracy? And if forgiveness is back to front, does that mean forgiveness factors a lot into average distance?

      Reply

      Drew

      9 years ago

      “Does that mean we shouldn’t put a lot of emphasis on accuracy?”

      No, accuracy is extremely important. What it does mean though is that accuracy has a lot more to do with the person and how they are individually swinging. Every club has the potential to be 100% accurate and precise if swung the exact same way every time.

      “And if forgiveness is back to front, does that mean forgiveness factors a lot into average distance?”

      The simplest answer is, for most people, yes. Forgiveness is really looking at “if I cannot hit the center of the club with every swing how much shorter will my ball travel?” So if hitting the center of the club is an issue then forgiveness will have a large impact on your average distance. However, if you hit the center of the face every time, the club’s forgiveness will not impact you.

      These are just my thoughts, but I hope they’re useful.

      Reply

      Steven

      9 years ago

      Thanks for the explanations. Keep up the good work.

      What is next on the agenda for testing?

      Reply

      Sharkhark

      9 years ago

      Whew! A lot of data, a lot of info. Very thorough. I remember when golf magazines actually did a hot test and were fairly honest or critical; that’s long gone.
      Sites like yours will never please everyone; there’s always a variable they’ll lament, isn’t there.
      But overall? You’re detailed and doing an admirable job.

      Reply

      Dave S

      9 years ago

      Thanks for all the hard work again guys. We may give you a hard time every once in a while, but MGS certainly has the best product on the market for golf reviews and it looks like it will be getting even better in the future.

      Will you be putting out a “Testers’ Picks” follow-up like last year?

      Reply

      josh

      9 years ago

      One thing that I have seen that I think is more telling about a driver heads design is how launch, ball speed, dispersion, and spin change relative to where a ball is hit on the face of the club. I’ve seen robot tests that show this for single club heads, but it would be awesome to see that kind of test with the lineup you put together.

      Reply
